Bjarne Stroustrup: C++ | Artificial Intelligence (AI) Podcast

Lex Fridman is a research scientist at MIT working on human-centered AI, deep learning, autonomous vehicles, and robotics. He is active on social media and hosts a podcast on artificial intelligence where he interviews guests such as Elon Musk.


I just watched a YouTube interview with Bjarne Stroustrup. Stroustrup began developing C++ in 1979 (then called “C with Classes”) and, in his own words, “invented C++, wrote its early definitions, and produced its first implementation… chose and formulated the design criteria for C++, designed all its major facilities, and was responsible for the processing of extension proposals in the C++ standards committee.” Stroustrup also wrote a textbook for the language in 1985, The C++ Programming Language.


The thing I found most interesting was Stroustrup’s assertion that higher levels of abstraction (or maybe the right level?) produce more compact and efficient code. There are two things that need to be matched: on one side, the idea of what needs to be done, which lives in the mind of the programmer; on the other, the machine instructions executed by the computer. The function of the programming language, or rather the compiler, is to let the programmer express that idea clearly, so that the optimizer can produce an executable that is reliable, efficient, and faithful to the programmer’s idea.


According to Stroustrup, reliability and efficiency are system properties, and the way to achieve them is through simplification. The central idea is a flexible and efficient type system that allows the programmer to match the types of the application to its needs without a performance penalty.
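Here is a minimal sketch of that point as I understand it (my own example, not one from the interview): a tiny strongly typed wrapper that captures the programmer’s idea of “a distance” as a distinct type, yet compiles down to the same machine code as a bare double. The abstraction buys clarity and compile-time checking at zero runtime cost.

```cpp
#include <iostream>

// A strongly typed wrapper: the idea in the programmer's mind (a distance)
// becomes a distinct type, so confusing it with, say, a time is a compile
// error. The wrapper holds a single double, so a decent optimizer emits the
// same machine code as raw arithmetic on double: abstraction without penalty.
struct Meters {
    double value;
};

constexpr Meters operator+(Meters a, Meters b) { return {a.value + b.value}; }

int main() {
    Meters track{400.0};
    Meters extra{2.5};
    Meters total = track + extra;       // fine: both operands are distances
    // Meters wrong = track + 3.0;      // error: a bare double is not Meters
    std::cout << total.value << " m\n";
}
```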

Life Cycle Analysis

Life Cycle Analysis (LCA) is hard to do. If you Google it, the most popular links are dated. LCA is a good idea that is difficult to manage because supply chains are complex and conflicting criteria must be applied to select the best option in each link of the chain.

The goal of LCA is to compare the full range of environmental effects assignable to products and services by quantifying all inputs and outputs of material flows and assessing how these material flows affect the environment. This information is used to improve processes, support policy and provide a sound basis for informed decisions.

The term life cycle refers to the notion that a fair, holistic assessment requires the assessment of raw-material production, manufacture, distribution, use, and disposal, including all intervening transportation steps necessary or caused by the product’s existence.
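To make “quantifying all inputs and outputs” concrete, here is a toy sketch (stage names from the paragraph above, numbers invented) that sums a single impact category, kg of CO2-equivalent, across the life cycle stages of a hypothetical product:

```cpp
#include <iostream>
#include <string>
#include <vector>

// One life cycle stage and its contribution to a single impact category
// (here kg CO2-equivalent). A real LCA tracks many flows per stage
// (energy, water, toxins); this keeps just one for illustration.
struct Stage {
    std::string name;
    double kgCO2e;
};

int main() {
    // Invented numbers for a hypothetical product.
    std::vector<Stage> lifeCycle = {
        {"raw-material production", 12.0},
        {"manufacture",              8.5},
        {"distribution (transport)", 3.2},
        {"use",                      1.1},
        {"disposal",                 2.7},
    };

    double total = 0;
    for (const auto& s : lifeCycle) {
        std::cout << s.name << ": " << s.kgCO2e << " kg CO2e\n";
        total += s.kgCO2e;
    }
    std::cout << "cradle-to-grave total: " << total << " kg CO2e\n";
}
```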

The procedures of life cycle assessment (LCA) are part of the ISO 14000 environmental management standards: in ISO 14040:2006 and 14044:2006. (ISO 14044 replaced earlier versions of ISO 14041 to ISO 14043.) GHG product life cycle assessments can also comply with specifications such as PAS 2050 and the GHG Protocol Life Cycle Accounting and Reporting Standard.

To implement a sustainable supply chain, companies must adopt a holistic, systems-based approach in which all the supply chain partners’ activities are integrated throughout the four basic life cycle stages: pre-manufacturing, manufacturing, use, and post-use.

In 2009 Walmart announced its intention to create a new sustainable product index that would establish a single source of data for evaluating the sustainability of products. The company said it would provide initial funding for a consortium of universities, suppliers, retailers, government organizations and NGOs “to develop a global database of information on the lifecycle of products – from raw materials to disposal.”

LCA attempts to replace monetary cost with an energy currency. Energy efficiency, however, is only one consideration in deciding which alternative process to employ, and it should not be elevated to the sole criterion for determining environmental acceptability; for example, simple energy analysis does not take into account the renewability of energy flows or the toxicity of waste products. Even so, life cycle assessment does help companies become more familiar with environmental properties and improve their environmental footprint.

The literature on life cycle assessment of energy technology has begun to reflect the interactions between the current electrical grid and future energy technology. Some papers have focused on energy life cycle, while others have focused on carbon dioxide (CO2) and other greenhouse gases. The essential critique given by these sources is that when considering energy technology, the growing nature of the power grid must be taken into consideration. If this is not done, a given class of energy technology may emit more CO2 over its lifetime than it mitigates.

A problem the energy analysis method cannot resolve is that different energy forms (heat, electricity, chemical energy etc.) have different quality and value even in natural sciences, as a consequence of the two main laws of thermodynamics. A thermodynamic measure of the quality of energy is exergy. According to the first law of thermodynamics, all energy inputs should be accounted with equal weight, whereas by the second law diverse energy forms should be accounted by different values.
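As a worked illustration (the standard textbook definition, not something taken from the sources cited here), the physical exergy of a material stream with specific enthalpy h and specific entropy s, relative to an environment at temperature T_0 and pressure p_0, is:

```latex
% Physical (flow) exergy of a stream relative to the environment (T_0, p_0);
% kinetic, potential, and chemical contributions are neglected here.
% h, s     : specific enthalpy and entropy of the stream
% h_0, s_0 : their values at the environmental reference state
e_x = (h - h_0) - T_0 \, (s - s_0)
```

The first term counts the energy on its own, in line with the first law; the subtracted T_0(s - s_0) discounts it by the entropy it carries, in line with the second law. This is why a joule of electricity, which carries almost no entropy, has more exergy than a joule of low-temperature heat.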

A recent article by Noya et al. (2018) illustrates the complexities and methodologies of LCA.

References

Bredenberg, A. (2012, March 20). Lifecycle Assessment in Sustainable Supply Chains. Retrieved from ThomasNet.com: https://news.thomasnet.com/imt/2012/03/20/lifecycle-assessment-in-sustainable-supply-chains

Suter, F., et al. (2016, September 26). Life Cycle Impacts and Benefits of Wood along the Value Chain: The Case of Switzerland. Retrieved from Wiley Online Library: http://onlinelibrary.wiley.com/doi/10.1111/jiec.12486/full

Gestring, I. (2017). Life Cycle and Supply Chain Management for Sustainable Bins. Retrieved from ScienceDirect: https://doi.org/10.1016/j.proeng.2017.06.041

Noya, I., et al. (2018, January). An environmental evaluation of food supply chain using life cycle assessment: A case study on gluten free biscuit products. Retrieved from ScienceDirect: https://www.sciencedirect.com/science/article/pii/S0959652617319777

United Nations Environment Programme. (2009). Life Cycle Management. Dublin: UNEP. Retrieved from http://www.unep.fr/shared/publications/pdf/DTIx1208xPA-LifeCycleApproach-Howbusinessusesit.pdf

Wharton. (2010, March 3). The Business Case for Lifecycle Analysis and Building a Green Supply Chain. Retrieved from Knowledge@Wharton: http://knowledge.wharton.upenn.edu/article/the-business-case-for-lifecycle-analysis-and-building-a-green-supply-chain/

Wikipedia. (2018, January 22). Life-cycle assessment. Retrieved from Wikipedia: https://en.wikipedia.org/wiki/Life-cycle_assessment

grants

https://www.cfda.gov/index?cck=1&au=&ck=
https://ors.duke.edu/orsmanual/proposal-preparation-and-writing
http://fedworld.ntis.gov/
http://chfs.ky.gov/NR/rdonlyres/635F46A0-8EF6-4CE7-A6AE-B33D3DBE35A6/0/NNGCommonGrantApplication.pdf
http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg
http://grants.nih.gov/grants/grant_tips.htm
http://www.charityvillage.com
http://staff.lib.msu.edu/harris23/grants/privint.htm
http://coloradogrants.org/assets/pdf/lightsville-public-schools.pdf

The precedence diagram method

The precedence diagram method is a tool for scheduling activities in a project plan. It is a method of constructing a project schedule network diagram that uses boxes, referred to as nodes, to represent activities and connects them with arrows that show the dependencies.

  • Identifies critical tasks, noncritical tasks, and slack time
  • Shows the relationships of the tasks to each other
  • Allows for what-if, worst-case, best-case, and most-likely scenarios

Key elements include determining predecessors and defining attributes such as the following (see the code sketch after this list):

  • early start date
  • late start date
  • early finish date
  • late finish date
  • duration
  • WBS reference
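To make these attributes concrete, here is a small sketch (hypothetical activities, hand-rolled arithmetic rather than any particular tool’s API) that computes early and late dates with a forward and a backward pass; activities whose total float is zero form the critical path discussed below.

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// One activity (node) in the precedence diagram.
struct Activity {
    std::string name;
    int duration;
    std::vector<int> predecessors;  // indices into the activity vector
    int earlyStart = 0, earlyFinish = 0, lateStart = 0, lateFinish = 0;
};

int main() {
    // Hypothetical example network, listed so that every predecessor
    // appears before its successors (topological order).
    std::vector<Activity> acts = {
        {"A", 3, {}},      // 0
        {"B", 2, {0}},     // 1: after A
        {"C", 4, {0}},     // 2: after A
        {"D", 2, {1, 2}},  // 3: after B and C
    };

    // Forward pass: early start = max early finish of all predecessors.
    for (auto& a : acts) {
        for (int p : a.predecessors)
            a.earlyStart = std::max(a.earlyStart, acts[p].earlyFinish);
        a.earlyFinish = a.earlyStart + a.duration;
    }

    // Project duration = latest early finish over all activities.
    int projectEnd = 0;
    for (const auto& a : acts) projectEnd = std::max(projectEnd, a.earlyFinish);

    // Backward pass: late finish = min late start of all successors.
    for (auto& a : acts) a.lateFinish = projectEnd;
    for (int i = static_cast<int>(acts.size()) - 1; i >= 0; --i) {
        acts[i].lateStart = acts[i].lateFinish - acts[i].duration;
        for (int p : acts[i].predecessors)
            acts[p].lateFinish = std::min(acts[p].lateFinish, acts[i].lateStart);
    }

    // Total float = late start - early start; zero float marks the critical path.
    for (const auto& a : acts)
        std::cout << a.name << ": ES=" << a.earlyStart << " EF=" << a.earlyFinish
                  << " LS=" << a.lateStart << " LF=" << a.lateFinish
                  << " float=" << (a.lateStart - a.earlyStart)
                  << (a.lateStart == a.earlyStart ? "  <- critical" : "") << "\n";
}
```

For this invented network the output marks A, C, and D as critical (total duration 9), while B carries two units of float.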

The critical path method (CPM) is a project modeling technique developed in the late 1950s by Morgan R. Walker of DuPont and James E. Kelley Jr. of Remington Rand.[2] Kelley and Walker related their memories of the development of CPM in 1989.[3] Kelley attributed the term “critical path” to the developers of the Program Evaluation and Review Technique, which was developed at about the same time by Booz Allen Hamilton and the U.S. Navy.[4] The precursors of what came to be known as Critical Path were developed and put into practice by DuPont between 1940 and 1943 and contributed to the success of the Manhattan Project.[5]

CPM is commonly used with all forms of projects, including construction, aerospace and defense, software development, research projects, product development, engineering, and plant maintenance, among others. Any project with interdependent activities can apply this method of mathematical analysis. Although the original CPM program and approach is no longer used,[6] the term is generally applied to any approach used to analyze a project network logic diagram.

Originally, the critical path method considered only logical dependencies between terminal elements. Since then, it has been expanded to allow for the inclusion of resources related to each activity, through processes called activity-based resource assignments and resource leveling. A resource-leveled schedule may include delays due to resource bottlenecks (i.e., unavailability of a resource at the required time), and may cause a previously shorter path to become the longest or most “resource critical” path. A related concept is called the critical chain, which attempts to protect activity and project durations from unforeseen delays due to resource constraints.

Since project schedules change on a regular basis, CPM allows continuous monitoring of the schedule, which allows the project manager to track the critical activities, and alerts the project manager to the possibility that non-critical activities may be delayed beyond their total float, thus creating a new critical path and delaying project completion. In addition, the method can easily incorporate the concepts of stochastic predictions, using the program evaluation and review technique (PERT) and event chain methodology.

Currently, there are several software solutions available in industry that use CPM for scheduling; see the list of project management software. The method currently used by most project management software is based on a manual calculation approach developed by Fondahl of Stanford University.

design of experiments

The design of experiments (DOE, DOX, or experimental design) is the design of any task that aims to describe or explain the variation of information under conditions that are hypothesized to reflect the variation. The term is generally associated with true experiments in which the design introduces conditions that directly affect the variation, but may also refer to the design of quasi-experiments, in which natural conditions that influence the variation are selected for observation.

In its simplest form, an experiment aims at predicting the outcome by introducing a change in the preconditions, which is reflected in a variable called the predictor. The change in the predictor is generally hypothesized to result in a change in the second variable, hence called the outcome variable. Experimental design involves not only the selection of suitable predictors and outcomes, but also planning the delivery of the experiment under statistically optimal conditions given the constraints of available resources.
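As a small illustration (my own sketch, with an invented factor count), the following generates a two-level full factorial design: every combination of low/high settings for k predictors, with the run order randomized so that slow drifts in the apparatus or environment do not masquerade as factor effects.

```cpp
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const int k = 3;          // number of predictors (factors)
    const int runs = 1 << k;  // 2^k runs in a full factorial design

    // Each run sets every factor to its low (-1) or high (+1) level;
    // enumerating 0..2^k-1 in binary covers every combination exactly once.
    std::vector<std::vector<int>> design(runs, std::vector<int>(k));
    for (int r = 0; r < runs; ++r)
        for (int f = 0; f < k; ++f)
            design[r][f] = ((r >> f) & 1) ? +1 : -1;

    // Randomize the run order to guard against time-correlated error.
    std::mt19937 rng(42);
    std::shuffle(design.begin(), design.end(), rng);

    for (const auto& run : design) {
        for (int level : run) std::cout << (level > 0 ? " +1" : " -1");
        std::cout << "\n";
    }
}
```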

Main concerns in experimental design include the establishment of validity, reliability, and replicability. For example, these concerns can be partially addressed by carefully choosing the predictor, reducing the risk of measurement error, and ensuring that the documentation of the method is sufficiently detailed. Related concerns include achieving appropriate levels of statistical power and sensitivity.

Correctly designed experiments advance knowledge in the natural and social sciences and engineering. Other applications include marketing and policy making.

Design of Experiments (DOE)

Outline

  1. Introduction
  2. Preparation
  3. Components of Experimental Design
  4. Purpose of Experimentation
  5. Design Guidelines
  6. Design Process
  7. One Factor Experiments
  8. Multi-factor Experiments
  9. Taguchi Methods

In the design of experiments, optimal designs (or optimum designs[2]) are a class of experimental designs that are optimal with respect to some statistical criterion. The creation of this field of statistics has been credited to Danish statistician Kirstine Smith.[3][4]

In the design of experiments for estimating statistical models, optimal designs allow parameters to be estimated without bias and with minimum variance. A non-optimal design requires a greater number of experimental runs to estimate the parameters with the same precision as an optimal design. In practical terms, optimal experiments can reduce the costs of experimentation.

The optimality of a design depends on the statistical model and is assessed with respect to a statistical criterion, which is related to the variance matrix of the estimator. Specifying an appropriate model and specifying a suitable criterion function both require understanding of statistical theory and practical knowledge of designing experiments.
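A toy example of what such a criterion looks like (invented numbers, not from the sources above): for the straight-line model y = b0 + b1·x with x restricted to [-1, 1], the D-optimality criterion maximizes det(XᵀX), which shrinks the joint confidence ellipsoid of the estimated coefficients. The sketch compares four runs placed at the endpoints against four runs spread evenly; the endpoint design wins, matching the classical result for this model.

```cpp
#include <iostream>
#include <vector>

// det(X^T X) for the straight-line model y = b0 + b1*x, where each run
// contributes a row (1, x) to the design matrix X. A larger determinant
// means a smaller joint confidence ellipsoid for (b0, b1).
double dOptCriterion(const std::vector<double>& xs) {
    double n = xs.size(), sx = 0, sxx = 0;
    for (double x : xs) { sx += x; sxx += x * x; }
    // X^T X = [[n, sx], [sx, sxx]], so det = n*sxx - sx*sx.
    return n * sxx - sx * sx;
}

int main() {
    std::vector<double> endpoints = {-1, -1, 1, 1};          // runs at the extremes
    std::vector<double> spread    = {-1, -1.0/3, 1.0/3, 1};  // runs spread evenly

    std::cout << "endpoints: det = " << dOptCriterion(endpoints) << "\n";  // 16
    std::cout << "spread:    det = " << dOptCriterion(spread) << "\n";     // ~8.89
}
```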

DMAIC

DMAIC (an acronym for Define, Measure, Analyze, Improve and Control) (pronounced də-MAY-ick) refers to a data-driven improvement cycle used for improving, optimizing and stabilizing business processes and designs. The DMAIC improvement cycle is the core tool used to drive Six Sigma projects. However, DMAIC is not exclusive to Six Sigma and can be used as the framework for other improvement applications.