Thursday, January 28, 2016

Lunch Talk: In 2003, Elon Musk gave a talk at Stanford about PayPal and SpaceX

From the YouTube blurb:

"Elon Musk, co-founder, CEO, and chairman of PayPal, shares his background: He was accepted into Stanford but deferred his admission to start an internet company in 1995. His company was zip2 which helped the media industry convert their content to electronic medium. Then, he sold the company for over $300 million and never came back to Stanford."

tags: youtube, lunchtalk, innovation, media, space

Scalability: from Neanderthals to Twitter

A quote from "Sapiens: A Brief History of Humankind":


Twitter is having trouble competing for users against Facebook and YouTube because it has failed to scale human relationships beyond the threshold of 150 individuals. That is, the social networking niche of "fewer than 150" is already occupied by Facebook; for Twitter to become successful, the company has to make it easy for each user to dynamically organize and curate information from thousands of people who are not in his or her immediate network. Moreover, since connections and information on Twitter are more dynamic (10X!) than on Facebook, the organization of information streams has to be at least 10X more sophisticated as well.

YouTube has met its content scalability challenge by enabling users to create and share playlists, channels, and subscriptions. Every user on YouTube is a developer who produces new ways to access content at the collection or stream level, rather than at the single-video level. In Scalable Innovation we call this scaling at the "aboutness" layer. So far, Twitter can't find a way to enable its users to become developers. All they can do is propagate gossip, which worsens the information overload problem for everybody who gets past the "150 individuals" threshold.

To summarize, Twitter needs to find a way to help people become better Information Sapiens because the Information Neanderthal niche is already occupied by Facebook and Youtube.

tags: scale, innovation, control, aboutness, twitter, social

Wednesday, January 27, 2016

Stanford CSP. Scalable Innovation (BUS 134) Session 2 Quiz 1

Go is a board game invented 2,500 years ago in China. According to a recent MIT Technology Review (MTR) article, "Mastering Go ... requires endless practice, as well as a finely tuned knack of recognizing subtle patterns in the arrangement of the pieces spread across the board."

Experts have long considered Go one of the most complex and intuitive human games ever created, far more complex than, e.g., chess or poker. Nevertheless, Google AI researchers have developed software that "beat the European Go champion, Fan Hui, five games to zero. And this March it will take on one of the world’s best players, Lee Sedol, in a tournament to be held in Seoul, South Korea."


Read the MTR article mentioned above and consider/answer the following questions:

1. Does AlphaGo represent a major technology innovation? Explain your reasoning.

2. If combining two or more deep learning networks, as described in the article, is the wave of the future, what industries, new or existing, would benefit from the technology the most? Why?

3. Using the System Model (Scalable Innovation, Part I), hypothesize what system elements and interfaces still need to be invented to complement or take advantage of AlphaGo-like software.

tags: innovation, course, stanford, quiz

Monday, January 18, 2016

Pragmatic creativity among Chimps, Orangutans, and Bonobos

Unlike us humans, who are still confused about what "healthy" food is, many primates know that staying healthy means developing a habit that separates nutritious food from harmful food. For example, chimpanzees, orangutans, and bonobos know how to turn dirty apples into clean ones, while gorillas don't.

Chimpanzees, orangutans, and bonobos are pragmatically creative because they've developed a consistent process for dramatically improving the health outcomes of a recurring situation.

Source: Matthias Allritz, Claudio Tennie, Josep Call. Food washing and placer mining in captive great apes. Primates, 2013. DOI 10.1007/s10329-013-0355-5.

Friday, January 15, 2016

Lunch Talk: (Authors at Google) How New Ideas Emerge

Matt Ridley’s brilliant and ambitious new book explores his considered belief that evolution—in biology, business, technology, and nearly every area of human culture—trumps deliberate and intelligent design.


tags: lunchtalk, creativity, innovation, evolution, scale

Thursday, January 14, 2016

Stanford CSP, Scalable Innovation (BUS 134) - Session 1, Quiz 1

1. Identify at least three trends mentioned in this Bitcoin-related NYT article: A Bitcoin Believer's Crisis of Faith


2. Trends: headwinds and tailwinds
2.1. Name at least one trend that significantly increases Bitcoin's chances for success.
2.2. Name at least one trend that significantly decreases Bitcoin's chances for success.

3. Name major technology innovations that power trends positive for Bitcoin.

Examples of trend categories:

- Business
- Technology
- Science
- Finance
- Demographics
- Social
- Market
- Regulatory
- etc.

tags: stanford, bus134, quiz, innovation, trends, bitcoin

Tuesday, January 12, 2016

Lunch Talk: (Authors at Google) Learning how to learn math and science

In A Mind for Numbers, Dr. Oakley lets us in on the secrets to effectively learning math and science—secrets that even dedicated and successful students wish they’d known earlier. Contrary to popular belief, math requires creative, as well as analytical, thinking. Most people think that there’s only one way to do a problem, when in actuality there are often a number of different solutions—you just need the creativity to see them. For example, there are more than three hundred known proofs of the Pythagorean Theorem. In short, studying a problem in a laser-focused way until you reach a solution is not an effective way to learn math. Rather, effective learning involves taking the time to step away from a problem and allowing the more relaxed and creative part of the brain to take over.

The creative aspect of learning math and science is somewhat similar to elements of creativity necessary for developing user scenarios in hard-core technology solutions.

Sunday, January 10, 2016

The paradox of "healthy food"

The "Lunch Talk" video I posted earlier today challenges a popular misconception that healthy food is expensive. The healthy food confusion is a version of a common human perception that expensive things or experiences are inherently better than inexpensive ones. For example, in experiments with differently labeled wines, people rate the "expensive" bottles as being of higher quality. In experiments with painkillers, people report that large, colorful, "expensive" pills work better than plain pills. The trade-off between quality and price seems fundamental to our understanding of how things work in the world.


Remarkably, there's nothing fundamental in either nature or technology that dictates good stuff should cost more than bad stuff. Moreover, major business breakthroughs happen when inventors deliver high-quality products and services at dramatically lower prices. For example, Henry Ford created a technology revolution when he introduced the Model T and the assembly line to manufacture the most reliable and most affordable automobile in history. Before him, people believed that reliable automobiles had to be expensive. Similarly, Amazon introduced a business model in which a company can inexpensively provide a great shopping experience with lots of choices, knowledgeable explanations, quality ratings, and fast, convenient delivery. Before Amazon, retailers believed that a high-quality shopping experience was only possible in high-end stores staffed by highly compensated employees. They were proven wrong, with dire consequences for their shareholders.

Today, businesses like Whole Foods and Sprouts are built on the assumption that healthy food must be expensive. Leanne Brown's book shows that this trade-off can be broken. As a result, we might see a revolution in many health-related areas, from retail food outlets to obesity prevention apps to government welfare services.

tags: health, trade-off, quality, innovation

Lunch Talk: (Authors at Google - Leanne Brown) Eat Well on $4/Day


Good and Cheap is an NYT-bestselling cookbook [by Leanne Brown] for people with very tight budgets, particularly those on SNAP/Food Stamp benefits. The free PDF has been downloaded more than 800,000 times, and a Kickstarter campaign for an initial print run brought in over $144,000 (it remains the #1 cookbook ever on Kickstarter).

tags: lunchtalk, health, culture

Thursday, January 07, 2016

Will Samsung write you a prescription and deliver your medicine?

At CES 2016 Samsung showed a number of wellness-related products, including the WELT:
The WELT communicates with your phone to tell you how many steps you've taken, how long you've been sitting, eating habits and your waistline size. It then sends the data to a specially-designed app for analysis, to tell you things like -- if you keep eating like you did today, you're going to gain 2 pounds this month. Samsung expects the WELT to go on sale this year.
If the product becomes a commercial success, it's easy to imagine how much historical data the company is going to collect across a broad range of demographic categories. Even if this particular product flops in the market, similar ones, e.g. made by Fitbit or Apple, will emerge over time. The key difference between Samsung and the others is that Samsung is now getting into pharmaceuticals. Here's a quote from a 2014 Bloomberg article:
South Korea’s biggest company is investing at least $2 billion in biopharmaceuticals, including the growing segment of biosimilars, which are cheaper versions of brand-name biotechnology drugs that have lost patent protection.

“We are in an infancy still,” Christopher Hansung Ko, chief executive officer at the Samsung Bioepis unit, said in an interview. “We are a Samsung company. Our mandate is to become No. 1 in everything we enter into, so our long-term goal is to become a leading pharmaceutical company in the world.”

Remarkably, Samsung has a chance to become the only company in the world capable of gathering real-time biological data, diagnosing diseases and delivering appropriate treatments to an individual at the right time, in the right place and at the right price.
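The kind of projection the WELT blurb describes ("if you keep eating like you did today, you're going to gain 2 pounds this month") can be sketched with back-of-the-envelope arithmetic. Below is a toy Python model; the 3,500 kcal-per-pound rule of thumb, the fixed daily surplus, and the function name are illustrative assumptions, not Samsung's actual algorithm:

```python
# Toy calorie-surplus projection, in the spirit of the WELT blurb.
# Assumptions (not Samsung's model): the common rule of thumb that
# ~3,500 kcal corresponds to 1 lb of body weight, and that today's
# surplus repeats unchanged every day.

KCAL_PER_POUND = 3500  # rule-of-thumb energy content of 1 lb


def projected_gain_lbs(daily_surplus_kcal: float, days: int = 30) -> float:
    """Linearly extrapolate today's calorie surplus over `days` days."""
    return daily_surplus_kcal * days / KCAL_PER_POUND


# A surplus of ~233 kcal/day projects to about 2 lbs in a month.
print(round(projected_gain_lbs(233), 1))
```

A real product would, of course, fold in activity levels, demographics, and historical measurements, which is exactly the data trove discussed above.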

tags: innovation, samsung, health, detection, tool, mobile

Wednesday, January 06, 2016

3D Printing - the new Clay Age

Consider a recent MIT Technology Review article about the latest 3D printing lab experiments. What is their importance to inventors, and what can we use to predict the evolution of this technology?


When we study history, especially the history of innovation, we conventionally speak of the Stone Age, the Bronze Age, the Iron Age, etc. At the core of each such description lies a wonder material — stone, bronze, iron, steel, silicon — something that enables a huge range of applications, which power technology developments for decades or even hundreds and thousands of years.

Paradoxically, there's no Clay Age (see fig below).

This is really unfortunate, because clay turned out to be the ultimate material: it served us humans for thousands of years and enabled us to produce an amazing range of objects and technologies: from bricks to construction and architecture, from jars to storage and shipping, from ceramics to chemistry and modern waterworks, from concrete to skyscrapers and highway transportation systems. From an inventor's perspective, I see clay-based technologies as the first example of what we today call additive manufacturing.

Let's go back a few thousand years and compare stone (Before) and clay (After) as manufacturing materials. If you live in a cave and use stone to make your tools, you have to chip away, blow by blow, the parts of the original piece of rock that don't fit your design.




Even if we consider "raw" rocks cheap and disregard the waste of material itself, our ability to shape the rock or change its internal physical structure is severely limited by what we can find in nature. By contrast, clay is extremely malleable: you can shape it, add filaments, make it hollow, make it solid, make it hard, glaze it, and much more. If you are a hunter, by combining clay and fire you can create all kinds of sharp weapons that your Stone Age competition can't even imagine. If you are a gatherer, you can create jars and jugs using one of the cornerstone inventions of human civilization: the Potter's Wheel.


If you are a house builder, even a primitive one, you can use mud bricks and reinforce them with straw. As you master fire and masonry, you learn how to make fired bricks and construct buildings that last decades and centuries, instead of years. You can even print money tokens with the appropriate clay technologies! Furthermore, with advanced firing techniques, you discover how to melt and shape metals and stumble upon important alloys, such as bronze. Ultimately, you develop communities of innovation and economies of scale unheard of in the Stone Age.

Why is thinking about the Clay Age important today, when we are well beyond using mud to build cities? The main goal is to gain insight into what additive manufacturing can do for us in the years to come. Just like clay, 3D printing represents a technology approach with promising long-term potential. That is, when working with either clay or 3D printing, instead of removing and wasting excess material, we add material and shape surfaces to achieve the desired designs. Luckily, for 3D printing we can leverage the lessons learned from clay.

Over thousands of years, humans learned to work with clay by combining six key modifying methods:
1. Shape - change the outer geometry (e.g. a brick).
2. Thin or thicken - change the inner geometry (e.g. a thin-walled jar).
3. Fill - change the inner structure (e.g. reinforced concrete).
4. Fire - modify inner and/or outer hardness or other material properties (e.g. a hardened stove brick).
5. Slip - modify or create an outer layer with specific properties (e.g. ceramic glazes).
6. Decorate - paint or add other exterior designs to make things aesthetically appealing.

With 3D printing, we are still working on items 1 and 2, barely touching 3. Some research labs are approaching item 4 on our list - firing, or its equivalents. For example, the MIT article mentioned at the beginning of the post describes the ancient sequence of clay-based technology: shape your piece from a soft material with special additives, then fire it in a kiln to achieve the desired hardness and durability. Remarkably, modern 3D printing combines an ancient material — ceramics — with modern design techniques — computer modeling and manufacturing.

In the short term, 3D printing went through a lot of hype that has fizzled somewhat by now. In the long term, the age of 3D printing, just like the Clay Age, is going to create a strong foundation for a broad range of human technologies. Basically, we are in the hunter-gatherer stage of our 3D printing evolution curve.

tags: technology, innovation, history, invention, creativity

Lunch Talk: Nanotechnology at work


A 2015 Nova documentary shows science and tech advances that power applications of nanotechnology in electronics, healthcare, optics, energy, and other fields.

tags: lunchtalk, technology, materials

Tuesday, January 05, 2016

Life sciences vs. computer science - a challenge for the 21st century

Investor Peter Thiel captures the core difference between bio and computer tech in his recent interview with MTR:
This goes back to that famous Bill Gates line, where he said he liked programming computers as a kid because they always did what he told them to. They would never do anything different. A big difference between biology and software is that software does what it is told, and biology doesn’t.

One of the challenges with biotechnology generally is that biology feels too complicated and too random. It feels like there are too many things that can go wrong. You do this one little experiment and you can get a good result. But then there are five other contingencies that have to work the right way as well. I think that creates a world where the researchers, the scientists, and the entrepreneurs that start companies don’t really feel that they have agency.
Unlike computer science, biology doesn't have an equivalent of the Church-Turing thesis, which, essentially, guarantees the implementability of a valid algorithm. The success of Silicon Valley is built on top of this important discovery of the 20th century. That is, once a "computation" entrepreneur, whether in software or hardware, finds a way to express a useful idea algorithmically, he or she can be sure that it will work, provided that computational power, storage, and networking capacity keep growing exponentially. Most famously, Larry Ellison created his relational database business in the mid-1970s, when people did not yet understand the implications of Moore's Law.
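The exponential-growth premise is easy to quantify with back-of-the-envelope arithmetic. A minimal sketch, assuming the conventional two-year doubling period (a rule of thumb, not an exact law), shows why a mid-1970s bet on software could count on hardware catching up:

```python
# Rough Moore's-Law arithmetic: capacity doubling every ~2 years.
# The 2-year doubling period is the conventional rule of thumb,
# used here purely for illustration.

def capacity_multiple(start_year: int, end_year: int,
                      doubling_years: float = 2.0) -> float:
    """How many times capacity grows between two years at a fixed doubling rate."""
    return 2 ** ((end_year - start_year) / doubling_years)


# From the mid-1970s to 2016: 40 years, i.e. 20 doublings.
print(capacity_multiple(1976, 2016))  # → 1048576.0, roughly a million-fold
```

Even if the real doubling period were three years instead of two, the growth over four decades would still exceed four orders of magnitude, which is the point of the argument above.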



Biology is different. Vernor Vinge, a science fiction writer, aptly calls our future successes in medicine "A Minefield Made in Heaven" because it's hard to predict the specific locations of the magical "mines" we are going to discover, each one curing a different disease.

Peter Thiel uses the word "random" to describe biology, but from a practical perspective it's actually worse than that. If biology were merely random, we could use known randomization techniques from computer science and make new biological discoveries almost by brute force. We can't. Therefore, I'd rather use a different term – arbitrary – and there's no algorithm for generating useful arbitrariness yet, only human ingenuity.

The good news is that some life sciences fields are compatible with computation. We are going to make a lot of progress in areas where we can hook up analog biological experiments to exponentially growing computing platforms. Diagnostics and pattern matching for known problems seem to be the most promising fields.

tags: biology, innovation, science, technology, silicon valley

Monday, January 04, 2016

Lunch Talk: (at Stanford) What they don't teach you about entrepreneurship



Part of the 2010 Conference on Entrepreneurship at the Stanford Graduate School of Business.

Description: A group of entrepreneurs talk about what they learned in the trenches that they never could have learned in a classroom. The panelists will also share the courses that were most helpful to them in their entrepreneurial ventures, the courses that they wished they had taken, and the topics that business schools should be teaching to aspiring entrepreneurs.

Sunday, January 03, 2016

Discipline and Punish, 21st century style

My morning Twitter feed brought together two seemingly unrelated articles:

1. The MIT Review: an overview of robotics trends for 2016.
2. The Economist: an article on the disappearance of middle managers in 2016 and beyond.


To get an insight into the long-term implications of these trends, first consider a quote from each article separately:

The Economist,
Existing systems will be replaced by new ones built on more fashionable qualities: speed and transparency. Companies will stop fussing about inputs (how people do things) and focus only on outputs (what they produce). They will be obsessed with data, losing all interest in anything that can’t be measured. Every employee will be monitored every second; every keystroke and click will be tracked and analysed. Some companies will go further and get white-collar workers to wear sensors that track all movements and measure their tone of voice and the number of steps they take.

The MIT Review,
Another trend to look out for this year is robots sharing the knowledge they have acquired with other robots. This could accelerate the learning process, instantly allowing a robot to benefit from the efforts of others (see “Robots Quickly Teach Each Other to Grasp New Objects”). What’s more, thanks to clever approaches for adapting information to different systems, even two completely different robots could teach each other how to recognize a particular object or perform a new task (see “Robots Can Now Teach Each Other New Tricks”).



The Economist talks about tracking and analysing employee performance data, including its physical aspects; the MIT Review describes a scenario where robots teach robots. Now, consider a case where we mix and match the two scenarios. That is, data obtained from monitoring humans (Economist) is used to teach robots (MIT Review). The combination would enable an easy transition from lab prototypes and small-scale production run by humans to large-scale production in robotic factories. Ultimately, it'll speed up innovation but will make lots of workers redundant.


Saturday, January 02, 2016

The new Digital Divide

The New York Times shows how mobile app designers devise new ways to grab teenagers' attention during the day:
Push notifications — those incessant reminders that make your phone light up and ding — are the infantry of app warfare, cracking the attention span to remind users that someone on the Internet might be talking about them. All summer Wishbone had been sending out alerts four times a day, but the three men were thinking about adding more and, now that students were back in class, trying to recalibrate around the school day. 

“Can we have a friends feed at noon?” Mr. Jones asked Mr. Vatere. “It would be great to do ‘Your friends have updated.’ ”

“And you talk about it while you’re at school,” Mr. Pham added.

What are the implications: not for the businesses and advertisers, which the NYT article discusses, but for the kids, their families, and society at large?

We already know that frequent interruptions worsen kids' learning performance. We also know that pre-teens and teens are becoming addicted to their mobiles. Given that well-funded, market-savvy mobile app developers keep creating new ways to target kids during school hours, we can predict a learning gap between kids who can manage their mobile distractions and those who cannot.

The old Digital Divide existed between people who had online access and those who did not. The underlying assumption was that the former were better off because they had access to all the information needed to learn effectively.

I believe the assumption is no longer valid. Having access to the internet all the time is becoming detrimental to learning. Arguably, it's worse than television because kids get bombarded with distractions and advertisements all the time, rather than only during leisure hours.

The new Digital Divide is going to emerge between those who can manage their online time and those who cannot. Online learning may even broaden this divide because it provides the motivated with greater opportunities to excel. Most likely, we are already seeing signs of things to come in the low completion rates at virtual universities — 3-5%: a few get huge benefits, while the majority does not. Paradoxically, online learning has become a natural selection environment for a generation of schoolchildren addicted to their ubiquitous social interactions.

tags: psychology, mobile, learning, virtual, media, advertisement