Tag Archives: electronics

Google to acquire Dyson?

Back in 2014/2015, I wondered whether it would make sense for Google to acquire Dyson.

Growth of Alphabet/Google hardware presence

In order to keep their advantage in the search and data sphere, Google (now Alphabet) ramped up their presence in lots of emerging hardware spaces via acquisitions such as Motorola, Boston Dynamics and Nest Labs. Also, Google has developed their own technology innovations at Google X (now simply X), such as the world-leading autonomous vehicle company, Waymo.

In order to fully commercialise such acquisitions and innovations, Google needed to have access to an abundance of world-class hardware product development and marketing experience.

Google took a step towards this in 2014 when they acqui-hired a California-based design firm called Gecko Design. However, I believed Gecko Design was not big enough to fill this void alone.

This left me wondering whether Dyson would be a good fit to help satisfy this need for design engineering firepower.

Dyson’s common interests with Google

Dyson was rumoured to be working on an electric car after it acquired battery company Sakti3 (which has now been publicly confirmed) and also invested £5m with my alma mater, Imperial College, to develop next-generation robots, resulting in the Dyson Robotics Lab.

Given Alphabet’s world-leading autonomous vehicle project, Waymo, and its previous interest in robots, I thought that an acquisition of Dyson would give Alphabet/Google a huge advantage thanks to Dyson’s massive team of 4,800 design engineers.

Dyson and Alphabet have other visions of the future in common. One notable example is Halo (see right), an early Dyson prototype of a Google Glass-type headset that they built 10 years before Google launched theirs!

Would Sir James Dyson sell?

As of 2018, a tie-up between the two companies has not emerged. In many ways this is unsurprising: Sir James and his family appear to own 100% of Dyson, so why give up control? (On that topic, there is a great interview with Sir James on NPR’s How I Built This podcast about how he grew his business, which explains how he managed to retain full ownership.)

Also, Sir James is a vocal advocate of keeping engineers in Britain and growing British talent to boost industry and our economy.

His leadership on this issue includes launching his own university with a £15m investment, called the Dyson Institute of Engineering and Technology, and his £12m donation to Imperial College to launch the Dyson School of Design Engineering (as well as the previously mentioned Dyson Robotics Lab).

In short, I’m not going to hold my breath for this one. However, it will be fascinating to see how Google and Dyson both fare in the autonomous and electric vehicle markets. Perhaps future collaboration or a joint venture could be on the cards?

MirrorMirror: booth-based 3D scanner for online shopping

During the final year (2007-08) of my Physics degree at Imperial College, we studied a module called Research Interfaces (RI). This was a team-based module that focussed on transforming scientific research into commercial business propositions.

This was a highlight of the degree for me: I loved the collaborative nature of it and the entrepreneurial challenge was much more aligned with how I wanted to live my future life.

Our product design: MirrorMirror

Our team designed a product with the working name of MirrorMirror. It was a booth containing a network of cameras with a central computer that would stitch together the images to create a 3D scan model of the user’s body.

This would then be used to generate an avatar that would help them choose clothes that fit and suit them perfectly when shopping online.
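
The course never required us to write the reconstruction software, but a minimal sketch of the central computer’s job, merging point data from several calibrated cameras into one body point cloud, might look like this. The calibration matrices, point data, and height calculation below are all illustrative assumptions rather than part of the original design.

```python
import numpy as np

def to_booth_frame(points_cam, R, t):
    """Transform a camera-frame point cloud (N x 3) into the shared
    booth coordinate frame using that camera's extrinsics (R, t)."""
    return points_cam @ R.T + t

# Illustrative calibration for two of the booth cameras:
# each entry is (rotation matrix, translation vector) measured during setup.
calibration = {
    "cam_front": (np.eye(3), np.array([0.0, 0.0, 0.0])),
    "cam_side": (np.array([[0.0, 0.0, 1.0],
                           [0.0, 1.0, 0.0],
                           [-1.0, 0.0, 0.0]]),
                 np.array([1.5, 0.0, 1.5])),
}

# Illustrative per-camera scans: N x 3 arrays of surface points in metres.
scans = {
    "cam_front": np.random.rand(1000, 3),
    "cam_side": np.random.rand(1000, 3),
}

# Merge every camera's scan into a single body point cloud.
body_cloud = np.vstack([
    to_booth_frame(scans[name], R, t) for name, (R, t) in calibration.items()
])

# A crude measurement from the merged cloud, e.g. overall height,
# which could feed into the sizing avatar.
height = body_cloud[:, 2].max() - body_cloud[:, 2].min()
print(f"Estimated height: {height:.2f} m")
```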

Additionally, they could see their body on a screen in real time with different clothing options projected over the image as if they were wearing it (so-called “Augmented Reality”). This reminded us of the magic mirror from the Disney film, Snow White (hence the name MirrorMirror).

There could also be other uses like tracking weight loss for dieters and muscle gain for bodybuilders (if a new scan was made regularly to show the incremental changes) or the visualisation of the results of cosmetic surgery.

Technical Design

We produced several outputs for the class including this Technical Design Review.

In that document, we estimated the cost to build the prototype at £1.45m, a future manufacturing cost per booth of £13,900, and a price point of £50,000.

These figures are exceptionally high, and I believe that is because the course did not actually require us to do any prototyping work. If it had, I think we would have focused on finding a cheaper way to execute the plan.

Our original design required a screen behind a half-silvered mirror. I think in 2018 this would no longer be required, as screens are now of incredible quality and image-processing technology has come on exponentially in the last decade.

User experience

We believed there were many lucrative high-end markets (such as wedding dresses, evening wear and saris) where a quicker and less stressful garment trial process would greatly add to the shopping experience.

Our team also saw the potential for future uses, such as generating an accurate avatar of the person that could be used as a little virtual model for the clothes being selected. Imagine being emailed a picture of yourself wearing the latest items from your favourite designer, along with a link to buy exactly the right size for you.

We envisioned that booths could be installed in shopping centres, allowing customers to create a 3D image of themselves which they could then use to shop online. Additional lucrative applications could also include high-fashion hairdressing.

Our plan of the user journey is mapped in the image below:

User Journey for MirrorMirror

Business Case and Financial Model

You can see the basic financial model we generated here: MirrorMirror Costing.

When I say we, it was actually me who had responsibility for putting it together, and I should have circulated a draft to my team-mates before the deadline so we could have had more eyes on it before submission. We got our lowest grade by far for this part of the module, so I did feel a bit guilty! However, it was apparently the same for all the other teams, so my guilt was slightly assuaged.

After 10 years working in and around startups and scaleups, here are what I see as the big errors and omissions:

  • No time series for the values (everything is static)
  • Lag time between initial burn and revenue
    • A proper cash-flow model would have helped clarify this (see the sketch after this list)
  • Significant errors on the business model (i.e. how we could get paid)
    • For example, would we really want to make money on the hardware, or would we prefer to make money on the service provided by the software (i.e. charge for every image processed – a digital version of the Nespresso model)?
  • No R&D tax credits, Government grants, or other potential subsidies included
  • No marketing and sales budget included at all!
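
To make the first two points concrete, here is a minimal sketch of the kind of monthly cash-flow model the original spreadsheet lacked. Every figure below is illustrative rather than taken from our costing.

```python
# Minimal monthly cash-flow sketch: every figure is illustrative,
# not a number from the original MirrorMirror costing.
months = 36
monthly_burn = 120_000        # salaries, rent, prototyping (£ per month)
booth_cost = 13_900           # manufacturing cost per booth (£)
booth_price = 50_000          # sale price per booth (£)
launch_month = 18             # first revenue only after the prototype phase
booths_per_month = 4          # flat sales assumption for simplicity

cash = 1_450_000              # initial funding (the prototype budget)
history = []
for month in range(1, months + 1):
    cash -= monthly_burn
    if month >= launch_month:
        cash += booths_per_month * (booth_price - booth_cost)
    history.append(cash)

trough = min(history)
print(f"Lowest cash position: £{trough:,.0f} in month {history.index(trough) + 1}")
print(f"Cash position at month {months}: £{history[-1]:,.0f}")
```

Even this toy version makes the lag between burn and revenue visible, which the static spreadsheet never could.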

It is quite satisfying to look at old work such as this and compare it with what I have learned since then!

Final Pitch

At the end of the 3-month module, we had to deliver a pitch to a packed auditorium and a simulated panel of investors (made up of professors from the Business and Physics departments that ran the course).

You can see our final pitch document here.

This was a really enjoyable part of the course. I delivered it with 2 other teammates and we got everyone in the team up on stage for the Q&A at the end.

Outcome

We actually won the Elevator Pitch Prize at the end of the module, which was a very satisfying way to end the project, and we all received a good first for the course (>85%).

We entered the university-wide Business Challenge entrepreneurship competition, but we didn’t get past the initial screening phase. After that, we all agreed to disband the project outside of the RI module and not take it any further.

What didn’t we do?

It is quite telling that we didn’t build a prototype!!!

The reason we didn’t build anything is that we didn’t have anyone who was super-focused on the tech side, i.e. someone who could have acted as a CTO. I also believe it was because we all saw this as a purely academic exercise rather than a true opportunity to start an entrepreneurial endeavour and make a return from it.

Tinkering with a prototype would have helped us see the true costs, the challenges around manufacturing, and the gaps in the business model. In fact, IDEO’s Design Thinking methodology (diagram below) expressly integrates prototyping into the design process. This project was perfect evidence of why that is the case.

I wonder if the Blackett lab requires the students on the RI course to build a prototype as part of the course nowadays?

Design Thinking diagram (source: IDEO, via Mydhili Bayyapunedi)

WaterAlert: Plant Moisture Sensor

Back when I was 15 years old, I won a Design Technology – Systems & Control prize at my school for my work on the design process around this little product I came up with called Water Alert (see photo, left).

It was a moisture detection probe that was designed to be inserted into the soil of a pot plant and provide feedback to the gardener about when it needed to be watered.

The end result that I manufactured wasn’t high quality, as you can see (!), but I remember really enjoying the design process, and that enthusiasm, combined with the diligence I put into preparing the documentation, won me the prize.

Nowadays, you can buy something virtually identical as a toy kit for kids to build themselves. It’s called the Thirsty Plant Kit (see photo, right).

This got me thinking about how I was able to win a school prize more than 15 years ago with something as simple as the design for a toy whose circuit contains only two transistors.

What sort of amazing school projects can kids build in the age of 3D printing, Arduino, littleBits, Raspberry Pi, and the multitude of online resources and guides?
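
For comparison, the software side of a modern WaterAlert would be only a few lines of Python on a Raspberry Pi reading the digital output of an off-the-shelf soil-moisture module. The GPIO pin, polling interval, and the assumption that the module’s output goes HIGH when the soil is dry are my own choices here, not the spec of any particular product.

```python
import time
import RPi.GPIO as GPIO

# Digital output of a cheap soil-moisture module. On many boards the
# comparator output goes HIGH when the soil is drier than the threshold
# set by the onboard potentiometer (check your module's datasheet).
MOISTURE_PIN = 17  # arbitrary BCM pin choice

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOISTURE_PIN, GPIO.IN)

try:
    while True:
        if GPIO.input(MOISTURE_PIN):
            print("Soil is dry: time to water the plant!")
        else:
            print("Soil moisture is fine.")
        time.sleep(60)  # check once a minute
finally:
    GPIO.cleanup()
```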

Secret History of Silicon Valley: Steve Blank

Below is an amazing lecture from Steve Blank on the history of Silicon Valley.

As military funding was a big part of that history, the majority of the talk covers the role of electronic warfare in World War II and the Cold War.

Steve’s Secret History site shares the full slide deck and more.

Some interesting highlights from the talk:

  • World War II was the first electronic war – the German air defence even had radar-guided flak guns!
  • The ground-facing radar on Allied bombers that was designed to help identify targets was used by Germany to track them (and so was the radar warning receiver on their tails)
    • This shows the cat-and-mouse game of measures and counter-measures in electronic warfare
  • Allied bomber formations would throw out clouds of aluminium foil “chaff”, cut to exactly half the wavelength of the German radar signal, to swamp the radar with false reflections.
  • Fred Terman of Stanford moved East during the war to run the Harvard Radio Research Lab
  • He hired 11 colleagues from the Lab to join him at Stanford when he returned. Together they made Stanford the “MIT of the West”
  • Heretically for the time, he encouraged faculty to sit on tech company boards and his graduate students to leave and start companies (for example, Hewlett and Packard)
  • The Cold War became an electronics war as well
  • The USA used the moon to pick up reflected Soviet radar signals and map out the locations of their radar bases
  • CIA and NSA would fund big radio dishes for universities like Stanford as a result
  • Shockley came back to Palo Alto. He was a great researcher and talent spotter but a terrible manager
  • The “Traitorous Eight” left to start Fairchild Semiconductor and a suite of companies formed in the resulting ecosystem
  • The US military “primed the pump” as the first customer for tech entrepreneurship in the Valley.
  • But in the late 1970s, the US Government slashed capital gains tax and told pension funds they could invest up to 10% of their assets in VC firms.
    • As a result, inflows to VC firms rose by an order of magnitude and Silicon Valley became a hotbed of for-profit innovation

Protecting investors against earthquake risk in Silicon Valley

I’ve often wondered what would be the impact on companies in Silicon Valley when the inevitable earthquake hits. Turns out I’m not the only one.

Earthquakes in the Bay Area: a “ticking time bomb”

The Bay Area is subject to a major earthquake roughly every 145 ± 60 years on average. Given that it is 150 years since the 1868 Hayward earthquake (known at the time as the “Great San Francisco Earthquake”), the next “big one” could happen any day now.

Apparently, about 2 million people live on the Hayward Fault and 7 million are in the surrounding area. A magnitude 7 quake would cause damage in the range of $95 to $190 billion, which would be a disaster for the citizens of the area.

Impact on the tech giants

However, my curiosity centres on the impact on the giant tech corporations that are based in Silicon Valley and the wider Bay Area. Companies like Google, Facebook, Oracle, and Salesforce have their HQs and major footprints in the region, so they would be adversely affected by a natural disaster.

It doesn’t seem like they are particularly well prepared for such an event, according to this report. Although most of the companies have data centres and operations distributed around the world, an earthquake could still disrupt their main offices and therefore the leadership of the business.

Since these are listed companies, this represents a real risk for their shareholders. Could their share prices, or even the whole NASDAQ, take a tumble if a major earthquake hits the Bay Area? After the 9/11 terror attacks, the Dow dropped by 14%, so this is not unthinkable.

However, I think the impact goes beyond just their own businesses. The services provided by these tech titans represent critical infrastructure for many European and American businesses, so any disruption could have a huge wider impact.

Early warning: a vital tool to prevent damage

Scientists are getting better at detecting earthquakes early. In Silicon Valley, there will soon be an app called QuakeAlert that can give between 2 and 20 seconds of warning of an impending earthquake.

This might not sound like much, but even 2 seconds can be long enough for Internet of Things (IoT)-enabled devices to perform vital preparations, such as: opening the doors of fire stations to prevent fire engines getting stuck; isolating certain parts of the electricity, water, and gas networks; slowing down trains; and telling elevators to open their doors at the closest floor.
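
As a rough illustration of how little logic such a response needs, here is a sketch of a handler reacting to an early-warning message. The message format, thresholds, and actions are my own assumptions, and a real deployment would drive relays or a building-management system rather than print statements.

```python
# Illustrative early-warning handler. The message format and thresholds
# are assumptions; the actions are placeholders for relay/PLC commands.
example_alert = {"magnitude": 6.1, "seconds_until_shaking": 8}

def handle_early_warning(alert):
    """React to an earthquake early-warning message."""
    if alert["seconds_until_shaking"] < 2 or alert["magnitude"] < 5.0:
        return  # too little warning time, or shaking too weak to act on

    print("Opening fire station doors...")
    print("Isolating vulnerable sections of the gas and water networks...")
    print("Requesting trains to slow down...")
    print("Sending elevators to the nearest floor and opening their doors...")

handle_early_warning(example_alert)
```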

Solution: seismic sensor network to short the NASDAQ

Could it be possible to set up a network of seismic sensors to warn when an earthquake was just about to hit the Bay Area and then send an order to a trading algorithm that could short the NASDAQ?
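
A crude sketch of the idea might look like the following. The ground-motion readings and the broker call are hypothetical stand-ins (place_order is not a real API), and the thresholds are arbitrary.

```python
# Sketch only: the ground-motion readings and broker call are hypothetical
# stand-ins, and the thresholds are arbitrary.
SHAKING_THRESHOLD = 0.1   # peak ground acceleration (in g) that counts as "strong"
MIN_STATIONS = 3          # require several sensors to agree, to avoid false alarms

def place_order(symbol, side, quantity):
    """Hypothetical broker call; a real system would use an actual trading API."""
    print(f"Order sent: {side} {quantity} x {symbol}")

def on_sensor_readings(readings):
    """readings maps station id -> latest peak ground acceleration (g)."""
    triggered = [station for station, pga in readings.items()
                 if pga >= SHAKING_THRESHOLD]
    if len(triggered) >= MIN_STATIONS:
        # Short a NASDAQ-100 tracker as a proxy for "shorting the NASDAQ".
        place_order("QQQ", "SELL_SHORT", 1000)

# Example: three Bay Area stations report strong shaking at the same time.
on_sensor_readings({"BRK": 0.24, "SFO": 0.18, "OAK": 0.15, "SJC": 0.04})
```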

A similar system could be used to create an early warning for tsunamis. One candidate is the mega-tsunami that geologists once predicted could be created by a volcanic eruption in the Canary Islands, which would devastate the north-eastern US coast (although further review of the original study showed that this is a worst-case scenario and probably will not happen for another 10,000 years at the earliest).

LightningMaps.org: community lightning detection and mapping project

After three nights in a row of awesome thunderstorms at the end of May 2018, I became fascinated by the work people are doing to study them, such as the USA’s National Severe Storms Laboratory (NSSL) and the great community initiative below.

I found an excellent real-time map of lightning strikes. Check it out here on LightningMaps.org.

Turns out that it is a community project of citizen scientists, which in my mind makes it even cooler!

It even shows the thunder wavefront so you can see when the sound wave of the lightning should hit your area. Genius!

The community project behind it, Blitzortung.org, is a network of people around the world who have set up more than 500 detectors (currently priced at under €300 each) and upload their data to central processing servers.

The detectors are VLF (“Very Low Frequency”) receivers, and strikes are located using the time of arrival (TOA) and time of group arrival (TOGA) methods.
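
Blitzortung’s real processing is far more sophisticated, but the basic TOA principle, locating the strike from the differences in arrival times at several stations, can be sketched with a brute-force least-squares fit. The station positions and timings below are made up.

```python
import numpy as np

C = 299_792.458  # propagation speed of the VLF signal, km/s (approx. speed of light)

# Hypothetical receiver stations (x, y in km on a local flat grid) and the
# times (in seconds) at which each one detected the same sferic.
stations = np.array([[0.0, 0.0], [300.0, 20.0], [150.0, 280.0], [-100.0, 200.0]])
true_strike = np.array([120.0, 90.0])
arrival_times = np.linalg.norm(stations - true_strike, axis=1) / C

def locate(stations, arrival_times, grid_step=1.0, extent=500.0):
    """Brute-force TOA fit: find the grid point whose predicted arrival-time
    differences best match the observed ones (least squares)."""
    xs = np.arange(-extent, extent, grid_step)
    ys = np.arange(-extent, extent, grid_step)
    gx, gy = np.meshgrid(xs, ys)
    # Distances from every grid point to every station: shape (ny, nx, n_stations)
    d = np.sqrt((gx[..., None] - stations[:, 0]) ** 2 +
                (gy[..., None] - stations[:, 1]) ** 2)
    # Work with time differences relative to station 0, so the unknown
    # emission time of the strike cancels out.
    predicted_dt = (d - d[..., :1]) / C
    observed_dt = arrival_times - arrival_times[0]
    error = ((predicted_dt - observed_dt) ** 2).sum(axis=-1)
    iy, ix = np.unravel_index(error.argmin(), error.shape)
    return xs[ix], ys[iy]

print("Estimated strike position (km):", locate(stations, arrival_times))
```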

You can see a coverage map of the global network here.

The general documentation for the project and the assembly instructions for the current generation of their detectors, “System Blue”, give a useful background on the science behind the project and what it takes to get involved.

To support their network, you can make a small donation via PayPal or credit card here.