Microfactory

Factories typically follow a serial architecture. At the beginning of the chain is the raw material, which is then passed through a series of stations, each performing a specific task on it. This enables large volumes to be produced and opens up possibilities to optimize material flow and the tasks performed at each station. One drawback, however, is that it is difficult to manufacture different variants, let alone different products, on the same production line. Serialized manufacturing units are common across many industries.

Microfactories are a relatively new paradigm for manufacturing. The factory is split into cells, each equipped with a set of robots and/or human operators that perform a series of tasks. Think of a cell as a consolidation of several stations from the conventional paradigm. The robots can be programmed to build different products and carry various end-effectors for performing a set of tasks instead of just one. As a result, each cell can be programmed to build a whole line of products instead of a single one, giving the factory far more flexibility.
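As a toy sketch of the difference (the class and task names here are made up for illustration, not drawn from any real factory software), a serial station is hard-wired to one task, while a cell holds a library of task programs it can switch between:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Cell:
    """A microfactory cell: robots that can be reprogrammed per product."""
    programs: dict[str, list[Callable[[str], str]]] = field(default_factory=dict)

    def teach(self, product: str, tasks: list[Callable[[str], str]]) -> None:
        # Register a task sequence for a product; no physical line change needed.
        self.programs[product] = tasks

    def build(self, product: str, material: str) -> str:
        item = material
        for task in self.programs[product]:
            item = task(item)
        return item

# Two "tasks" as plain functions standing in for robot operations.
weld = lambda x: x + "+welded"
paint = lambda x: x + "+painted"

cell = Cell()
cell.teach("van", [weld, paint])        # one cell, ...
cell.teach("bus", [weld, weld, paint])  # ...multiple product programs
print(cell.build("van", "chassis"))     # chassis+welded+painted
```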

Arrival, a UK-based electric vehicle startup, is pioneering this concept, developing microfactories to build its vehicles.

Digital Twin

A digital twin is a collection of digital data representing a physical object. The concept was born within the engineering disciplines: a digital twin is created to optimize and test designs even before they are manufactured. The relationship between the digital twin and the physical object doesn’t end there. Once the object is manufactured, data and measurements are fed back to make the digital model more accurate.

In some cases, digital twins can be used to predict the life, wear, and fatigue of an object before they occur in the physical one.
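As a minimal sketch of that feedback loop (the class, numbers, and blending rule below are invented for illustration, not any particular vendor’s method), a twin can keep a simple wear model, update it from field measurements, and extrapolate remaining life before the physical part fails:

```python
class BearingTwin:
    """A toy digital twin of a bearing: a wear model refined by field data."""

    def __init__(self, wear_limit: float, wear_per_hour: float):
        self.wear_limit = wear_limit    # wear level at which the part fails
        self.wear_rate = wear_per_hour  # design estimate, refined over time
        self.wear = 0.0

    def ingest(self, measured_wear: float, hours: float) -> None:
        # Feed measurements from the physical bearing back into the twin.
        self.wear = measured_wear
        observed_rate = measured_wear / hours
        # Blend the design estimate with what the real object reports.
        self.wear_rate = 0.5 * self.wear_rate + 0.5 * observed_rate

    def hours_to_failure(self) -> float:
        # Predict remaining life before the part reaches its wear limit.
        return (self.wear_limit - self.wear) / self.wear_rate

twin = BearingTwin(wear_limit=1.0, wear_per_hour=0.001)
twin.ingest(measured_wear=0.3, hours=200.0)  # real wear is faster than designed
print(round(twin.hours_to_failure()))        # ~560 hours left, not the naive 700
```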

In the autonomous driving domain, the digital twin is extended beyond a single object. A digital world is created that mimics the physics and randomness of the real one, and it is used to train and improve the models that drive autonomously.

As the cost of computation and the hardware required for it comes down, there is a possibility that the digital twin of an object could be embedded within the object itself.

Robots that can feel

Our skin has many types of neurons that allow us to feel touch. These receptors are triggered by different stimuli but activate in the same way once triggered. Thermoreceptors fire on changes in temperature, nociceptors on pain, and mechanoreceptors on mechanical stress. These receptors send signals to the spinal cord and the brain to register a touch.

A combination of these receptors spread across an area fires in different patterns depending on the stimulus. This is how we are able to distinguish textures and types of touch.

In general, robots are very good at predictable motions that can be broken down into a set of axes and have a known distribution of forces. This is why a robot is used to weld different parts of a vehicle body, while a human operator installs an intricate wiring harness.

Tactile feedback would be a major improvement in the feedback loop for robots if they are to have even a fighting chance at learning more complex tasks. A new technology developed by a team at the University of Hong Kong allows robots to detect tactile inputs at super-resolution.

The system uses a flexible magnetized film as the skin and a printed circuit board as the structure. The film creates a magnetic field within the device, and subtle changes in that field are sensed to determine the touch.
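A minimal sketch of the idea (not the team’s actual method; the function and test data are illustrative): given a grid of field-change readings from such a skin, the coarse peak can be refined with a quadratic fit to localize a touch finer than the sensor spacing, which is what “super-resolution” refers to here.

```python
import numpy as np

def localize_touch(field_delta: np.ndarray) -> tuple[float, float]:
    """Estimate touch position from a grid of magnetic-field changes.

    field_delta: 2-D array, one field-change magnitude per sensing element.
    Returns (row, col) in fractional grid units: the coarse peak is
    refined with a quadratic fit, giving finer-than-grid resolution.
    """
    r, c = np.unravel_index(np.argmax(field_delta), field_delta.shape)

    def refine(lo, mid, hi):
        # Vertex of the parabola through three neighboring samples.
        denom = lo - 2 * mid + hi
        return 0.0 if denom == 0 else 0.5 * (lo - hi) / denom

    dr = (refine(field_delta[r - 1, c], field_delta[r, c], field_delta[r + 1, c])
          if 0 < r < field_delta.shape[0] - 1 else 0.0)
    dc = (refine(field_delta[r, c - 1], field_delta[r, c], field_delta[r, c + 1])
          if 0 < c < field_delta.shape[1] - 1 else 0.0)
    return (r + dr, c + dc)

# A synthetic 4x4 reading with a touch between elements (1,1) and (1,2).
reading = np.array([
    [0.1, 0.2, 0.2, 0.1],
    [0.2, 0.8, 0.9, 0.2],
    [0.1, 0.3, 0.3, 0.1],
    [0.0, 0.1, 0.1, 0.0],
])
print(localize_touch(reading))  # ~(1.04, 1.63): finer than the grid pitch
```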

Bone Conduction Earphones

Ludwig van Beethoven is rumored to have invented this technology. The famous composer, who was also deaf, would press one end of a rod against a piano while holding the other end in his mouth. The rod would transmit the vibrations, allowing him to hear. Rumor or not, that is the gist of how bone conduction works.

Sound is perceived by our ears through variations of pressure in the air. The eardrum vibrates with these pressure differences, transmitting the vibrations to a set of small bones and then to the fluid-filled cochlea.

Bone conduction uses the bones of the skull to transmit vibrations to the cochlea directly, bypassing the ear canal, the eardrum, and the small bones of the middle ear. Bone conduction earphones usually sit on the cheekbones. Isolation and audio quality on these headphones are poor because the ear is left fully exposed. However, they give people with hearing deficiencies and hearing loss another way to perceive sound.

Google Glass used bone conduction technology in its devices to transmit information to the wearer. X, Alphabet’s moonshot lab, is reportedly working on super-hearing technology that would allow the voices of specific individuals to be separated from those of a group.

Apple’s shift to ARM

ARM is both a company and an instruction set used in CPUs. ARM and x86 (used by Intel) are sets of instructions that a CPU can understand and execute. Which set a chip speaks depends on the architecture the silicon is built on, which is why instruction set and CPU architecture are often used synonymously. While Intel sells its CPU cores directly to manufacturers, ARM, the company (which might be acquired by Nvidia), licenses the standard so that other companies can design chips for their own devices. This enables manufacturers to build custom hardware tailor-made for their application. ARM also opens the door to heterogeneous computing, in which a CPU has different cores built for different applications. A common example is a device with a multi-core architecture that includes specific cores for running machine learning workloads.
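As a small illustration of the distinction, the Python standard library’s platform module can report which architecture the interpreter is running on (the exact strings returned vary by operating system):

```python
import platform

# platform.machine() reports the CPU architecture of the running system.
# On an Apple Silicon Mac it typically returns "arm64"; on an Intel Mac
# or a typical PC, "x86_64". The same Python source runs on both, but
# the compiled interpreter beneath it uses a different instruction set.
print(platform.machine())
```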

Apple’s shift to ARM processors for the Mac line-up was in the making for quite some time, even if not always in the form of chips for laptops. Apple has always been a proponent of building both the software and the hardware for its devices, and now it is going all the way. The company understood the synergies this would bring: substantial cascading effects on the feature set and user experience of its devices. But hardware is hard. Apple chose which components were strategic to its future products and invested in them, and it succeeded in doing so for the iPhone and iPad. These devices run on ARM processors designed by Apple but manufactured by other companies such as Samsung.

A moat around Apple devices is the high switching cost. Once you are in the ecosystem, it becomes increasingly hard to move to another platform. A big reason Apple was able to make this work is the continuity its devices offer; features like AirDrop gave seamless a new meaning. It just works. Moving all of its devices to ARM processors could mean saving even more development cost, since Apple no longer has to develop for different platforms. It can migrate features and functionality from the OS and drivers it has perfected over the years on the iPhone to the Mac line-up.

From a business point of view, owning the chip designs means Apple can save a lot. Intel integrated design and manufacturing and charged heavily for its design services. Now Apple can keep those margins for itself and outsource manufacturing to companies like TSMC.