The Lindy effect is a phenomenon whereby the future life expectancy of an item increases with each year it survives. Counter-intuitive as it may seem, there are a few explanations for this effect. The effect mainly applies to non-perishable items: items whose expiration is not governed by unstoppable physical forces. Time plays an important role here. The longer an item stays in play, the better fine-tuned it becomes, and the better equipped it is to survive the coming times.
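A minimal sketch of this property, assuming item lifetimes follow a heavy-tailed Pareto distribution (the distribution and its parameters are illustrative assumptions, not something from the text): among survivors, the typical remaining lifetime grows with age.

```python
import random

random.seed(42)

# Illustrative assumption: lifetimes drawn from a heavy-tailed Pareto
# distribution. Such distributions exhibit the Lindy property: the longer
# something has already survived, the longer its typical remaining life.
lifetimes = [random.paretovariate(2.0) for _ in range(200_000)]

def median_remaining_life(lifetimes, age):
    """Median remaining lifetime among items that survived past `age`."""
    remaining = sorted(t - age for t in lifetimes if t > age)
    return remaining[len(remaining) // 2]

print(median_remaining_life(lifetimes, 10))  # roughly 0.41 * 10
print(median_remaining_life(lifetimes, 20))  # roughly 0.41 * 20, i.e. larger
```

For a Pareto lifetime, the conditional remaining life scales with the age already reached, which is exactly the Lindy claim; for a light-tailed distribution (say, exponential or normal) the same experiment would show no such growth.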
Books are a good example of where this effect applies. If a book has been reprinted for the last 40 years, there is a good chance it will be reprinted for the next 40. The explanation is that the book's content has found appeal with a demographic that is renewed over time, even if the medium changes from print to Kindle to audiobooks.
Items or systems that have built a self-reinforcing loop around themselves tend to follow the Lindy effect. Businesses with moats are an example: they have a competitive advantage that compounds over time. Such businesses become harder to kill as time passes.
Choosing what to read can be guided by the Lindy effect. Start with books that are foundational to the area; unsurprisingly, those books have been around for some time. The same applies when doing a literature review for research: start with survey papers. They, by definition, cannot go out of trend, hence their popularity.
Factories typically follow a serial architecture. At the beginning of the chain is the raw material, which is then passed through a series of stations that each perform a specific task on it. This enables large volumes to be produced and opens up possibilities to optimize material flow and the tasks performed at each station. One drawback, however, is that it is difficult to manufacture different variants, let alone different products, on the same production line. Serialized manufacturing units are common across many industries.
Microfactories are a relatively new paradigm for manufacturing units. The factory is split into cells, each equipped with a set of robots and/or human operators that perform a series of tasks. Think of a cell as a consolidation of a few stations from the conventional paradigm. The robots can be programmed to build different products and carry various end-effectors for performing a set of tasks instead of just one. The result is that each cell can be programmed to build a whole line of products instead of just one, without compromising on flexibility.
Arrival, a UK-based electric vehicle startup, is pioneering this concept, developing microfactories to build its vehicles.
A digital twin is a collection of digital data representing a physical object. The concept was born within the engineering disciplines: a digital twin is created to optimize and test designs even before they are manufactured. The relation between the digital twin and the physical object doesn't end there. Once the object is manufactured, data and measurements are fed back to make the digital model more accurate.
In some cases, digital twins can be used to predict the life, wear and fatigue of an object before they occur in the physical object.
In the autonomous-driving domain, the digital twin extends beyond a single object: an entire digital world is created to mimic the physics and randomness of the real one, and is used to train and improve models that drive autonomously.
As the cost of computation and of the necessary hardware comes down, the digital twin of an object could eventually be embedded within the object itself.
Complexity theory and related concepts emerged quite recently, towards the late 20th century. Complexity here means diversity: multiple concurrent interdependencies between the elements of a system. Human cells are in themselves a very complex system, yet they can self-organize, form groups, create variations, and give rise to even more complex systems such as a human being.
Complexity theory studies complex systems: systems composed of many components that interact and depend on each other in different ways. Key concepts include systems, complexity, networks, nonlinearity, emergence, self-organization, and adaptation.
A business is also a highly complex system. It comprises complex individuals, both in and around the business, interacting with it simply by creating products and selling or buying them.
InterPlanetary File System (IPFS) is a protocol that can be used to store and share data in a peer-to-peer system, similar in role to HTTP, the protocol used on the internet to share and store information. One problem with HTTP is that the computer requesting a webpage has to establish a direct connection to the server storing it. Transactions are therefore serial and single-sourced.
IPFS, on the other hand, can download a file from multiple sources on the network. IPFS computes a cryptographic hash for each block within a file, and these blocks can be stored in many different locations. IPFS uses a decentralized naming system, IPNS (the equivalent of DNS for HTTP), to map human-readable names to these cryptographic hashes. This allows the requesting node to download several pieces of a file simultaneously from different locations.
IPFS essentially decouples the content from its address on the network. This allows the requesting node to download a file from the nearest possible locations instead of being bottlenecked by a single one.
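The core idea of content addressing can be sketched in a few lines. This is a toy, in-memory model only: real IPFS uses multihashes, Merkle DAGs and a distributed hash table, whereas here a plain dict stands in for the network and SHA-256 hex digests stand in for content identifiers.

```python
import hashlib

BLOCK_SIZE = 4  # tiny on purpose, so a short message splits into blocks

store = {}  # hash -> block; in IPFS these blocks live on many peers

def add_file(data: bytes) -> list[str]:
    """Split data into blocks, store each under its hash, return the hashes."""
    hashes = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        store[h] = block          # any peer holding this block can serve it
        hashes.append(h)
    return hashes

def get_file(hashes: list[str]) -> bytes:
    """Fetch each block by hash (from any peer) and verify it on arrival."""
    blocks = []
    for h in hashes:
        block = store[h]
        # The address IS the hash of the content, so every block
        # can be verified independently, whoever served it.
        assert hashlib.sha256(block).hexdigest() == h
        blocks.append(block)
    return b"".join(blocks)

manifest = add_file(b"hello ipfs world")
print(get_file(manifest))  # b'hello ipfs world'
```

Because each block is addressed by its own hash, the requester does not care which peer serves it: a tampered block simply fails verification.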
Berkson's Paradox is a type of sampling bias. It can lead experimental studies to conclude that two events are related when they actually are not. It was first identified in case-control studies.
A study by Sackett (1979) portrays this paradox quite well. He wanted to study the presence or absence of respiratory disease and locomotor disease, and took two samples: one from the general community population and one from a hospital. The hospital sample suggested that a person with a respiratory disease is much more likely to also have a locomotor disease. But this is not true. The correlation emerges because patients admitted to the hospital are more likely to have both diseases, while the part of the population that has neither disease is left out of the sample entirely.
Looking at the results from the community sample, it was clear that there is no correlation between the two diseases.
Control-group-based studies are common well beyond healthcare, especially in studies that evaluate industry and consumer trends. It is important to look at where the data comes from, i.e. who is in the sample population and how that relates to the bigger picture.
Blockchains are now widely used to perform transactions between two parties, where the blockchain ensures the validity of each transaction and records it in a distributed ledger. However, for a transaction to complete, all the individual peer copies of the ledger must be updated, which can be very time- and resource-intensive.
Enter the Lightning Network.
The Lightning Network is a protocol that works on top of a blockchain and allows transactions to happen off the chain. The protocol lets two parties open a payment channel that is validated by the blockchain. Once a payment channel is set up, all transactions between the two parties happen near-instantaneously; the record of each transaction only needs to be updated in a certificate tied to the channel. When the payment channel is closed, the result is recorded on the main blockchain.
For example, a customer can keep a payment channel open with a coffee shop and use it to pay for coffee every day. When the channel is closed, say once a year, the records are ratified and added to the main blockchain.
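The coffee-shop example can be sketched as a toy channel object. This captures only the accounting idea: the real Lightning Network uses multisig funding transactions, hashed timelock contracts and penalty mechanisms, none of which are modeled here.

```python
class PaymentChannel:
    """Toy payment channel: many off-chain payments, one on-chain settlement."""

    def __init__(self, deposit_customer: int, deposit_shop: int):
        # On-chain step: both parties lock up funds to open the channel.
        self.balances = {"customer": deposit_customer, "shop": deposit_shop}
        self.ledger_entries = 0  # transactions that hit the main chain

    def pay(self, payer: str, payee: str, amount: int):
        # Off-chain step: instant, just an update to the channel state.
        assert self.balances[payer] >= amount, "insufficient channel funds"
        self.balances[payer] -= amount
        self.balances[payee] += amount

    def close(self) -> dict:
        # On-chain step: only the final balances are settled on the blockchain.
        self.ledger_entries += 1
        return dict(self.balances)

channel = PaymentChannel(deposit_customer=100, deposit_shop=0)
for _ in range(20):                  # a coffee a day, all off-chain
    channel.pay("customer", "shop", 4)
final = channel.close()              # one on-chain settlement for 20 coffees
print(final)  # {'customer': 20, 'shop': 80}
```

Twenty payments cost exactly one main-chain update, which is the whole point of moving transactions off-chain.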
What the internet together with the smartphone has done is democratize accessibility. Any website, article, tweet, video, image, product, or service on the web can be reached by anyone in the world. But this has taken the geographical localization out of the experience of buying something or experiencing a product.
In most cases it is more efficient to order food in an app than to physically go to a restaurant and pick up an order. The same applies to e-commerce, flight and hotel bookings, and so on. Businesses that rely on a hyperlocal aspect still need a network of employees to run their operations, whether it is the delivery/logistics arm of Amazon or distributed pockets of people ready to rent out a space to strangers.
The scalability of such systems on top of a stack like the internet is huge. Nothing new here. But it means that, for a given reward structure, a biggest fish WILL emerge in almost every pond, not completely extinguishing its competitors but leaving enough room for bespoke peers in the market.
Amazon, for example, is followed by a long tail of niche e-commerce websites, just as Google is followed by the rest.
Our skin has many types of neurons that allow us to feel touch. These receptors come in different kinds: each is triggered by a different stimulus, but all activate the same way once triggered. Thermoreceptors fire on changes in temperature, nociceptors on pain, and mechanoreceptors on mechanical stress. These receptors send signals to the spinal cord and the brain to register a touch.
A combination of these receptors spread across an area fires in different patterns depending on the stimulus. This is how we are able to distinguish textures and types of touch.
In general, robots are really good at predictable motions that can be broken down into a set of axes and have a known distribution of forces. This is why a robot welds the different parts of a vehicle body while a human operator installs an intricate wiring harness.
Tactile feedback would be a large improvement in the feedback loop for robots if they are to have even a fighting chance of learning more complex tasks. A new technology developed by a team at the University of Hong Kong allows robots to detect tactile inputs at super-resolution.
The system uses a flexible magnetized film as the skin and a printed circuit board as the structure. The film creates a magnetic field within the device, and subtle changes in that field are sensed to determine the touch.
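The general idea behind super-resolution sensing can be illustrated with a toy model. Everything here is an assumption for illustration, not the published design: four sensors spaced one unit apart, a Gaussian falloff of signal with distance to the contact, and a signal-weighted centroid to localize the touch more finely than the sensor spacing.

```python
import math

SENSOR_POSITIONS = [0.0, 1.0, 2.0, 3.0]  # sparse 1-D sensor grid (assumed)

def sensor_response(sensor_x: float, touch_x: float) -> float:
    """Assumed Gaussian falloff of signal with distance to the touch point."""
    return math.exp(-((sensor_x - touch_x) ** 2) / (2 * 0.5 ** 2))

def localize(readings: list[float]) -> float:
    """Signal-weighted centroid: recovers the touch at sub-sensor resolution."""
    total = sum(readings)
    return sum(x * s for x, s in zip(SENSOR_POSITIONS, readings)) / total

touch = 1.4  # true contact point, between sensors at 1.0 and 2.0
readings = [sensor_response(x, touch) for x in SENSOR_POSITIONS]
print(localize(readings))  # close to 1.4 despite the 1.0 sensor spacing
```

Because each sensor's signal varies continuously with distance to the contact, combining several readings pins down the contact point far more precisely than the physical sensor pitch, which is what "super-resolution" refers to.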
Market capitalization, aka market cap, is the total market value of a business. It is calculated by multiplying the current number of shares outstanding by the price of one share. For example, a company with 1,000 shares worth $10 each has a market cap of $10,000.
Companies are usually classified based on their market cap as small-cap, large-cap, and so on. Microcap refers to companies with much smaller capitalizations; in the U.S. it means a market cap of roughly $50-300M.
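The arithmetic and the classification can be written out directly. The micro-cap band is the one from the text; the other cut-offs are common U.S. conventions added here for illustration.

```python
def market_cap(shares_outstanding: int, share_price: float) -> float:
    """Market cap = shares outstanding x price per share (in USD)."""
    return shares_outstanding * share_price

def classify(cap: float) -> str:
    """Rough U.S. size buckets; only the 50-300M micro-cap band is from the text."""
    if cap < 50e6:
        return "nano-cap"
    if cap < 300e6:
        return "micro-cap"
    if cap < 2e9:
        return "small-cap"
    return "mid/large-cap"

print(market_cap(1000, 10))                   # 10000, the example from the text
print(classify(market_cap(20_000_000, 8.0)))  # $160M of market cap -> micro-cap
```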
Microcap companies are not that popular, nor are they well-covered by mainstream media, so the volumes traded in these stocks are quite low. This can be risky, as it might be hard to enter or exit large positions.
Since these companies are not well-covered, the data on them is generally cleaner, with less noise overall. And they are usually easier to analyse, as they have simpler businesses and product lines compared to larger companies.