Excel is a remarkable piece of software, used by people from every walk of life: developers, data scientists, statisticians, analysts, you name it. You can have wonderful tools that spit out all kinds of data, or scripts that do complex calculations. But at the end of the day, you copy the output into an Excel workbook to send it off to someone or to do some simple tasks like extrapolation, sorting and so on.
They even had to change the human gene naming convention because Excel kept auto-converting gene symbols such as MARCH1 and SEPT1 into dates.
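The collision is easy to demonstrate. Below is a rough, hypothetical approximation of that date coercion; the regex and the month table are my own simplification, not Excel's actual parsing logic:

```python
import re

# Month spellings that collide with human gene symbols such as MARCH1 and SEPT2.
# (A small, illustrative subset; Excel recognises many more date forms.)
MONTHS = {"MARCH": "Mar", "SEPT": "Sep", "SEP": "Sep", "DEC": "Dec", "OCT": "Oct"}

def excel_like_coerce(cell: str) -> str:
    """Rough approximation of Excel's date auto-conversion.

    'MARCH1' -> '1-Mar'; anything unrecognised is returned unchanged.
    """
    m = re.fullmatch(r"([A-Za-z]+)([1-9]|[12]\d|3[01])", cell.strip())
    if m and m.group(1).upper() in MONTHS:
        return f"{m.group(2)}-{MONTHS[m.group(1).upper()]}"
    return cell

# Gene symbols silently mangled on paste:
print(excel_like_coerce("MARCH1"))   # 1-Mar
print(excel_like_coerce("SEPT2"))    # 2-Sep
print(excel_like_coerce("BRCA1"))    # BRCA1 (unaffected)
```

Because the coercion is silent and lossy, the cleanest fix really was to rename the genes themselves.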
Excel is being unbundled. SaaS products like Airtable and Google Sheets are the biggest examples. They connect the familiar column-row sheet to the internet, opening up many more possibilities. These services also plug into automation platforms like Zapier, making it possible to connect them to other applications that read, write or manipulate data.
Excel has a very peculiar architecture. Most software gets more bloated as features pile on, but the opposite is true for Excel: over the years, the features introduced have strengthened and expanded its use-cases even more. That is a very hard thing to achieve in the software world.
Software enables product variability at very low cost. Unlike hardware, different configurations of a software product don’t mean different production lines, bills of materials and so on.
From a customer point of view, this can be seen in two ways. Giving customers more options makes them feel that they are in control; they actively take part in the buying process instead of being skeptical of whatever a salesperson tells them. On the other hand, giving them fewer options reduces friction in the buying experience. One of the first things Steve Jobs did after returning to Apple was to clean out the product line. Even though Apple’s current line-up is scattered all over the place, they have stuck to a small number of product categories.
From a business point of view, this makes it easier to offer customers multiple products at low additional cost. These custom products can be sold in a modular way, and customers can pick and choose to create the final product. The App Store works in a similar way. No two iPhones end up the same: they may share the same hardware, but each person personalizes theirs by installing apps to their taste.
ARM is both a company and an instruction set used in CPUs. ARM and x86 (used by Intel) are sets of instructions that a CPU can understand and execute. The instruction set depends on the architecture the actual silicon is built on, which is why the two terms are used almost synonymously. While Intel sells finished CPUs directly to manufacturers, ARM, the company (which might be acquired by Nvidia), licenses the standard so that other companies can design chips for their own devices. This enables manufacturers to build custom hardware tailor-made for their application. ARM also opens the door to heterogeneous computing, which allows CPUs to have different cores built for different applications. A common example is a multi-core device with specific cores for running machine learning workloads.
Apple’s shift to ARM processors for the Mac line-up was in the making for quite some time, maybe not always in the form of chips for laptops. Apple has always been a proponent of building both the software and the hardware for its devices, and now they are going all the way. They have really understood the synergies this brings; it has had substantial cascading effects on the feature-set and user experience of their devices. But hardware is hard. Apple chose which components were strategic to its future products and invested in them. They were successful in doing that for the iPhone and iPad: these devices run on ARM processors designed by Apple but manufactured by other companies like Samsung.
A moat around Apple devices is the high switching cost. Once you are in the ecosystem, it becomes increasingly hard to switch to another platform. A main reason they were able to make this work is the continuity their devices offer. They gave “seamless” a new meaning with features like AirDrop. It just works. Moving all their devices to ARM processors could mean saving even more development cost, since they no longer have to develop for different platforms. They can migrate features and functionality from the OS and drivers they have perfected over the years on the iPhone to the Mac line-up.
From a business point of view, owning the chip designs means substantial savings. Intel integrated design and manufacturing and charged heavily for the combination. Now Apple can keep those margins for itself and outsource manufacturing to companies like TSMC.
UiPath is a Robotic Process Automation (RPA) company with a $35 billion valuation. RPA is the holy grail of automation for heavily repetitive, rule-based systems. It emulates human interactions to execute various tasks: software robots automate work that would otherwise have to be done manually by a person. RPA is quite similar to GUI testing tools, where you can watch the software mimic mouse and keyboard inputs and record responses from the system.
There are a few moats built into this. The first is scale: if it works on one system, it can be replicated across all systems in a business at low marginal cost. The second is learnability: these systems typically use computer vision algorithms to detect user interfaces and determine the next step to perform, and they get better with time. The third is that it helps businesses separate out tasks that can be handed over to bots, freeing up employees’ time for harder, more challenging work.
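The record-and-replay idea can be sketched in a few lines. Everything here, `FakeApp`, `Step`, the field names, is a made-up stand-in for a real GUI and has nothing to do with UiPath's actual API; it only illustrates the shape of a bot replaying recorded steps and logging the system's responses:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    """One recorded automation step: locate a UI element, then act on it."""
    target: str                               # e.g. an element found via computer vision
    action: Callable[["FakeApp", str], None]  # the input to replay against it

class FakeApp:
    """Stand-in for a GUI under automation (the real thing would be a live screen)."""
    def __init__(self):
        self.fields: dict[str, str] = {}

    def type_into(self, target: str, text: str):
        self.fields[target] = text

    def click(self, target: str):
        self.fields[target] = "clicked"

def run_bot(app: FakeApp, steps: list[Step]):
    """Replay the recorded steps and log the system's response after each one."""
    log = []
    for step in steps:
        step.action(app, step.target)
        log.append((step.target, app.fields.get(step.target)))
    return log

app = FakeApp()
steps = [
    Step("invoice_number", lambda a, t: a.type_into(t, "INV-042")),
    Step("submit", lambda a, t: a.click(t)),
]
print(run_bot(app, steps))  # [('invoice_number', 'INV-042'), ('submit', 'clicked')]
```

The scale moat falls out of the structure: once `steps` is recorded for one system, replaying it elsewhere is just another `run_bot` call.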
A key advantage of RPA is that it can be applied to almost every kind of business. Expansion is possible in two dimensions: horizontally, from one vertical to another, and vertically, exploring the depths of an industry. The best part is that what it learns in one industry can be reused in another context. Essentially, UiPath is creating a toolbox of automation tasks that it can teach its army of robots and deploy to almost any software/tech business in the world.
The best user interface is the one no one notices. It should just melt into its function, getting out of the way rather than posing as a hurdle to cross. Reducing the number of clicks for each operation is a good start. Making the response time really fast (sub-100ms) also helps. But these are just tricks. What really matters is the overall concept: what needs to be shown and what can be hidden. What is really necessary for the user? What can be defaulted or figured out in some other smart way? Visibility is key. Whatever is shown on the screen should be easily understood and discernible, unlike the new Google icons. Colours convey a mood; a combination of colours and movement can convey an emotion.
Companies have access to more and more data: data from suppliers, from intermediaries, from customers and prospective customers. Traditional companies are not built to make use of all of it. The data has to go through multiple steps and different handlers before business insights can be generated. Typically, the team that handles the data pipeline is buried somewhere under “Engineering” or “Research and Development”, while the users of the data are spread across every vertical, from sales, marketing and logistics to human resources. And the company itself generates a lot of data every day from its operations and employees. The crux of the problem with traditional organisations is that data is harvested and could probably be used in most parts of the organisation, yet not everyone is well-versed in how to play around with data and make sense of it. Using data to make better decisions isn’t a one-off ceremony; it is a feedback loop of varying cadence that needs close attention.
API-based businesses are a game-changer. They let other businesses focus on the things that differentiate them and use “off-the-shelf” solutions for the rest. APIs have permeated many verticals: payments (Stripe), e-commerce (Shopify), messaging (Twilio), search (Algolia) and automation (Zapier), just to name a few. APIs form an integral part of the SaaS model; they are the glue that holds the pieces of a SaaS product together, and yet they scale remarkably well. One reason is that these API businesses are focused on doing one thing really well. So well that their solution has all the bells and whistles to support all kinds of businesses across verticals and geographies. The question, then, is whether companies like this can have a moat. What gives them an advantage? Since these APIs live at the software level, in theory it would be quite easy for a customer to switch to another provider down the line; the friction would be much lower than changing a hardware supplier, and so would the switching costs. One of the most important factors keeping customers with an API provider is therefore integrations and network effects. The more a customer uses the product, the better it gets: API companies constantly improve their services by leveraging usage data and statistics. In effect, the product only gets better, and that is the advantage API companies can build and use as their moat.
Wrapped is Spotify’s year-end marketing campaign, and this year’s edition was unveiled yesterday. Spotify collects and distills a year of your listening history to present your top artists, genres and so on. As per Spotify’s Q1 report, they have about 286 million monthly active users (MAU). That is a lot of data to process behind the scenes. This year’s Wrapped has some new features:
- In-app quizzes where you can predict your top artist and podcast.
- Story of Your 2020 with your Top Song. A recollection of your most listened song of the year.
- Deep dive into podcast listening
- New badges. Popular playlists can earn you a “Tastemaker”. Identifying a song before it becomes a hit makes you a “Pioneer”. Collecting and curating songs into playlists makes you a “Collector”.
- Personalized playlist with your most loved songs and the ones you missed.
- Global listening trends, this time open to non-users as well.
Wrapped involves both distilling data from each user and presenting it in a meaningful way. An article from last year gives a bit of insight into what happens in the background. Last year’s Wrapped had an even larger scope, covering the whole decade instead of just one year. Spotify uses a data lake backed by Google Bigtable, which is optimized to aggregate data over an arbitrary period of time. Even though the amount of data was much larger, Spotify was able to reuse previously executed jobs: each user had a row in Bigtable, with each column holding the result for one year. Decoupling the data processes improved overall efficiency, and breaking user summaries into smaller data stories and workflows allowed for a more flexible system. Insights like songs you might have missed need a recommendation system that uses these data stories as inputs.
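The row-per-user, column-per-year layout and the job reuse it enables can be sketched in miniature. This is only an illustration of the idea: the `bigtable` dict stands in for the real Bigtable store, and all the function and field names are mine, not Spotify's:

```python
# Row per user, column per year; a plain dict stands in for Bigtable.
bigtable: dict[str, dict[int, dict]] = {}

def wrap_year(user: str, year: int, plays: list[str]) -> dict:
    """Compute one user's yearly summary, caching it in their row."""
    row = bigtable.setdefault(user, {})
    if year in row:                      # reuse a previously executed job
        return row[year]
    counts: dict[str, int] = {}
    for song in plays:
        counts[song] = counts.get(song, 0) + 1
    summary = {"top_song": max(counts, key=counts.get), "total_plays": len(plays)}
    row[year] = summary
    return summary

def wrap_decade(user: str, plays_by_year: dict[int, list[str]]) -> dict:
    """A decade wrap only triggers computation for years not already summarised."""
    return {year: wrap_year(user, year, plays) for year, plays in plays_by_year.items()}

wrap_year("alice", 2019, ["a", "b", "a"])                 # computed once in 2019...
decade = wrap_decade("alice", {2019: [], 2020: ["c", "c", "d"]})
print(decade[2019])   # {'top_song': 'a', 'total_plays': 3}  (reused, not recomputed)
print(decade[2020])   # {'top_song': 'c', 'total_plays': 3}
```

Note that the 2019 column is served from the row even though no play data was passed in, which is exactly the reuse that made the decade-scale wrap tractable.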
Spatial computing can be considered an extension of IoT for objects in space. It is a concept that spans technologies like virtual reality, augmented reality and mixed reality. AR headsets like the HoloLens use spatial computing to interact with real objects around them. A futuristic scenario is one where the objects around you are not only connected to a common medium like the internet but also orchestrate themselves to achieve a higher-level goal; or where you could interact with objects thousands of miles away and fix machines on the other side of the planet.
For a technology like this to manifest, there needs to be a high-speed, low-latency network that allows this kind of high-bandwidth communication across multiple objects, along with the subsequent processing of that data. Another prerequisite could be a common language that these disparate objects and underlying technologies can use to talk to each other. Lastly, the use-cases need to be simplified and scoped down to make development and innovation profitable and viable. A use-case that is quite common now is an array of sensors spread across the aisles of a supermarket to better understand the behaviour of customers.
Simple intelligence is cheap. It can be replicated with minimal hardware and a few lines of code; it can be as simple as a self-adjusting thermostat. Embedding intelligence into everyday objects is not a new concept. The “internet of things” connects these local devices to each other, introducing a medium to use and a language to communicate with. Taking this one step further, the devices not only talk to each other but also to a central server that monitors them and tries to identify usage patterns and behavioural quirks of the user, making it more of a collective intelligence. Thanks to “recent” developments in web and networking technologies, and to the excessive number of abstraction layers, it all seems seamless on the surface. Under the hood, however, there are still many interfaces, handovers and conversions taking place.
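The claim that this kind of intelligence fits in a few lines of code is easy to back up. Here is a minimal self-adjusting thermostat sketch, bang-bang control with a hysteresis band; the setpoint and band width are made-up values:

```python
class Thermostat:
    """Minimal self-adjusting thermostat: bang-bang control with hysteresis."""

    def __init__(self, setpoint: float, hysteresis: float = 0.5):
        self.setpoint = setpoint      # target temperature, e.g. degrees C
        self.hysteresis = hysteresis  # half-width of the dead band
        self.heating = False

    def update(self, measured_temp: float) -> bool:
        # Heat on below (setpoint - h), off above (setpoint + h);
        # inside the dead band the previous state is held, which
        # prevents the heater from rapidly cycling on and off.
        if measured_temp < self.setpoint - self.hysteresis:
            self.heating = True
        elif measured_temp > self.setpoint + self.hysteresis:
            self.heating = False
        return self.heating

t = Thermostat(setpoint=21.0)
print(t.update(19.0))  # True  (too cold, heat on)
print(t.update(21.2))  # True  (inside the dead band, state held)
print(t.update(22.0))  # False (warm enough, heat off)
```

Connecting many such loops to a central server, the collective-intelligence step described above, adds the reporting and pattern-mining on top; the local control itself stays this small.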
Technological development of this scale spans many years and is spread across the world, split between large companies, startups, government regulations and various standard-setting organizations. But if we were to start from scratch and build it all again, would we choose the same interfaces and handover mechanisms? Or would we go for something much simpler? Not just the top-level abstractions like REST and other APIs of that sort, but the complete software stack right down to the metal.