Looking back from a historical perspective, the systems and technologies labeled "digital" over these years all seem to be evolving structurally in a similar, perhaps even the same, direction. How were these technologies born, why did they appear, and what are today's "digitalization" propositions really talking about? Can we find a few clues in the vast technological universe to better understand the world?

 

"No one can tell you what the matrix is, you have to look at it yourself"

From an empiricist perspective, most people's concept of "digitalization" is understood by analogy with the "informatization" of an earlier era and the "apps" of the mobile Internet era, because the cognition we acquire in reality derives mainly from perceptual experience. If you don't believe me, look: grandmothers' social feeds are also full of chatter about which e-commerce reseller is cheaper. Their interactions with each other are, in fact, already a digital business scenario in the retail industry.

However, in cloud computing, a world most people do not understand intuitively, digitization may mean something else entirely. And that other meaning may be the key to crossing the knowledge gap and truly understanding the digital world.

The exponential growth of the online population is the fundamental driving force behind "digital transformation". Although the Internet is a high-entropy world, as this economic form takes shape it profoundly affects not only user experience, business models, and supply chains, but also the architecture of the information systems behind them.

 

"When the common basic theories, concepts, and practical methods change together, paradigm shifts follow one after another."

Simply put, for software or an app to run correctly, it needs a back-end service system to support it, much like the relationship between a restaurant's front of house and its kitchen. This "back kitchen" includes at least components such as application services and databases. When the app's users and features change within a small range, these components grow linearly. But when users or features change on a large scale (say, exponential growth), the structure of the "back kitchen" undergoes a fundamental change. By analogy, it is roughly the difference between McDonald's and Uber Eats.

This structural change occurs across businesses in every industry, making these different "back kitchen" systems ever more complex and enormous. For this reason, computer scientists designed centralized, massive computing facilities to house them, which people call "cloud computing". Once these systems move to the cloud, another interesting question surfaces. Think about it: what technical connection is there between a livestream reseller and a biologist running protein-interaction analysis? Why can both businesses run on the same cloud?

 

If technology, viewed at the macro level, is a goal-directed system, then the IT infrastructure and applications (apps) we build for particular purposes have tendencies of their own.

The evolution of scale and complexity has given IT systems distinct tendencies in their demands for resources. In my view, the two most significant dimensions are compute and throughput. So I tried to use these two dimensions to build a basic tool that links digital business to cloud-computing workloads.

The compute-throughput model shown in the figure above is used to match digital business scenarios. It links business-level scenario descriptions with system-level resource tendencies, helping us clarify the relationship between digital businesses and products.

When a business model leans toward the system's computing power, its structure becomes suited to processing computation-intensive tasks: data-intensive scientific computing, 3D or Metaverse rendering, large-scale data ETL/ELT and queries, machine learning, and so on. These businesses typically require high-performance computing, GPU cores, large-memory data warehouses, high-performance data-computing middleware (such as Photon/Spark), and similar resources.

When a business model tends toward system throughput, its structure becomes suited to handling large-scale throughput tasks: e-commerce applications during promotional events; social media apps that connect everyone; stock quotes and trading; live broadcasts and video. These services need resources such as global networking and content distribution, large-scale application containers and orchestration, high-performance message processing, and distributed database clusters.
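The two tendencies above can be sketched as a simple classifier. This is a minimal illustration of the compute-throughput model, not anything from the original text: the workload names, scores, and resource lists are all hypothetical assumptions, chosen only to show how scoring a business on the two dimensions points to a class of cloud resources.

```python
# Illustrative sketch of the compute-throughput model.
# Each workload gets a relative score on the two dimensions; the
# dominant dimension suggests which class of cloud resources it
# will demand. All names and scores here are assumptions.

from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    compute: int     # relative compute intensity, 0-10
    throughput: int  # relative throughput intensity, 0-10

    def tendency(self) -> str:
        # Classify by whichever dimension dominates.
        if self.compute > self.throughput:
            return "compute-leaning (HPC, GPU cores, data warehouse, Spark/Photon)"
        if self.throughput > self.compute:
            return "throughput-leaning (CDN, containers, messaging, distributed DB)"
        return "balanced"


workloads = [
    Workload("protein-interaction analysis", compute=9, throughput=2),
    Workload("e-commerce flash sale",        compute=3, throughput=9),
    Workload("live video streaming",         compute=4, throughput=8),
]

for w in workloads:
    print(f"{w.name}: {w.tendency()}")
```

The point of the sketch is only that very different businesses, a biologist's analysis and a flash sale, reduce to the same two-axis resource question, which is why they can share one cloud.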

Finally, the purpose of the compute-throughput framework is to simplify our understanding of digital-world businesses from the perspective of IT system structure, in the hope of drawing some general conclusions through it. My principle has always been to invent with purpose.