Three Opportunities of Digital Transformation: AI, IoT, and Blockchain
The emerging technologies of artificial intelligence (AI), the Internet of Things (IoT), and blockchain represent a “power of three” opportunity for private enterprise and the public sector alike. Enterprises capable of exploiting these technologies will use them to optimize and enhance existing processes, create new business process models, and develop innovative products and services for a new generation of consumers/users. They do not represent a technology-enabled future that is decades away; these technologies are available today to build the businesses of tomorrow, and the pace of that change depends on how fast the technologies and their development environments mature and become interoperable.
The personal computing revolution of the 1980s, the Internet revolution of the 1990s, and the mobile devices revolution of the 2000s put virtual supercomputers in the hands of average citizens, and these transformational technologies have changed the world. That said, each of these technologies emerged gradually and in isolation. We had time to develop the use of personal computing before the Internet arrived and changed the game once more. We were Internet-smart long before smartphones put the Web in our pockets. Now, multiple new technologies are emerging at once: AI chatbots such as Alexa and Siri, IoT-driven supply chains, token-based blockchain ecosystems, 5G wireless, autonomous vehicles, and more, are all available today. Although each of these technologies presents exciting opportunities and significant challenges for the enterprise, we consider AI, IoT, and blockchain to be truly transformational. Alone, any one of these three would have the power to alter business, leisure, and society. But together (Figure 1–1: AI, IoT, and blockchain: connected trustful insights), their transformative impact will be unprecedented.
Briefly, IoT is a system of connected things, which can include anything with an associated Internet address that makes it part of a network. Things are interrelated computing devices, mechanical and digital machines, objects, or even people that are provided with unique identifiers, such as IP addresses. Using these identifiers and IoT, a device has the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. The left side of Figure 1–2 shows the components and processes included in an IoT server system. Today, manufacturing, transportation, and healthcare devices that execute these processes can be monitored and controlled using cloud-based IoT network software. Consumer application of IoT abounds as well. We use smartphone-controlled devices to monitor our health and control our home environment. We can communicate remotely with household devices and appliances via our mobile devices.
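The data-transfer step just described can be sketched in a few lines of Python: a device with a unique identifier packages a sensor reading as a JSON message for a cloud IoT platform to ingest. This is a minimal sketch; the message fields and the thermostat scenario are illustrative assumptions, not any specific vendor’s protocol.

```python
import json
import time
import uuid

def build_reading(device_id: str, metric: str, value: float) -> str:
    """Package one sensor reading as a JSON message, keyed by the
    device's unique identifier (here a UUID standing in for an IP
    address or registered device ID)."""
    return json.dumps({
        "device_id": device_id,
        "metric": metric,
        "value": value,
        "timestamp": time.time(),
    })

# A thermostat reporting home temperature, no human interaction required.
thermostat_id = str(uuid.uuid4())
message = build_reading(thermostat_id, "temperature_c", 21.5)
print(message)
# In a real deployment this JSON would be published to the IoT
# platform's ingest endpoint (for example, over HTTPS or MQTT).
```

In production the message would also carry authentication credentials, but the shape of the exchange is the same: identifier, measurement, timestamp.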
AI, along with its subset machine learning (ML), is a system’s ability to interpret external data and learn from it correctly. System capabilities generally classified as AI include, but are not limited to, interpreting human speech, driving autonomous vehicles, and intelligently routing content in content delivery networks. For example, personal assistants such as Cortana, Siri, and Google Assistant are now a major feature of smartphones and tablets. Amazon uses AI to forecast demand for everything the company sells worldwide, thereby optimizing its fulfilment and delivery processes. IBM’s Watson Predictive Healthcare uses AI to integrate and analyse clinical, administrative, and claims data from multiple sources in near real time and can insert the relevant integrated data and analytic insights into the workflows of care team members so they can use the information in patient care. The IoT network connects to the cloud, which has protocols that collect data from things located all over the globe. IoT rules engines route data to the appropriate application, which uses the blockchain (and other scalable data stores for less sensitive data) to persist the data (Figure 1–2). AI uses the data to feed ML algorithms, which analyse the data and perform some action or produce a result. AI’s role and process flow are described in more detail in Chapter 3. Blockchain is the secure, encrypted, trustworthy, and distributed peer-to-peer data store in this ecosystem. In some ways, the ultimate success of the new tech ecosystems begins and ends with the blockchain. Blockchain technology rebuilds the foundation of the Internet to restore trust and reliability.
One fundamental problem preventing the further connection of the IoT and AI is the security vulnerability inherent in the Internet in its current form. Blockchain and distributed ledger technologies, the new databases, are decentralized, with built-in security. Presently, traditional SQL databases (SQL, or Structured Query Language, is a domain-specific language used with relational database management systems) store most of the data in the world, and a few key vendors store most of it. IBM DB2, Oracle, and Microsoft SQL Server account for almost 90 percent of the commercial database management system market (see https://www.softwaretestinghelp.com/database-management-software/). It is no secret that SQL databases are a key target for cybercriminals because of the relative ease with which they can be breached, coupled with the valuable nature of the sensitive information locked away inside. Whether the data is financial or intellectual property and corporate secrets, hackers worldwide profit by selling data obtained from breaching organization servers and plundering SQL databases. How does the blockchain solve this problem? Simply put, a blockchain is a database comprising a digital chain of encrypted, immutable, fixed-length blocks, each containing 1 to N transactions. Blockchain processing, as depicted in Figure 1–3, starts when a transaction (such as a request from AI-enabled software to a blockchain-based “smart contract,” or an IoT network request collecting data about the status of a goods shipment) is validated and then inserted into a block. Using a consensus algorithm, which ensures that the next block in a blockchain is the one and only version of the truth, the block is then validated and added to the end of the existing chain of blocks. Once recorded, blocks are designed to be resistant to modification; the data in a block cannot be altered retroactively.
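The chain-of-blocks idea can be sketched in a few lines of Python. This is a minimal illustration assuming SHA-256 hashing; it deliberately omits consensus, encryption, and networking. Each block’s hash covers its own transactions plus the previous block’s hash, so a retroactive edit anywhere in the chain breaks validation.

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """Create a block whose identity is the SHA-256 hash of its own
    contents plus the previous block's hash, forming the chain."""
    body = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def valid_chain(chain):
    """A chain is valid only if every block's stored hash matches its
    recomputed hash and links to its predecessor's hash."""
    for i, block in enumerate(chain):
        body = json.dumps({"tx": block["tx"], "prev": block["prev"]},
                          sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["genesis"], "0" * 64)
chain = [genesis, make_block(["shipment #42 arrived"], genesis["hash"])]
print(valid_chain(chain))          # True: the chain is intact
chain[0]["tx"] = ["tampered"]      # a retroactive edit breaks the hashes
print(valid_chain(chain))          # False: tampering is detected
```

A real blockchain adds a consensus algorithm so that many independent nodes agree on which block is appended next, but tamper-evidence comes from exactly this hash-linking structure.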
Moreover, decentralized control eliminates the risks of traditional centralized control. Anybody with sufficient access to a traditional centralized database can destroy or corrupt the data within it. Users are therefore reliant on the security infrastructure of the database administrator. Blockchain technology uses certificate authority and decentralized data storage to sidestep this issue, thereby building security into its very structure.
There are different types of blockchains, which we will explore in detail. A public blockchain, such as bitcoin, is an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way. Using a peer-to-peer network and a distributed timestamping server, a public blockchain database is managed autonomously. In contrast, in some private blockchains, the blockchain is shared among an ecosystem of permissioned users using an access control layer built into the protocol. This means the private blockchain network participants have control over who can join the network and who can participate in the consensus process of the blockchain. Private blockchain applications include supply-chain ecosystem applications, for example, where producers, suppliers, distributors, and consumers share a private blockchain and perhaps a common utility token to transact. The token helps reduce the friction, such as fees and delays, created by banks and government agencies that are in the middle. Private blockchains provide solutions to financial enterprise problems, including asset data stores and land deed registrations, and they facilitate compliance agents for regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and anti-money laundering (AML) and know your customer (KYC) laws. Gartner, in mid-2018, placed these technologies in the so-called “Trough of Disillusionment” (https://www.gartner.com/smarterwithgartner/5-trends-emerge-in-gartner-hype-cycle-for-emerging-technologies-2018/). As you read this book in 2019 and 2020, AI, blockchain, and the IoT will be on the “Slope of Enlightenment” and moving toward the “Plateau of Productivity” (see Figure 1–4). As with all technology, the architecture and development environments need to mature before the technology is used on a grand scale. So, as these three technologies mature, they will fulfil the promise of the trinity.
They will provide all of the components required for end-to-end connectivity, accountability, decision-making, as well as convenience for a host of business, governmental, and life experience applications.
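The permissioned-network idea behind private blockchains can be sketched as follows. This is a hedged illustration, not any real platform’s API: the member set stands in for the access control layer, and the class and method names are invented for the example.

```python
class PermissionedLedger:
    """Sketch of a private blockchain's access-control layer: only
    admitted members may submit transactions to the network."""

    def __init__(self, operator: str):
        self.members = {operator}     # the founding participant
        self.transactions = []

    def admit(self, requester: str, new_member: str) -> bool:
        # Existing participants control who can join the network.
        if requester not in self.members:
            return False
        self.members.add(new_member)
        return True

    def submit(self, sender: str, tx: str) -> bool:
        # Non-members are rejected before a transaction ever reaches
        # the consensus process.
        if sender not in self.members:
            return False
        self.transactions.append((sender, tx))
        return True

ledger = PermissionedLedger("producer")
ledger.admit("producer", "distributor")
print(ledger.submit("distributor", "shipment accepted"))  # True: member
print(ledger.submit("outsider", "forged entry"))          # False: not admitted
```

Real permissioned platforms enforce this with cryptographic certificates rather than a plain membership set, but the control point is the same: admission and consensus are restricted to known parties.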
In addition, these technologies will face legal and regulatory challenges. How will governments regulate systems combining AI, the IoT, and blockchain? When it comes to data, the U.S. government has been, comparatively speaking, reluctant to regulate. There exists no U.S. legislation as far-reaching as the European Union’s General Data Protection Regulation (GDPR), which took effect in May 2018. This “hands-off” approach is perhaps a proper action (or inaction), as early regulatory intervention can forestall or even foreclose certain paths to innovation. The hope is that the new innovators will work to develop a code of conduct and a culture of self-enforcement to avoid hindering the widespread adoption of these technologies with restrictive and stringent government regulation. Key among these considerations is how to regulate and protect the collection of the massive amounts of data that AI, the IoT, and blockchain will foster.
The core issues to consider concern the misuse of data, which can result in the following:
- The risk of bias and discrimination
- The potential for intrusions of privacy
- Mass surveillance that may encroach on democratic freedom
- The need to secure key infrastructure and governmental operations from all adversaries

Data is, of course, the lifeblood of AI, the IoT, and blockchain.
AI’s ML algorithms need data to learn and become asymptotically accurate. Big business knows “big data” is its lifeblood, but only recently have average consumers become aware of just how critical and valuable their data is to big business and its profit margin. For years, “free” services such as search engines, e-mail, and social media have proliferated. Consumers use such services for convenience, and they require no monetary payment. In turn, consumers are exposed to ads and agree to allow collection of their private data. The collected data is of immense value to service providers, because big business can sell it to affiliate marketers that use it for targeted advertising. The GDPR, adopted in 2016 and in force since May 2018, has important implications for businesses operating internationally. California passed a similar measure, the California Consumer Privacy Act of 2018 (CCPA). These laws regulate how companies can collect and sell personal data and give individuals substantive rights regarding their data. Here again, blockchain provides a solution: it gives individuals power over their data. Companies such as doc.ai, Datum, Wibson, and Ocean Protocol enable users to sell their personal data for cryptocurrency. Doc.ai compensates users supplying personal medical information for use in neural networks. Internet visionary Sir Tim Berners-Lee is developing a decentralized Internet ecosystem called Solid. According to its web site (https://solid.mit.edu/), Solid will let users control what happens with their data, and developers can build apps that leverage the data while preserving individual rights. Consider the use of AI in policing and surveillance. Law enforcement officials and civil rights advocates each make valid points on this topic. On one hand, AI can help solve or prevent crime in ways not previously possible. On the other hand, AI should not be used for improper discrimination or unwarranted privacy intrusions.
AI has impacted and facilitated policing and surveillance. Automated license plate readers take images of vehicle license plates; capture the dates, times, and GPS coordinates of the vehicle; and upload them to a database that law enforcement can access. Technology provided by Vigilant Solutions (https://www.vigilantsolutions.com/), for example, implements analytics that make sense of the data. “Predictive policing” uses data such as crime databases and social media to predict where crime is likely to occur, and which persons will likely commit violent crimes. Amazon also licenses face-recognition software to law enforcement. Amazon and the American Civil Liberties Union (ACLU) have engaged in a debate about whether the technology is flawed and biased (https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-disturbing-plan-add-face-surveillance-yo-0). Google used AI to help the Pentagon analyse drone footage (see “Project Maven”
https://dod.defense.gov/News/Article/Article/1254719/project-maven-to-deploy-computer-algorithms-to-war-zone-by-yearsend/). Google employees pushed back, concerned that the military would weaponize its AI in connection with drone strikes. In response, Google declined to renew its contract with the Pentagon and published ethics guidelines for using AI. Imagine the application of AI and the IoT to warfare (see Figure 1–5 and https://www.japcc.org/electronic-warfare-the-forgotten-discipline/). The truth is, it is happening now. Understanding the implications of the IoT for warfare reveals its significant impact on future conflicts. For example, having an adversary monitor your communications or eliminate your ability to communicate or navigate would be catastrophic. Likewise, having an adversary know the location of your deployed forces based on their IoT transmissions would put your forces at a substantial disadvantage. The military’s ability to quickly correlate, evaluate, and create value from data will be key.
The resulting capabilities include the following:
- Planning capability to locate sensors and weapons systems optimally to counter identified threats
- Situational awareness of the evolving battle and status of defensive assets at all leadership levels
- Battle management to pair sensors and shooters optimally for effective defense against multiple threats and efficient asset utilization and engagement
- Sensor netting to detect, identify, track, and discriminate threats
- Global engagement management to enable war fighters to adjust defences to the emerging battle
- Global communications networks to manage and distribute essential data efficiently
In matters like these, balancing the relevant interests is the objective. This book explores how you can not only prepare your business for the IoT, AI, and blockchain, but also empower your organization to exploit the “power of three” now and in the future. You’ll learn how to prepare for the legal and regulatory issues that must be addressed for successful implementation.
The Confluence of the Three Technologies
Why is this new “power of three” technology confluence possible? It is because of major upgrades in computational power, aligned with the petabyte amounts of data being created. (A petabyte is 2^50 bytes: 1,024 terabytes, or about 1 million gigabytes. A gigabyte is about the size of a two-hour streaming digital movie.) Consider the enterprise Trimble as an example. The transportation software giant is combining big data, the IoT, AI, and blockchain technologies to reduce costs and increase efficiencies. Large amounts of data are collected and imported from internal systems and various transportation devices; AI/ML models are built from the data, and insights are recorded. The data is stored on a blockchain platform (see https://hortonworks.com/blog/big-data-powering-blockchain-machine-learning-revolutionize-transportation-logistics-industry/). Let’s look at three of the technology laws that predicted this confluence: Moore’s law, Koomey’s law, and Metcalfe’s law.
Moore’s law This law should be familiar to anybody following the technology sector. Described by Gordon Moore in 1965, it essentially posits that the number of components on an integrated circuit — a chip — doubles every year (later revised to roughly every two years; see Figure 1–6). This law has proven remarkably stable since its inception. Prices per unit of computation have come down remarkably, as ever more computation can be put into the same circuit package and, more importantly, for the same price. This means we have very inexpensive integrated circuits. (See https://ieeexplore.ieee.org/abstract/document/347359)
Koomey’s law This law posits that the energy efficiency of computation doubles roughly every one-and-a-half years (see Figure 1–7). In other words, the energy necessary for the same amount of computation halves in that time span. To visualize the exponential impact this has, consider the fact that a fully charged MacBook Air, if it ran at the computational energy efficiency of 1992, would completely drain its battery in a mere 1.5 seconds. According to Koomey’s law, the energy requirements for computation in embedded devices are shrinking to the point that harvesting the required energy from ambient sources like solar power and thermal energy should suffice to power the computation necessary in many applications.
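We can sanity-check the MacBook Air claim by running Koomey’s doubling rate backward. The 10-hour battery life and the 1992-to-2015 comparison window are illustrative assumptions for this sketch; only the roughly 1.57-year doubling period comes from Koomey’s observation.

```python
DOUBLING_PERIOD_YEARS = 1.57            # Koomey's observed doubling period
years = 2015 - 1992                     # assumed span between the two machines
doublings = years / DOUBLING_PERIOD_YEARS
efficiency_gain = 2 ** doublings        # roughly 26,000x over the span

battery_seconds_today = 10 * 3600       # assume a 10-hour battery life
battery_seconds_1992 = battery_seconds_today / efficiency_gain

print(f"{efficiency_gain:,.0f}x more computation per joule")
print(f"{battery_seconds_1992:.1f} seconds of battery at 1992 efficiency")
```

Under these assumptions the battery drains in about a second and a half, which is exactly the figure quoted above.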
Metcalfe’s law This law has nothing to do with chips, but everything to do with connectivity. Formulated by Robert Metcalfe as he invented Ethernet, the law essentially states that the value of a network increases with the square of the number of its nodes (see Figure 1–8).
This is the foundational law upon which many of the social networking platforms are based: an additional user does not increase the network’s value on a linear scale, but rather multiplies the number of connections, and thus the value for all users. So, a network with a hundredfold more users has 10,000 times more value; a thousandfold more users yields 1 million times more value, and so on.
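The quadratic scaling in those examples can be checked directly. The value-per-connection constant k is an arbitrary placeholder, since Metcalfe’s law fixes only the shape of the curve, not its units:

```python
def metcalfe_value(n: int, k: float = 1.0) -> float:
    """Metcalfe's law: a network's value grows with the square of the
    number of its nodes (k is an arbitrary value-per-link constant)."""
    return k * n * n

base = metcalfe_value(1_000)
print(metcalfe_value(100_000) / base)     # 100x the users -> 10,000x the value
print(metcalfe_value(1_000_000) / base)   # 1,000x the users -> 1,000,000x the value
```

Note that the constant k cancels out of the ratios, which is why the law is useful for comparing networks even though nobody agrees on the absolute value of a single connection.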
But what do these measurements mean with regard to blockchain, the IoT, and AI? Moore’s and Koomey’s laws make embedding chips into almost everything both economically viable and technically feasible. As the chips get smaller and cheaper, their energy footprint decreases dramatically. But Metcalfe’s law implies a strong incentive to implement these measures, because the more nodes we connect to the network, the more valuable the network becomes and the more value we can derive from the network. Networks are the essential components defining our age and the next. In communications theory and economic practice, it is the number of connections on a network that is the core determinant of impact. This is the key. As networks expand, their impact expands even faster. Metcalfe originally intended the concept to capture a qualitative rather than a precise quantitative effect. Nonetheless, there has been some academic dispute (see https://spectrum.ieee.org/computing/networks/metcalfes-law-is-wrong) over the accuracy of Metcalfe’s law as an economic predictor. That said, consider the impact of the telephone versus its predecessor the telegraph, and the mobile Internet versus the desktop Internet. In 2013, Metcalfe published an article (see https://www.computer.org/csdl/mags/co/2013/12/mco2013120026-abs.html) showing that Facebook’s market value has in fact tracked closely to the square of the growth in its users. If we look at the scale of connections of the four communications revolutions, we get the telegraph in the thousands, the telephone in the millions, smartphones in the billions, and, with the emergence of the IoT and all the ultimately connected things on the planet, a scale now in the trillions. A critical threshold was crossed in 2008, the year that more things than people were connected to the Internet. Early forecasters expected a trillion connected things by 2018, although that didn’t happen.
Best estimates put the IoT world today at 15 billion to 25 billion connections (https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide). Networks start small in part because they begin when the associated technologies are just good enough to provide a proof of concept (PoC), and early production models are not yet mature and performant enough to scale fully and displace current production functionality.
How We Will Interconnect the Three Technologies
As the required technology and network connectivity are fast becoming available to support this trinity, how do we engineer these components to work together? To understand how IoT, AI, and blockchain work together, you can think of them as being like interconnected organic processes, analogous to human body processes (see Figure 1–9).
We can equate AI with the brain that controls all the functions of the body. When these technologies mature, AI will control all the functionality in the IoT network using the knowledge stored in the blockchain memory. We equate the IoT network with the nervous system that sends messages back and forth from the brain to different parts of the body. The IoT, when mature, will be able to direct devices and sensors to message and alert AI components via a network when events happen. The IoT device data will be collected and analysed by AI components. The blockchain, also known as distributed ledger technology (DLT), is the data store. It is like the body’s memory, which holds all that we experience. This fast-maturing technology will provide the trust, speed, interoperability, security, and reliability needed to store and access all the things required to support the ecosystem to which it belongs.
In the human brain, a neuron collects signals and then sends out spikes of electrical activity through a long, thin fibre called the axon, which splits into thousands of branches. When a neuron receives input, it sends electrical activity down its axon. Learning occurs by changing the synapses, which creates memory. Analogously, AI components and blockchain smart contracts collect IoT data transfers. Like neurons, they then persist the data in memory and perhaps send instructions; if the goods have arrived, for example, payment is sent. Like a brain, AI performs the logic, or reasoning. It analyses data and makes decisions. The IoT network senses actions and events in its surroundings and interacts, via data messaging protocols and IoT servers, with an AI component to provide the basis for decisions. The blockchain is the distributed memory that creates a secure, immutable record of transactions and associated data (see Figure 1–3). Because storing every piece of data on a blockchain is inefficient, typically only small amounts of data and hash pointers are stored on the blockchain. The hash pointers are used to locate associated data off chain. Multiple IoT networks can exchange data, while the power of AI will be exponentially enhanced with more data. The trinity of these technologies will not only help increase efficiency, but also help businesses deliver better customer service. For example, suppose an autonomous vehicle crosses a toll bridge. The bridge has IoT sensors that pick up the vehicle’s license plate and request the bridge toll. The vehicle’s AI functionality registers this and uses the blockchain to send the toll to the toll collection application. The trinity will require development teams with diverse and comprehensive skill sets in business, law, and technology.
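The hash-pointer pattern just described can be sketched as follows. This is a hedged illustration: the two dictionaries stand in for the blockchain and a conventional off-chain store, and the toll-payment record is an invented example.

```python
import hashlib
import json

off_chain_store = {}   # stands in for a conventional database or object store
on_chain_ledger = []   # stands in for the blockchain: small records only

def record_event(event: dict) -> str:
    """Keep the bulky event data off chain; put only its hash pointer
    (plus a minimal summary) on the chain."""
    blob = json.dumps(event, sort_keys=True).encode()
    pointer = hashlib.sha256(blob).hexdigest()
    off_chain_store[pointer] = event
    on_chain_ledger.append({"type": event["type"], "pointer": pointer})
    return pointer

def verify_event(pointer: str) -> bool:
    """Recompute the hash of the off-chain data and compare it with the
    on-chain pointer to prove the record was not altered."""
    blob = json.dumps(off_chain_store[pointer], sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest() == pointer

# The toll-bridge example: the IoT sensor reads the plate, the vehicle
# pays, and the full event lives off chain behind a hash pointer.
p = record_event({"type": "toll_payment", "plate": "ABC-123",
                  "bridge": "example bridge", "amount": 7.00})
print(verify_event(p))   # True while the off-chain record is unmodified
```

Anyone holding the on-chain pointer can detect tampering with the off-chain record, which is how the chain lends its immutability to data it never stores in full.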
There will be challenges around all the tech giants collaborating to build AI, the IoT, and blockchain software and hardware that are interoperable, compatible, scalable, and reliable. Interoperability is the ability of different information systems, devices, and applications to connect within and across organizational boundaries to access, exchange, and cooperatively use data. Compatibility is the capacity for two or more systems to work together without having to be altered to do so. Scalability is an attribute that describes the ability of a process, network, software, or organization to grow and manage increased demand. Reliability, the capacity to perform consistently according to specification, is a must-have attribute of any ecosystem component, be it software, hardware, or a network. That said, there will be great opportunities for businesses to create protocols and standards to facilitate the maturation and ultimate production of this new technology confluence.
AI: The Brain
As mentioned in our analogy, AI is like a brain that provides logic and communication. AI is an area of computer science that emphasizes the creation of intelligent software and hardware that works and reacts like the human brain. An AI neural network is designed to simulate the network of neurons that make up a brain (see Figure 1–9), so that the computer will be able to learn things and make decisions in a humanlike manner. Some of the activities AI is currently designed for include speech recognition and some degree of learning, planning, and problem-solving. AI is a big idea with an equally big opportunity for businesses and developers. AI is slated to add $15.7 trillion to global gross domestic product (GDP) by 2030, according to research by PwC (see https://press.pwc.com/News-releases/ai-to-drive-gdp-gains-of--15.7-trillion-withproductivity--personalisation-improvements/s/3cc702e4-9cac-4a17-85b9-71769fba82a6). AI is already transforming how companies process vast amounts of information. As you might expect, a good deal of this processing is being done in the cloud. The cloud is a network of remote servers hosted on the Internet to store, manage, and process data. Microsoft offers Cognitive Services to developers and companies using its Azure cloud computing platform. These services include data analysis, image recognition, and language processing, which require various forms of AI to complete tasks. Microsoft’s Azure Machine Learning Studio (see https://studio.azureml.net) is an AI development tool that analyses big data quickly and efficiently. If you use Google’s Photos app, Microsoft Cortana, or Skype’s translation function, you’re using a form of AI on a daily basis. Autonomous driving is perhaps the most daunting example: it requires giving cars the ability to see, analyse, learn, and navigate a nearly infinite range of driving scenarios.
AI enables cars to learn how to drive on their own, which could create a $7 trillion autonomous driving economy over the next three decades (see https://www.wired.com/story/guide-self-driving-cars/). In Phoenix, Arizona, Alphabet has launched Waymo One, the first commercial autonomous vehicle ride-hailing service. Waymo will license its AI autonomous driving technology to automakers and use it for package delivery services and semi-truck transportation as well. Sceptics may assume that AI is just a marketing buzzword for tech companies. But the big technology companies have been heavily investing in AI and ML for years, and the products and services listed here are tangible evidence that these tech giants are already making money from AI. When it comes to advancing AI, hardware may provide answers. Specialized GPU chips enable companies to process complex data and visual information quickly, which has made them ideal for AI cloud computing. GPU-accelerated computing is the employment of a graphics processing unit (GPU) along with a central processing unit (CPU) to facilitate processing-intensive operations such as ML, analytics, and engineering applications (see https://www.nvidia.com/en-us/about-nvidia/ai-computing/). That said, as AI matures, we should always choose artificial intelligence over natural stupidity.
Advancements in AI
The point at which AI-assisted machines surpass human intelligence is predicted by visionaries such as Ray Kurzweil to arrive by 2045. MIT’s Patrick Winston puts the date at 2040. In a recent survey of scientists and technology professionals, respondents were quite positive that this will be achieved even sooner, with 73 percent of tech execs saying this moment will arrive within a decade and nearly half believing it will occur within five years. Perhaps the earlier prediction is a result of the impressive pace of technology development, which can cause people to overestimate technological capabilities or achievements (see https://www.edelman.com/sites/g/files/aatuss191/files/2019-03/2019_Edelman_AI_Survey_Whitepaper.pdf). Famous computer scientist Alan Turing, who is regarded by some as the originator of AI, devised the “Turing test.” To pass this test, a computer or robot is required to interact with a human in such a way that the human cannot tell it apart from another human. Chatbots, which mainly use text communication, and voice-only AI systems such as Siri, Cortana, and so on have come a long way toward being more humanlike in their conversation. But by most accounts, they have not yet passed the Turing test. That said, another concept that is probably more appropriate here is the uncanny valley, which refers to the point at which a computer or robot displays some humanlike features but is, ultimately, recognizable as a machine. Although Siri and other chatbots may be impressive residents of the uncanny valley, some of the most telling examples of AI are the robots made by Boston Dynamics (see https://www.bostondynamics.com/). We can see them walking, running, opening doors, and performing other tasks in uncannily humanlike ways, while Siri is just a disembodied voice.
This area of technology is progressing so quickly that many are revising their forecasts regarding when AI systems will not only leave their uncanny valley abodes but also pass the Turing test. Consider a relatively new area of research, artificial consciousness, also known as machine consciousness or synthetic consciousness. These terms tend to refer both to AI and robotics, or cognitive robotics, which simply means a robot that learns. As mentioned, this area of research is a hotbed of activity, owing to the disruptive forces driving advances in computing in terms of both storage capacity and processing capability. Advancements in cloud computing also offer viable and efficient tools for the development work. Both computing hardware and software are available to facilitate the development of AI solutions. AI methods such as machine learning and deep learning can give software, and the specially made hardware it runs on, the ability to learn from the vast amounts of data it collects. It can then use what it has learned to behave and make decisions in humanlike ways, without getting too concerned with the definition of consciousness, which remains many years away.
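The kind of learning from data just described can be illustrated with the simplest possible neural unit: a single perceptron with a step activation, whose weights play the role of the artificial “synapses.” The AND-gate task, the learning rate, and the threshold are illustrative choices for this sketch, not anything from the text.

```python
def neuron(inputs, weights, threshold=0.5):
    """A single artificial neuron: sum the weighted inputs and 'fire'
    (output 1) if the activation reaches the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

def train_step(inputs, weights, target, lr=0.1):
    """Classic perceptron rule: nudge each weight (synapse) in the
    direction that reduces the output error."""
    error = target - neuron(inputs, weights)
    return [w + lr * error * i for w, i in zip(weights, inputs)]

# Learn the logical AND function from labelled examples.
weights = [0.0, 0.0]
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
for _ in range(20):                     # repeated exposure adjusts the synapses
    for x, target in examples:
        weights = train_step(list(x), weights, target)

print([neuron([a, b], weights) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# [0, 0, 0, 1] -- the neuron has learned AND
```

A production neural network stacks millions of such units and replaces the step function with smooth activations so the errors can be propagated backward, but the principle of learning by adjusting weights is the same.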
AI and Machine Learning
Within the broad field of AI, ML will have the most immediate impact. It has the potential to enable intelligent decision-making either in support of human intelligence or in place of it. Businesses will use ML to perform tasks to achieve a level of accuracy and efficiency beyond the capabilities of human workers. But putting decisions in the hands of intelligent machines has profound ethical and legal implications. (We will explore the legal implications in Chapter 7.) Although AI/ML already make intelligent interventions on behalf of humans (such as voice-activated personal assistants), there is obviously much AI work to be done before machines are given full agency. The insights generated by ML will help businesses better understand customer expectations and market trends, enabling automated, personalized engagements. AI/ML will help in the creation of new goods and services, designed to meet the demands of modern consumers. AI/ML will empower business operations through analysis and strategic input. In the automotive industry, ML is the driving force behind autonomous vehicles. It can help the telecommunications industry identify and address network faults and enable financial services institutions to profile consumers more accurately. AI/ML will control customer service chatbots, provide marketing insights, identify cybersecurity vulnerabilities, enable personalized products and services, and facilitate an attorney’s ability to implement smart contracts. As we shall see, AI/ML combined with the IoT and blockchain provides significant potential for a historical transformation. There’s little doubt that the impact of AI on the business enterprise will be profound. So why aren’t we seeing more ground-breaking ML-powered products, services, and business models hit the market today? The answer, as with the IoT, is that maximizing the business benefits of ML is more challenging than it seems.
To get from proof of concept (PoC) to full-scale production implementations will take years. SQL, the most popular data store technology to date, surfaced in the early 1980s but took nearly 10 years to become the data store of choice. It has remained so for more than 30 years. As with all new technologies, adoption is an incremental process. To exploit the true value of AI/ML in the real world, the enterprise must do the following:
- Recognize opportunities for AI combined with the IoT, blockchain, and off-chain data (such as supply chain and other applications) that will yield a strong enough potential return on investment to justify initial development efforts.
- Attract, develop, and retain talented multidisciplined developers to build platforms and applications that integrate AI and ML with the IoT and blockchain.
- Foster AI: begin to accumulate and store data, both internal and external, structured and unstructured, and integrate it with IoT sources and blockchain implementations.
- Set aside budget to develop PoCs for new and applicable use cases.
- When mature and where applicable, apply AI combined with IoT and blockchain to existing infrastructure and capabilities.
- Build a team of tech-savvy attorneys and financial staff to understand the emerging global legislation and regulation around AI.
- Consider the ethical and legal issues regarding implementations of AI combined with the IoT and blockchain before releasing them for public consumption.
Initially, implementing and exploiting AI in the enterprise has proven challenging. But the benefits of doing so are clear. Businesses have little choice but to find ways to infuse their business models and processes with ML, or risk falling behind their more agile competitors.
IoT: The Neurons and Senses
In our analogy, IoT is the human nervous system. The human brain, supported by the nervous system, comprises billions of connected neurons; IoT consists of billions of connected physical devices. These devices are connected to the Internet, and they collect and share data. Pretty much any physical object can be transformed into an IoT device if it can be connected to the Internet and controlled that way. The central nervous system has a protocol network similar to the IoT network, sending prompts to and from itself via neuron structures called dendrites and axons. Dendrites, as shown in Figure 1–10, bring information to the cell body, like a message arriving from an AI or smart contracts application. Axons take information away from the cell body, as when an IoT device formulates a response to an AI or smart contracts application. Information from one neuron flows to another neuron across the synapse. Our IoT devices are like components of the peripheral nervous system: they relay information, like nervous system axons, to and from the IoT server. By delivering messages via the IoT network, they provide AI with status updates about our devices and their states. In a biological neuron, the dendrites receive inputs, which are summed in the cell body and passed on to the next biological neuron via the axon, as shown in the figure. Similarly, IoT platforms receive multiple inputs, apply various transformations and functions, and provide output.
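The summing-and-firing behaviour described above is exactly what an artificial neuron models. Here is a minimal sketch; the weights, bias, and inputs are chosen arbitrarily for illustration:

```python
import math

def neuron(inputs, weights, bias):
    """Sum weighted inputs (the 'dendrites') and pass the result
    through a sigmoid activation before sending it on (the 'axon')."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Three sensor readings feeding one neuron
output = neuron([0.5, 0.2, 0.9], [0.4, -0.6, 1.1], bias=0.1)
print(round(output, 3))  # ≈ 0.763
```

Just as an IoT platform aggregates many device inputs into one output, the neuron collapses many weighted signals into a single value passed downstream.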
From a seemingly straightforward biological system emerges something much more profound: a brain and neural network that can develop works of art, play and hear music, cook and taste wonderful food, perform great athletic feats, and so much more. When engineers re-create this biological system electronically, AI will emerge. This process is accelerating as we speak. Today, IoT sensors, with more than 23 billion connected devices around the world, are recording new data. It is predicted that there will be 75 billion connected devices by 2025 (see https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/). So how do we identify, locate, and connect all the things in the IoT? The Internet Protocol (IP) provides an identification and location system for computers on networks and routes traffic across the Internet. IPv6, currently the most recent version, was developed to deal with the long-anticipated problem of IPv4 address exhaustion. The IPv4 pool of addresses is 32 bits (2³²) in size and contains 4,294,967,296 addresses. The IPv6 address space is 128 bits (2¹²⁸) in size and contains 340,282,366,920,938,463,463,374,607,431,768,211,456 addresses. This should provide enough IP addresses for every device in the world, and it is a necessary step for the IoT to scale. The area of IoT that was initially most interesting to business and manufacturing is machine-to-machine (M2M) communication, but the emphasis is now on filling our homes and offices with smart devices, transforming IoT into something that’s relevant to a larger base of consumers. As mentioned, there are already more connected things than people in the world. The business side of this technology is occasionally known as the Industrial IoT, and its benefits depend on the implementation. The key is that enterprises should have access to more data about their own products and their own internal systems and should therefore have the ability to make productive changes as a result.
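The address-space arithmetic above is easy to verify directly:

```python
# Address-space sizes cited above, computed from the bit widths
ipv4_addresses = 2 ** 32    # 32-bit IPv4 address space
ipv6_addresses = 2 ** 128   # 128-bit IPv6 address space

print(ipv4_addresses)   # 4294967296
print(ipv6_addresses)   # 340282366920938463463374607431768211456
```

The jump from 32 to 128 bits is not a 4x increase but a 2⁹⁶-fold one, which is why IPv6 can assign a unique address to every conceivable device.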
Manufacturers are adding sensors to the components of their products so that they can transmit back data about how they are performing. This can help companies identify when a component is predicted to fail so that it can be replaced before it causes damage. Companies can also use the data generated by these sensors to make their systems and their supply chains more efficient. Uber, for example, uses each driver’s mobile phone data (see https://venturebeat.com/2018/09/05/uber-now-uses-a-phones-sensorsto-detect-if-youve-been-in-a-crash/) to verify user feedback and send alerts regarding a crash or accident. Enterprise use of the IoT can be divided into two segments: industry-specific offerings, such as sensors in a generating plant or real-time location devices for healthcare, and IoT devices that can be used in all industries, such as smart air conditioning or security systems. Many of these innovations could have major implications for our personal privacy as well. Security and privacy are issues with the IoT as well as AI. Sensors often collect extremely sensitive data, such as what you say and do in your own home, and keeping that secure is vital to consumer trust. IoT bridges the gap between the digital world and the physical world, which means that hacking into devices can have dangerous real-world consequences. Hacking into the sensors controlling the temperature in a power station, for example, could trick the operators into making a catastrophic decision; taking control of a driverless car could also end in disaster. And badly implemented IoT products could easily open up corporate and government networks to attack by hackers. With regard to today’s IoT, more things could yet be connected beyond what current users have done with cars, supply chains, and medical hardware. The potential connections extend to the edge of infrastructures as well as to all things and people.
As networks expand with the enabling of new components, we will see existing businesses with their fingers on the pulse of technology, as well as entirely new businesses, capitalizing on emerging opportunities to provide services and new modes of operation. Enabling real-time communications connections with or within nearly any thing will require reductions in power consumption. The goals will be to cut the energy costs of logic and the energy costs of the connections themselves, including the sensors that connect to and measure the physical world’s attributes and the near-zero-power radiofrequency identification (RFID) chips that wirelessly connect that information to the outside world.
IoT Components
Every tech revolution is propelled by the emergence of a new class of enabling components. The progress in compute-communicate energy efficiency during the past 50 years is unprecedented. That said, we need additional gains in power efficiency for logic and for data-exchanging wireless connections to provide coverage, so that the edges of the networks can be attached to individual components inside any machine and to the components of the real world around us.
IoT Networks
The architecture of the IoT ecosystem requires networks to transport data as information back and forth to remote users and devices using the cloud (see Figure 1–11). We need new API standards, protocols, and designs to operate within the current and perhaps newly designed cellular 5G networks. As these new IoT designs and standards emerge for transporting data, they will be developed to optimize resources to use a small fraction of the standard cell bandwidth. These new low-power standards will require lower speeds and power chips. Entirely new networks will emerge to be optimized specifically for the different characteristics of IoT connections — low power, low bandwidth, and use of frequencies with immunity to interruptions and tampering. This is very important for gathering information from and controlling real-time activities in the physical world (such as for autonomous vehicles), where there can be no tolerance for dropped connections.
IoT Devices: Sensors and Semiconductors
IoT devices can also actively measure attributes and states of a thing or person, such as temperature, vibration, velocity, location, personal health metrics, and so on. Today’s sensors are small, relatively inexpensive, and able to detect and measure using very little energy. The accelerometer in a mobile phone, for example, detects when we tilt the device. The opportunities for real-time monitoring and diagnostics are critical to emerging real-world industries, especially supply chain applications. These embedded chips can obtain power from the surrounding environment: vibration, noise, light, heat, and even ambient RF fields create energy in our environment. The sensors and semiconductor logic are already here, are relatively inexpensive, and are deployed to make connected cities, homes, and vehicles a reality. Semiconductor sales into IoT applications are running at $18 billion a year and growing at a rate of 20 percent annually (see https://www.eetimes.com/document.asp?doc_id=1330422).
IoT Integrated Circuits
The core of the IoT requires a near-zero-power integrated circuit (IC) and a near-zero-power RFID chip. The chip is enabled by other devices that wirelessly send it power: an external source, or reader, beams radiofrequency energy at an unpowered chip, animating the chip so it can send a radio signal back to the reader. An RFID chip needs no battery, wires, or energy harvester on board. (For example, consider the card-enabled toll system used in vehicles.) RFID chip technology is well suited for determining a thing’s identity, location, and authenticity. RFID is useful for tracking and monitoring food spoilage and safety and for tracking boxes in warehouses or trucks in the world’s supply chains. To this end, companies such as Impinj manufacture next-generation ultra-efficient RFID chips and the associated readers, as well as cloud-based software and analytics to help make sense of the associated flood of data. Other RFID companies include NXP Semiconductors, Alien Technology, AMS, Phychips, and Zebra Technologies.
IoT Message Protocols
IoT devices work by pulling data from users, either through input devices such as touch screens or through sensors used for motion detection, temperature, humidity, pressure, and so on. This data is sent to the data servers for storage and processing, and the resulting information is provided to the end-user devices for analysis and control. IoT and connected devices use different communication and messaging protocols at different layers (see Figure 1–11). The selection of the protocol used during development of an IoT device depends on the type, layer, and function to be performed by the device. Message Queuing Telemetry Transport (MQTT) and Constrained Application Protocol (CoAP) are two of the most widely used communication protocols for the IoT application layer. MQTT is an M2M publish-subscribe–based messaging protocol that is used to communicate device data to the servers. Its main purpose is to manage IoT devices remotely, especially when a huge network of small devices needs to be monitored or managed via the Internet, such as parking sensors, underwater lines, energy grids, and so on. MQTT messages are sent asynchronously through a publish-subscribe architecture. The messages are encapsulated in several defined control packets, which are designed to minimize the network footprint. MQTT features very low bandwidth usage over TCP/IP, with no metadata and three quality of service (QoS) levels depending on the importance of the data being transferred. CoAP uses the familiar REST (Representational State Transfer) design pattern, which helped make the web so successful. Using REST, servers make resources available at a uniform resource identifier (URI), the string of characters that unambiguously identifies a particular resource, such as the familiar https://www.whatever. Clients access the resources using methods including GET, PUT, POST, and DELETE. CoAP features a REST client-server document transfer using HTTP for web integration and metadata to differentiate document types.
CoAP uses the User Datagram Protocol (UDP), an alternative communications protocol to Transmission Control Protocol (TCP), to establish low-latency and loss-tolerating connections. CoAP uses Datagram Transport Layer Security (DTLS) to secure UDP applications, enabling them to communicate while preventing eavesdropping, tampering, and message forgery.
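The publish-subscribe pattern that MQTT is built on can be illustrated with a minimal in-memory sketch. The `Broker` class and topic names below are illustrative, not the API of any real MQTT library; a production broker adds QoS levels, retained messages, and TCP/TLS transport:

```python
from collections import defaultdict

class Broker:
    """Toy in-memory broker illustrating MQTT-style publish-subscribe:
    publishers and subscribers never talk directly, only via topics."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # A real broker delivers asynchronously; we deliver in-line
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = Broker()
readings = []
broker.subscribe("sensors/parking/lot1", lambda t, p: readings.append(p))
broker.publish("sensors/parking/lot1", {"occupied": 12})
print(readings)  # [{'occupied': 12}]
```

Decoupling sender from receiver is what lets a huge fleet of small devices be monitored without each device knowing who consumes its data.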
IoT Servers and Data Stores
Network infrastructure expansion emerges from and is driven by data traffic. M2M connections are the fastest growing contributors to total overall Internet traffic. By 2020, machine-related data is anticipated to grow 600 percent (see https://www.itu.int/dms_pub/itu-r/opb/rep/r-rep-m.2370-2015-pdf-e.pdf). Growth of the IoT will require new enterprise-class datacenters (see http://datacenterfrontier.com/internet-things-may-create-new-breed-data-centers/). The IoT requires its own servers and accompanying operating system (OS) — that is, an OS optimized for monitoring events and attributes of things such as packages in warehouses or drones. The IoT server OS (see Figure 1–12) will require software for security to authenticate and authorize users and devices that access and provide data. IoT servers will contain rules engines that provide the ability to build intelligence into the IoT network. The server enables a process broker to send different commands, alerts, or data to the different devices and user clients based on the messages received by the broker. The rules can be defined based on the type and topic of a message, or timer rules can send commands, alerts, or data messages to a topic at a given date and time. The IoT server will provide systems management for the management of shadow data stores, registration of devices, and device data analytics. The M2M Internet will require security solutions embedded in the chips at the edges of networks, using end-to-end encryption inside the IoT chip and synchronized with its network via the IoT server.
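The rules-engine idea described above can be sketched as follows. The `RulesEngine` class, topics, and command names are hypothetical illustrations, not the API of any real IoT server product:

```python
class RulesEngine:
    """Toy IoT-server rules engine: each rule pairs a message topic
    with a condition and the command to dispatch when it fires."""
    def __init__(self):
        self.rules = []

    def add_rule(self, topic, condition, command):
        self.rules.append((topic, condition, command))

    def process(self, topic, message):
        """Return the commands triggered by one incoming message."""
        return [command
                for rule_topic, condition, command in self.rules
                if rule_topic == topic and condition(message)]

engine = RulesEngine()
engine.add_rule("warehouse/temp", lambda m: m["celsius"] > 30, "alert/overheat")
engine.add_rule("warehouse/temp", lambda m: m["celsius"] < 0, "alert/freeze")
print(engine.process("warehouse/temp", {"celsius": 35}))  # ['alert/overheat']
```

A real rules engine would add timer-based rules and route the resulting commands back out through the broker to devices and user clients.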
IoT servers will evolve to perform all of the necessary services required. There is much more to know about many objects other than location in real time. In fact, an entire constellation of physical attributes of things is the realm of IoT servers, where we expect the knowledge of things to become useful and actionable. The IoT server accelerates the opportunities to manage physical events in real time, in autonomous vehicles as well as in manufacturing and supply chain systems. Data about things, once collected and analysed, requires information and instructions that flow back to where the things reside in the physical world. When it comes to the IoT and safety, we must consider the speed of light, which determines how long the round trip is going to take: from acquiring information at the edge of the network, to the cloud-based server nodes, and then back to the edge with the result or instruction. Data centres and IoT servers will necessarily have to move closer to the edges of the network. Remote sites that were chosen for inexpensive real estate and power will not support certain applications. Hence, we can expect a proliferation of edge data centres and IoT servers. Telecommunication companies will be important partners as data centre functions push out toward the IoT edges. That said, we are still in the early days of the IoT, but both business and venture bets point to a surge in growth once new technologies take hold.
IoT Trends
As these components mature and become cost effective, the IoT will transform a world of things into a world of data about things. The data will be aggregated and securely stored on the blockchain. As discussed, practically anything can be equipped with a sensor and made smart. The public sector, manufacturing, transportation, automotive, consumer goods, and even healthcare will never be the same once the IoT, augmented by AI and blockchain, takes hold. The combination of these technologies provides opportunities to extract new data and improve existing business processes, bring innovative products and services to market faster, and gather new information on consumer trends and preferences. A host of IoT smart household appliances and personal electronic devices will transform the consumer goods industry to create a new user experience and provide retailers with a massive amount of useful data. But the impact of the IoT will be felt far beyond the home.
With AI and deep learning, it is possible to train machines by making them experience different situations. As the training algorithms get better, they can be used effectively in IoT. IoT operations generate a lot of data, stored on the blockchain and in off-chain repositories such as SQL databases and key-value (NoSQL) data stores. SQL databases predefine the data structure in the database as a series of tables containing fields with well-defined data types, such as string, integer, decimal, and date. Exposing the data types allows for a number of optimizations. In contrast, key-value, or NoSQL, data stores can have different fields for every record. This offers flexibility and more closely follows modern concepts such as object-oriented programming. Using AI and IoT platforms, we can collect and analyse all forms of data. The data is the most important factor, because it enables us to recognize meaningful patterns, and by analysing these data stores, AI and the IoT can make decisions that are difficult even for humans. With full-fledged AI, the capability of IoT platforms will grow exponentially. To achieve the full benefit of the IoT, not only is the speed of analysis important, but also the accuracy of analysis. The decisions made by IoT platforms will improve over time as the AI platform learns with experience. Predictive maintenance is an area of interest to many manufacturers. The endeavour is always to reduce downtime and increase machine availability without increasing expenditures on maintenance. For example, SK Innovation, a leading South Korean oil refiner, is using AI and the IoT to predict the failure of connected compressors. And AI-based systems have helped save Google 40 percent of data centre cooling costs by predicting temperature and pressure, thereby limiting energy usage.
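The contrast between a predefined SQL schema and a schemaless key-value store can be shown with Python’s built-in `sqlite3` module and a plain dictionary standing in for a NoSQL store; the table and field names are illustrative:

```python
import sqlite3

# SQL: the schema is declared up front, with typed columns
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device TEXT, celsius REAL, taken TEXT)")
db.execute("INSERT INTO readings VALUES (?, ?, ?)",
           ("pump-1", 71.5, "2019-06-01"))
row = db.execute(
    "SELECT celsius FROM readings WHERE device = 'pump-1'").fetchone()

# Key-value: every record can carry a different set of fields
kv_store = {
    "pump-1":  {"celsius": 71.5, "taken": "2019-06-01"},
    "valve-9": {"open": True, "firmware": "2.4"},  # different shape, same store
}

print(row[0], kv_store["valve-9"]["open"])  # 71.5 True
```

The typed table lets the engine optimize queries; the dictionary accepts whatever shape each device reports, which suits heterogeneous IoT data.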
Major IoT platforms such as Azure IoT and AWS IoT already incorporate ML for predictive capabilities, and going forward, deep-learning capabilities will become more commonplace. In the automotive industry, the IoT is helping manufacturers make connected, autonomous, shared, and electric (CASE) vehicles a safe, workable reality. Insurers can use the data produced by these connected vehicles to monitor driving habits, develop personalized coverage options, and process claims accurately. And manufacturers can create connected smart factories capable of monitoring equipment health, minimizing production costs and downtime, and maximizing productivity. There’s clearly great potential in IoT devices generating in excess of 14 zettabytes (14 trillion gigabytes) of data every year. Gartner predicted (https://www.zdnet.com/article/iot-devices-will-outnumber-the-worldspopulation-this-year-for-the-first-time/) that by 2020, consumers and businesses would spend $2.9 trillion on devices. In addition, spending on IoT services in 2017, covering professional, consumer, and connectivity services, reached $273 billion. Thus far, because of a lack of trained developers and immature development tools, IoT projects have been difficult to implement, and they have thus been underutilized, with only a fraction of that data currently analysed and put to practical use. To realize the business benefits of the IoT, the enterprise must do the following:
- Create an ecosystem of IoT-enabled devices.
- Use data science to source, store, and manage all the potentially relevant data.
- Develop analytics and ML capabilities.
- Create new IoT applications that exploit the data collected using analytics and ML.
- Integrate the IoT into existing applications and workflows.
- Deploy end-to-end security to avoid tampering and interruption of data flow.
- Monitor, manage, and iteratively adjust the entire value chain.
As you shall see in this book, with so many opportunities for the IoT, the challenge for the enterprise is to build on existing technology, infrastructure, and capabilities to accelerate time to value and to minimize the cost and complexity of IoT implementation. The benefits for those that succeed will be significant indeed.
Blockchain: The Memory
IoT feels, and AI thinks. Blockchain, meanwhile, remembers. Blockchain is best known as the technology that underpins bitcoin and other cryptocurrencies, but it provides much more functionality and value than that. As mentioned, blockchain is a digitized, encrypted, decentralized database/ledger of transactions. The transactions are replicated across multiple computers and linked to one another to make any tampering with records virtually impossible. This immutable way of managing records eliminates the need for any central entity managing the transactions. Think of blockchain as the foundation of high-trust computing; it brings reliability, transparency, and security to all manner of data exchanges, whether financial transactions, contractual and legal agreements, or changes of ownership. A blockchain uses a distributed peer-to-peer (P2P) network to keep an unalterable record of every exchange, removing the need for trusted third-party intermediaries in digital transactions. The resulting value is faster processes, real-time transaction visibility, and reduced costs across every industry. From a technical point of view, blockchain is a distributed, trustless, transparent, immutable, consensus-validated, secured, cost-reducing technology (see Figure 1–13). The blockchain is distributed because a complete copy lives on as many nodes as exist in the system. The blockchain is immutable because none of the transactions can be changed. The blockchain is consensus validated (for example, in the bitcoin space) by the miners who are compensated for building the next secure block, or validated by a consensus algorithm, which is described in Chapter 2.
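The linking that makes tampering evident can be sketched in a few lines. This toy `make_block` function illustrates the hashing principle only, not the data layout of any real blockchain:

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """Link a block to its predecessor by hashing the previous
    block's hash together with this block's transactions."""
    body = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
block2 = make_block(["bob pays carol 2"], prev_hash=genesis["hash"])

# Tampering with the genesis transactions changes its hash,
# breaking the link that block2 stores
tampered = make_block(["alice pays bob 500"], prev_hash="0" * 64)
print(block2["prev"] == genesis["hash"],
      block2["prev"] == tampered["hash"])  # True False
```

Because each block embeds its predecessor’s hash, altering any historical record invalidates every block that follows it, which is what makes the ledger effectively immutable once replicated across many nodes.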
Blockchain Types
There are three primary types of blockchains that serve different purposes and provide unique benefits: public blockchains, consortium blockchains, and private blockchains.
Public Blockchains
Public blockchain creators envisioned a blockchain available to all, where transactions are included if and only if they are valid, and where everyone can contribute to the consensus process. The consensus process determines what blocks get added to the chain and what the current state is. On public blockchains, instead of using a central server to store data, the blockchain is secured by cryptographic verification supported by incentives for the verifiers (miners). Anyone can be a miner who aggregates and publishes those transactions. In the public blockchain, because no user is implicitly trusted to verify transactions, all users follow an algorithm that verifies transactions by committing software and hardware resources to solving a problem by sheer force — that is, by solving the cryptographic puzzle to find the next block. The miner who reaches the solution first is rewarded, and each new solution, along with the transactions that were used to verify it, forms the basis for the next problem to be solved. In 2019, proof of work (PoW) and proof of stake (PoS) were the most commonly used verification concepts.
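The brute-force puzzle at the heart of PoW can be sketched as follows. Real networks hash full block headers and adjust difficulty dynamically; this toy version simply searches for a nonce whose hash carries a chosen number of leading zeros:

```python
import hashlib

def mine(block_data, difficulty=4):
    """Brute-force a nonce until the block hash starts with
    `difficulty` zero hex digits -- the 'cryptographic puzzle'."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block 42: alice pays bob 5")
print(digest.startswith("0000"))  # True
```

Finding the nonce is expensive, but any node can verify the winner’s claim with a single hash, which is why the scheme works without any trusted party.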
Consortium Blockchains
A consortium blockchain, such as Corda or Hyperledger Fabric, is a distributed ledger in which the consensus process is controlled by a preselected set of nodes — for example, a consortium of nine financial institutions, each of which operates a node, and of which five (as with the U.S. Supreme Court) must sign every block in order for the block to be valid. (See https://docs.corda.net/releases/release-M8.2/key-concepts-consensus-notaries.html for more on the consensus.) The right to read the blockchain may be public or restricted to the participants, and hybrid routes may exist, such as the root hashes of the blocks being public together with an API that enables members of the public to make a limited number of queries and get back cryptographic proofs of some parts of the blockchain state. These sorts of blockchains are distributed ledgers that may be considered partially decentralized.
Private Blockchains
The types of openness and pseudonymity that exist on a public blockchain usually aren’t suitable for transactions among business entities. For numerous reasons, including regulatory and security concerns, most organizations need to know who they’re dealing with, and they must also ensure that unauthorized participants cannot gain access to transaction data, which could contain sensitive corporate information. Even within the world of private blockchains, it is important to consider what degree of privacy is necessary and useful. For example, a strictly private blockchain run and maintained by a single entity within a single organization has limited use. Blockchain networks become more valuable when more organizations participate to share and transact data. But these organizations can participate only when they have been granted permission to do so. This type of permissioned network among a known set of participants is a private blockchain. In a private blockchain, consensus is usually achieved through a process called selective endorsement. It is based on the concept that network participants have gained permission to be there and that the participants involved in a transaction are able to confirm it. A blockchain using this type of consensus can be built with a more modular architecture, and it can allow for greater transaction volume at faster speeds. Endorsers are determined by the governance and operating rules for the network. As with all blockchains, private blockchains employ the recording mechanism of grouping transactions into blocks and linking blocks together into an immutable chain. But in a business context, when it is necessary to protect sensitive corporate information and customer data, it is important to secure the blockchain with additional measures. Private blockchain networks do the following:
- Ensure separation between entities, providing horizontal protection.
- Prevent attacks through privileged user accounts, providing vertical protection.
- Protect encrypted data by securing the cryptographic keys.
The performance of a private blockchain depends on the design of the network and its systems infrastructure. Because selective endorsement doesn’t require anywhere near the amount of power that PoW requires, a private blockchain can process much higher transaction volumes at higher speeds with far fewer computational resources. Most private blockchains start small as a single new business ecosystem, but they need room to grow to handle the number of internal and external partners that will be involved in the network.
Comparing Blockchain Types
It is important to draw a distinction between public, consortium, and private blockchains. Even “old school” distributed ledger adoptions that prefer a traditional centralized system can have cryptographic auditability attached. Compared to public blockchains, private blockchains have a number of advantages. The private blockchain operator can change the rules of the blockchain; if it is a blockchain among financial partners and errors are discovered, the partners will be able to change transactions. Likewise, they will be able to modify balances and generally undo anything, because there is an audit trail of all transactions. In some cases, this functionality is necessary — for example, with a property registry, if a mistaken transaction is issued or some nefarious person has gained access and made himself the new owner. On a private blockchain, transactions are less expensive, because they need to be verified only by a few nodes that can be trusted to have very high processing power. Public blockchains tend to have more expensive transaction fees, but this will change as scaling technologies emerge and public-blockchain costs decrease to create an efficient blockchain system. Nodes can be trusted to be very well connected, and faults can quickly be fixed by manual intervention, allowing the use of consensus algorithms that offer finality after much shorter block times. Improvements in public blockchain technology, such as Ethereum PoS, will bring public blockchains much closer to the “instant confirmation” ideal. The latency difference will never disappear, because, unfortunately, the speed of light does not double every two years as posited by Moore’s law, which we reviewed earlier in the chapter. If read permissions are restricted, private blockchains can provide a greater level of privacy. Given all of this, it may seem like private blockchains are unquestionably a better choice for institutions.
However, even in an institutional context, public blockchains still offer substantial value. In fact, much of that value lies in the philosophical virtues their advocates have been promoting all along, chief among them neutrality and openness. Because public blockchains are open, they are used by many entities, and this creates network effects. If asset-holding systems and a currency reside on the same blockchain, a smart contract can cut costs to near zero: Party A sends the asset to a program that immediately delivers it to Party B once Party B sends the program the money, and the program is trusted because it runs on a public blockchain. Note that for this to work efficiently, two completely heterogeneous asset classes from completely different industries must be on the same database. The same pattern can be used by other asset holders, such as land registries and title insurance companies.
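The asset-for-payment exchange just described can be sketched in a few lines of Python. This is a toy model under invented names and amounts, not a real smart contract or blockchain API; it only shows why the two legs settling together removes counterparty risk.

```python
class AtomicSwap:
    """Toy model of the exchange described above: Party A escrows an
    asset, Party B escrows payment, and both legs settle together or
    not at all. Illustrative only; not a real blockchain API."""

    def __init__(self, asset: str, price: int):
        self.asset = asset
        self.price = price
        self.asset_deposited = False
        self.payment_deposited = 0

    def deposit_asset(self) -> None:
        # Party A sends the asset to the program.
        self.asset_deposited = True

    def deposit_payment(self, amount: int) -> None:
        # Party B sends the program money.
        self.payment_deposited += amount

    def execute(self):
        # Settlement happens only when both sides have delivered.
        if self.asset_deposited and self.payment_deposited >= self.price:
            return {"asset_to": "party_b", "funds_to": "party_a"}
        return None  # nothing moves until both legs are in place

swap = AtomicSwap(asset="bond-7", price=100)
swap.deposit_asset()
swap.execute()            # None: payment not yet escrowed
swap.deposit_payment(100)
result = swap.execute()   # both legs in place, the swap settles
```

On a public chain, the program itself plays the escrow role that a trusted intermediary plays today, which is where the cost savings come from.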
Blockchains and Smart Contracts
The term “smart contract” was coined in 1994 by computer scientist, legal scholar, and cryptographer Nick Szabo as he theorized about the future of e-commerce in light of the newly born Internet. Szabo argued that contracts and legal agreements follow Aristotelian syllogisms: if some condition is true, then we perform some function. A paper-contract clause such as “if Buyer fulfils such and such conditions, then Seller is obliged to transfer the asset” could thus be replaced with a computer program that automatically executes the terms of the agreement. Computer code is precise and free from fallible human interpretation, and it is tested and validated before it reaches production. Traditional contracts, in contrast, may contain flaws that cause disputes, costing time and money. Furthermore, contracts and agreements are mere words on paper; without an authority willing to enforce them, they are useless. Code, on the other hand, makes our modern world operate efficiently: pre-programmed instructions can move money around, lock doors, and forfeit payments held in escrow without a single policeman or bureaucrat signing an order or threatening punishment. The problem with code, however, is that it can be hacked. A self-enforcing code contract governing funds and property would have to be stored on a computing platform that exclusively controls these assets, and such a platform could never be trusted to be free from unauthorized alteration or attacks. For this reason, even as e-commerce became commonplace, Szabo’s smart contracts remained an interesting but infeasible concept, reserved for future generations to iterate on. The blockchain, however, solves the problem raised by Szabo and his peers: code stored in a blockchain system is freed from the need to physically reside in one single location, and hence it is not under the influence of the owner of that location.
Furthermore, thanks to the public nature of blockchains and their consensus mechanisms, unauthorized alteration of such code is close to impossible. The term “smart contract” as it is used in the context of Ethereum, Qtum, and other Turing-complete blockchain platforms is acceptable but slightly misleading. Ethereum’s smart contracts, written in a language called Solidity, are essentially code, not dissimilar from programs in other languages. Although smart contract code mainly deals with transactions between agents in a blockchain system, and hence can describe and execute some terms of an agreement between such agents, smart contracts themselves are not actual contracts. A contract is a voluntary arrangement between two or more parties that is enforceable by law as a binding legal agreement, and a binding contract generally requires an offer, an acceptance, consideration, and mutual intent. Smart contract code meets few of these criteria: it does not clearly state an offer, nor does it express acceptance and a willingness to be bound by law, so it is not enforceable in any legal context. Third parties, escrow services, and oracles (off-chain functionalities and sensors) could of course be brought into the picture, but they would have to be orchestrated in a way that enables users to interact with them easily and understand what they are doing. At the moment, smart contracts alone, as introduced on existing blockchain platforms, do not meet these requirements. Nevertheless, despite the relative crudity of the technology, smart contracts are by all means a ground-breaking innovation and will most probably serve as a cornerstone of future digitized commerce.
With the advent of Turing-complete blockchains and the IoT, smart contracts can safely and swiftly move assets around, interact with physical objects, and automate many business processes that currently demand vast human resources and time. But to serve as a substitute for traditional paper contracts and the legal relations they dictate, these automated processes have to be orchestrated intelligently and flexibly, and they must be embedded in an interface that enables humans to make sense of them.
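Szabo’s if-condition-then-transfer syllogism can be illustrated with a short sketch. Everything here (the class, the condition names, the parties) is hypothetical, intended only to show the “if Buyer fulfils such and such conditions, then Seller transfers the asset” pattern as code.

```python
from dataclasses import dataclass, field

@dataclass
class ConditionalTransfer:
    """Hypothetical sketch of Szabo's syllogism: once the buyer's
    required conditions are all met, ownership transfers automatically."""
    asset: str
    seller: str
    buyer: str
    fulfilled: set = field(default_factory=set)
    owner: str = ""

    def __post_init__(self):
        self.owner = self.seller  # seller holds the asset initially

    def fulfil(self, condition: str) -> None:
        self.fulfilled.add(condition)

    def settle(self, required: set) -> bool:
        # "If Buyer fulfils such and such conditions,
        #  then Seller is obliged to transfer the asset."
        if required <= self.fulfilled:
            self.owner = self.buyer
            return True
        return False

deal = ConditionalTransfer(asset="deed-42", seller="alice", buyer="bob")
deal.fulfil("payment_received")
deal.settle({"payment_received", "inspection_passed"})  # False: a condition is open
deal.fulfil("inspection_passed")
deal.settle({"payment_received", "inspection_passed"})  # True: the asset transfers
```

The program needs no external enforcement once it runs on a platform the parties cannot unilaterally alter, which is precisely the role the blockchain plays.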
Furthermore, legal formality can’t be disregarded. Efficient smart contract–based agreements will mitigate, to some degree, the need to use the court system for dispute resolution. But for digital smart contracts to effectively replace traditional paper contracts, they will have to meet several requirements:
- Traditional legal documents are complicated enough that they often require trained professionals to interpret correctly. Smart contracts supported by AI will remedy this situation by using specialized natural language to create code, and the UX/UI of a smart contract should be as easy to use and understand as a well-developed software application.
- To be used as a legal instrument, a smart contract must be structured so that it serves as proof of acceptance of certain conditions; it must include the description of an offer, its acceptance, and the mutual intent to be bound by its terms. This also entails managing digital identities and signatures in an officially acknowledged and immutable fashion.
- To accommodate the realities of the business world, smart contracts must be open for management, adaptation, and renegotiation. A party to a smart contract should have the option to waive the other side’s obligations at will, or to be released from its own obligations if ever-evolving circumstances dictate that necessity.
- A smart contract has to encompass the entirety of a contract’s life cycle, allowing its course of action to branch into different scenarios according to user input. It has to allow users to confirm that the other side has or hasn’t met its obligations, and/or it must serve as undeniable proof that said obligations have been satisfied.
For smart contracts to outcompete their analogue counterparts, they’ll have to thoroughly exploit the advantages of code over paper-based text.
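The life-cycle requirement just described can be modelled as a state machine. The following is a minimal sketch under invented state names and transitions; real contract platforms would define their own life cycles.

```python
from enum import Enum, auto

class State(Enum):
    DRAFT = auto()
    ACTIVE = auto()
    AMENDED = auto()
    FULFILLED = auto()
    WAIVED = auto()
    TERMINATED = auto()

class ContractLifecycle:
    """Hypothetical state machine covering activation, renegotiation,
    waiver, and fulfilment, with an auditable history of every step."""

    TRANSITIONS = {
        State.DRAFT:   {State.ACTIVE},
        State.ACTIVE:  {State.AMENDED, State.FULFILLED,
                        State.WAIVED, State.TERMINATED},
        State.AMENDED: {State.ACTIVE},  # renegotiated terms re-activate
    }

    def __init__(self):
        self.state = State.DRAFT
        self.history = [State.DRAFT]    # the contract's course of action

    def transition(self, new: State) -> bool:
        if new in self.TRANSITIONS.get(self.state, set()):
            self.state = new
            self.history.append(new)
            return True
        return False                    # illegal jumps are refused

c = ContractLifecycle()
c.transition(State.ACTIVE)
c.transition(State.AMENDED)   # the parties renegotiate
c.transition(State.ACTIVE)    # amended terms take effect
c.transition(State.FULFILLED)
```

Because every accepted transition is appended to `history`, the record itself serves as proof of how the agreement evolved, one of the requirements listed above.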
Smart contracts could and should be analysable by an AI agent that provides users with insights about their best course of action, the same way a human legal counsel would provide such advice. In addition to the smart contract code, the AI agent creates a legal document, written in natural language, that can be presented in court if this becomes necessary. The ability to have an AI agent translate human-readable agreements into code that can be managed and augmented throughout the contract’s life cycle is being developed as of this writing (see https://www.accordproject.org/).
Benefits of Using Blockchain with the IoT
To summarize, the benefits of using blockchain with the IoT are trust, traceability, and security. Blockchain’s decentralized, open, and cryptographic nature enables people and entities to trust one another and transact P2P. As autonomous systems and devices interact with one another, IoT transactions are exposed to potential security risks, and blockchain technology provides a simple, cost-effective, and permanent record of the decisions made and communicated. Data transactions take place across multiple networks owned and administered by multiple organizations; blockchain can provide a permanent, immutable record so that custodianship can be tracked as data or physical goods move between points in the value chain. Blockchain records are by their very nature transparent: activities can be tracked and analysed by anyone authorized to connect to the network. Security is of critical importance for IoT networks. Imagine a scenario in which hackers attack a smart-city network, not only bringing down all the interconnected processes but also exposing personal data. If the data is exchanged over a blockchain network, the overall security of the IoT network is greatly enhanced. Finally, blockchain and DLT systems use smart contracts: applications that are automatically executed when their conditions are fulfilled. Using smart contracts, actions can be executed across the various entities in a supply chain automatically and immutably, greatly reducing the potential for disputes.
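The “permanent, immutable record” works because each entry cryptographically commits to its predecessor. A minimal illustration follows; it is not a real DLT (there is no consensus layer or network), but it shows why tampering with any record is immediately detectable.

```python
import hashlib
import json

def _hash(block: dict) -> str:
    # Deterministic digest of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class MiniLedger:
    """Minimal hash-chained ledger: each record stores the hash of the
    previous one, so altering any entry breaks every later link."""

    def __init__(self):
        self.chain = [{"index": 0, "data": "genesis", "prev": "0" * 64}]

    def record(self, data: dict) -> None:
        block = {"index": len(self.chain), "data": data,
                 "prev": _hash(self.chain[-1])}
        self.chain.append(block)

    def verify(self) -> bool:
        return all(self.chain[i]["prev"] == _hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = MiniLedger()
ledger.record({"device": "sensor-17", "temp_c": 4.2})
ledger.record({"device": "sensor-17", "temp_c": 4.4})
ledger.verify()                            # True: chain intact
ledger.chain[1]["data"]["temp_c"] = 9.9    # attempted tamper
ledger.verify()                            # False: hashes no longer match
```

A real blockchain adds distributed consensus on top of this chaining, so no single party can quietly rewrite even its own copy.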
Use Cases for the IoT with Blockchain
The use of the IoT with blockchain is already gaining momentum. For example, IBM’s Watson IoT platform integrates well with blockchains backed by Hyperledger Fabric (see https://www.hyperledger.org), effectively combining the two technologies. Similarly, Azure IoT offers good integration with Ethereum, Corda, and Hyperledger Fabric. A number of applications use blockchain in conjunction with the IoT. Blockchain can be used to record and timestamp sensor data so that the data from the sensors cannot be manipulated and can be trusted by all parties to the transaction. In smart cities, multiple entities collect and act on data, which can be trusted easily through the use of blockchains; blockchain can even enable citizens or entities to sell data and get paid in bitcoin. The IoT comprises multiple devices that are authenticated by digital certificates, which can render the devices vulnerable to security breaches. Blockchain can create a digital identity for each device so that it cannot be manipulated, and information about the devices can be dynamically updated, leading to higher scalability. Provenance of a product can also be established using the IoT and blockchain: IoT sensors can be placed on medicine packets to trace and record information on blockchain’s distributed ledger as the packets move from the factory, to the distributor, to the retailers. IoT sensors with blockchain can go a long way toward solving the problem of fake medicines in many emerging markets. Gartner estimates that blockchain could create $176 billion of value-added revenue by 2025, revolutionizing the supply chain, enabling new business models, and disrupting existing ones. Blockchain will prove to be a game-changer in numerous industries and sectors, such as financial services, insurance, e-commerce, healthcare, and human resources: essentially, anywhere digital information is exchanged.
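The medicine-packet scenario boils down to an append-only chain of custody whose digest any party can check. A hypothetical sketch (the packet IDs and actors are invented for illustration):

```python
import hashlib

class ProvenanceTrail:
    """Hypothetical chain-of-custody record for a medicine packet:
    each IoT checkpoint appends a handover event, and the packet's
    fingerprint is a digest of its full history, so any retroactive
    edit becomes visible to anyone holding an earlier fingerprint."""

    def __init__(self, packet_id: str):
        self.events = [("manufactured", packet_id)]

    def handover(self, actor: str) -> None:
        self.events.append(("received", actor))

    def fingerprint(self) -> str:
        return hashlib.sha256(repr(self.events).encode()).hexdigest()

trail = ProvenanceTrail("lot-2291-pkt-044")
trail.handover("distributor")
trail.handover("pharmacy")
original = trail.fingerprint()

# Any party holding `original` can detect a rewritten history:
trail.events[1] = ("received", "counterfeiter")
assert trail.fingerprint() != original
```

On a real deployment the fingerprints would be anchored in a distributed ledger rather than held by one party, which is what makes the history tamper-evident for everyone in the supply chain.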
In the consumer goods sector, blockchain coupled with IoT sensors will provide transparency across the supply chain through asset tracking to enhance accountability, streamline product recalls, and improve consumer trust. In education and research, it will help to ensure that intellectual property rights are recorded and upheld. In finance, blockchain is the fuel powering the financial technology (fintech) revolution. As with IoT and AI, you’d expect blockchain to have been adopted and implemented more widely by now, particularly considering the media hype regarding blockchain and cryptocurrencies. But beyond those agile fintech start-ups, it’s still a comparative rarity.
Barriers to Blockchain Adoption
The problem is familiar: perceived risk and complexity stand in the way of widespread adoption. Barriers to blockchain adoption include the following:
- The cost and availability of compute resources affect its widespread use, though advances in cloud computing will solve this.
- The cost of blockchain miners affects its widespread use. The development of new consensus algorithms will reduce and ultimately eliminate the need for miners.
- Blockchain contracts are currently untested in court. Changes and new curricula in law schools will foster the use and, ultimately, the legal testing of these smart contracts.
- Blockchain must be integrated with existing off-chain data stores and systems of record.

Blockchain is undoubtedly transformative. In fact, much of its impact has yet to be explored, even on a theoretical level. But before the enterprise can discover the outer reaches of blockchain’s potential, these stumbling blocks must be overcome.
The Confluence of AI, IoT, and Blockchain Is Real
Today, even the most advanced technologies are usually reactive rather than proactive. Think of virtual assistants such as Siri, Alexa, and Cortana: give them a command and they’ll respond to it by playing a tune, ordering a product you’ve requested, or placing a call on your behalf. But when powered by transformational technology, these virtual assistants will become much more proactive. In the near future, your virtual assistant might observe that you’re running low on a particular product and suggest that it place an order for you, or it could tell you how to find the best value by adjusting your purchasing habits. IoT data from your refrigerator will determine that your almond milk is running low, ML will work out which retailers sell your preferred brand, and blockchain will ensure that the transaction is processed securely and that you get exactly what you paid for.

Consider U.S. healthcare in its current form, in which the patient decides when she needs to see her doctor, generally only after visible symptoms have appeared or an accident has occurred. The patient must schedule an appointment and then remember the pertinent details of her medical history during the visit. Human beings are, of course, fallible and forgetful. So imagine a nation of sensor-equipped patients, in which ML monitors IoT sensor data and can determine, at an early stage, when something has gone wrong. The patient’s virtual assistant can cross-reference her calendar with her doctor’s calendar and schedule an appointment automatically. And when the patient arrives, blockchain will ensure that she has a secure, accurate digital medical history for the doctor’s reference.

With respect to cybersecurity, new, more stringent regulations such as the GDPR in Europe, the mutating threat of cybercrime, and the increasing value and proliferation of consumer data have made cybersecurity a universally pressing concern.
But even here, the IoT, AI, and blockchain can have a transformational effect. These technologies largely remove the human element from cybersecurity and similar processes. When practically everything is sensor-equipped, log and audit data can be collected in a centralized repository. ML can analyse this data far more quickly and accurately than any human could, make logical decisions, and take autonomous action. And any and all critical evidence is securely recorded via blockchain. This system effectively bypasses the most common causes of data breaches — carelessness, human error, and malicious intervention.
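As a toy stand-in for the ML analysis described above, even a simple statistical test can flag a suspicious spike in centralized log data. The data, function name, and threshold here are invented for illustration; production systems would use far richer models.

```python
import statistics

def flag_anomalies(readings, threshold=2.5):
    """Flag readings more than `threshold` population standard
    deviations from the mean, a minimal stand-in for the ML
    analysis of centralized audit data described above."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # no variation, nothing to flag
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# Login attempts per minute collected from sensor-equipped endpoints;
# the spike at index 6 is the kind of event that triggers autonomous action.
logins = [12, 14, 11, 13, 12, 15, 240, 13, 12, 14]
flag_anomalies(logins)   # flags the spike at index 6
```

The point is speed and consistency: a rule like this runs continuously over every log stream, with no fatigue and no judgment calls, and the evidence it acts on can be anchored in a blockchain record.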
In Conclusion
These three transformational technologies will bring change to our professional and personal lives, to the companies we work for, and to society as a whole. But their impact doesn’t belong to a vague and distant future; as we have hopefully demonstrated, most of these capabilities are available today. The IoT, AI, and blockchain complement one another well and can remove some of the drawbacks each technology exhibits when implemented in isolation. The idea of these technologies working in tandem is not new, but it still needs time and effort to mature, driven by the passion of a new wave of developers. We have yet to envision the full impact of the confluence of these technologies. Now is the time for visionaries to wake up to the potential of this trinity and to look for creative ways of using these technologies in solutions limited not by the technology itself but only by our imaginations. In summary, here are items to review and research when preparing for transformational technologies:
- Focus on the trinity: the IoT, AI, and blockchain.
- Understand that these technologies need and use increasing amounts of data to operate efficiently and accurately.
- Embrace transformational technologies and realize that attaining full potential will require a change in business processes or models.
- Integrate transformational technologies across the entire enterprise, giving your organization the efficiency and agility it needs to compete.
The sensory reach of the IoT, combined with the cognitive power of AI and ML and the immutable memory of blockchain, will ultimately disrupt and rebuild business models and processes, changing every industry, sector, and line of business. The technologically enabled future has arrived, and it’s time to start planning for it. This book will provide you with a comprehensive map and points of interest to help you determine what you need to make the trip to AI, IoT, and blockchain success.