Fordway Blog

How cloud is evolving to support AI

Nov 15, 2018 / by Richard Blanford


Today’s move to cloud is the next logical step in the waves of centralisation and decentralisation that characterise IT. One moment we think the best place for intelligence in the network is at the edge; then technology changes and it moves back to the centre. When client-server came along, terminals became PCs and the mainframe morphed into the database server.

Now, cloud is consolidating data centres and intelligence has moved to the centre, enabling organisations to benefit from economies of scale. However, we’re seeing rapid growth in intelligent devices, such as manufacturing robots, medical diagnostic systems and autonomous vehicles, up to and including self-driving cars – what you might term ‘intelligent client mark 2’. These devices need to process information in real time, so for them the latency of cloud is a problem.

Take an autonomous vehicle, which needs constantly updated information on the obstacles around it, or a robot scanning fruit on a factory conveyor belt and picking off substandard items. They need to make instant decisions, not wait for information to transit six router hops and three service providers to reach the cloud data centre and then do the same on the way back. Add up the delay on every leg of that journey and it’s basic maths: the round trip is simply too slow.
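
To put rough numbers on that maths, here is a back-of-envelope sketch in Python. The per-hop delay, processing times and vehicle speed are illustrative assumptions, not measurements; only the six hops come from the example above.

```python
# Back-of-envelope latency comparison: cloud round trip vs. on-device decision.
# All figures below are illustrative assumptions, not measurements.

HOPS = 6                  # router hops between the device and the cloud data centre
PER_HOP_MS = 8            # assumed forwarding/queueing delay per hop, in milliseconds
CLOUD_PROCESSING_MS = 20  # assumed time for the cloud service to process the request
EDGE_DECISION_MS = 5      # assumed time for an on-device model to make the same decision

cloud_round_trip_ms = 2 * HOPS * PER_HOP_MS + CLOUD_PROCESSING_MS  # out and back
print(f"Cloud round trip: ~{cloud_round_trip_ms} ms")              # ~116 ms
print(f"On-device decision: ~{EDGE_DECISION_MS} ms")

# What that delay means for a vehicle travelling at 30 m/s (roughly 67 mph):
speed_m_per_s = 30
distance_m = speed_m_per_s * cloud_round_trip_ms / 1000
print(f"Distance travelled while waiting on the cloud: ~{distance_m:.1f} m")   # ~3.5 m
```

Even with generous assumptions, the cloud round trip is an order of magnitude slower than a local decision, and the vehicle covers several metres while it waits.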

Having intelligence at the edge is vital for applications that need to act in real time. We're beginning to see increased use of robotics and ‘smart’ devices with embedded systems that apply artificial intelligence to train, manage and optimise device performance. As the use of these technologies develops, the need to pool and share selected information will increase.

Cloud still has many advantages. In the insurance industry, for example, actuaries have traditionally analysed massive amounts of data to enable underwriters to make policy decisions. Here the economies of scale provided by cloud processing offer significant advantages, and there is no major benefit in moving intelligence to the edge.

As this example shows, cloud is good at operating at scale, at training and developing algorithms, and at holding large-scale data stores. Where appropriate, developers of AI applications can then push the decision-making intelligence out to the edge device so that it can act autonomously. A facial recognition system is a good example: you would use cloud to store petabytes of data and train the system on many thousands of photos, then load the resulting algorithm into the camera control system so that the initial recognition happens at the edge. The system can refer back to the data held in the cloud if further confirmation is required.
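
A minimal sketch of that pattern is below. The function names, confidence threshold and dummy values are purely hypothetical placeholders for the model trained in the cloud and the cloud photo store; they are not a reference to any particular product or API.

```python
import random

CONFIDENCE_THRESHOLD = 0.90  # below this, defer to the cloud for confirmation

def run_local_model(image_bytes):
    """Stand-in for the model trained in the cloud and deployed to the camera."""
    return "person_42", random.uniform(0.5, 1.0)  # (identity, confidence) - dummy values

def query_cloud_service(image_bytes):
    """Stand-in for a lookup against the full photo store held in the cloud."""
    return "person_42"

def recognise_face(image_bytes):
    """Try the on-device model first; fall back to the cloud only when unsure."""
    person_id, confidence = run_local_model(image_bytes)
    if confidence >= CONFIDENCE_THRESHOLD:
        return person_id, "edge"      # instant decision, no network round trip
    # Low confidence: consult the much larger dataset in the cloud for confirmation.
    return query_cloud_service(image_bytes), "cloud"

print(recognise_face(b"raw camera frame"))
```

The design point is simply that the common case never leaves the device, while the cloud remains the system of record for the full training data.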


Topics: Insider, Cloud, New Technology