What fuels research and development? Innovation. The act of looking for something new stems from trying to do something better or to fix a current problem; what other reason would there be to develop anything new or to do further research? Below, we look at five new technologies coming of age this decade.
The Internet of Things
The Internet of Things (IoT) refers to the network of increasingly "smart", or internet-connected, everyday objects. It is most often mentioned in relation to the "smart" or automated home (thermostats, refrigerators, security cameras), but its possibilities extend far beyond the home to encompass entire cities, countries and industries: energy, infrastructure, security, healthcare, transport, manufacturing, agriculture and defence are just a few areas where the IoT has immense potential to drive innovation and improvement. One common application within the smart home is the use of internet-connected thermostats and lightbulbs, giving the consumer greater control over their energy usage: heating, hot water or lighting can be switched on or off from a smartphone, or automatically, based on the consumer's usual routine, without any input from the consumer at all. An often-used example is being able to switch on the heating while on the way home from work.
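The routine-based automation described above can be sketched in a few lines. This is a minimal, hypothetical model (the class and method names are invented for illustration, not from any real smart-home API): a thermostat holds a schedule of "someone is home" windows and picks a target temperature accordingly.

```python
from datetime import time

class SmartThermostat:
    """Hypothetical sketch of a routine-based 'smart' thermostat."""

    def __init__(self, comfort_temp=21.0, away_temp=16.0):
        self.comfort_temp = comfort_temp   # target when someone is home
        self.away_temp = away_temp         # energy-saving setback
        self.schedule = []                 # (start, end) windows when home

    def add_routine(self, start: time, end: time):
        # Record part of the consumer's usual routine.
        self.schedule.append((start, end))

    def target_for(self, now: time) -> float:
        # Inside any scheduled window, heat to comfort; otherwise save energy.
        for start, end in self.schedule:
            if start <= now <= end:
                return self.comfort_temp
        return self.away_temp

stat = SmartThermostat()
stat.add_routine(time(17, 30), time(23, 0))   # usually home in the evening
print(stat.target_for(time(18, 0)))   # comfort temperature for arrival home
print(stat.target_for(time(3, 0)))    # setback overnight
```

A real IoT thermostat would replace the hard-coded schedule with routines learned from sensor data and expose `target_for` over the network to a smartphone app, but the decision logic is essentially this simple.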
The concept of the IoT is closely related to the "smart" concept, and encompasses a host of items with inbuilt electronics, sensors or actuators and, most importantly, connectivity, allowing for greater and more direct exchange of data with the wider internet. The use of smart meters in homes for water and energy billing is one example of IoT technology already in widespread use. According to business research firm Gartner, this year the number of connected devices across all sectors and technologies exceeded 20 billion. Amazon's AWS (Amazon Web Services) and Microsoft's Azure Sphere are two of the main platforms currently providing the technology behind this huge and ever-growing network of connected devices, and while there are some ethical concerns, around privacy and security for example, the technology looks set only to grow further.
Artificial Intelligence

Artificial Intelligence (AI) allows machines to perform tasks hitherto limited to humans, such as speech or image recognition. The most recognisable example of an AI application is probably the virtual assistant, with the likes of Apple's Siri, Amazon's Alexa and the Google Assistant now fairly commonplace in people's homes. The impact of AI, though, is being felt far beyond the home, with applications as varied as medical diagnosis, robotics, electronic trading, transportation, the military, healthcare in general, customer services, agriculture and even music composition.
While the ultimate goal of creating computers that can think and communicate as humans do remains relatively far off, the use of applied artificial intelligence will continue to spread into ever more sectors and uses.
Machine Learning

Machine Learning (ML) is a form of data analytics, directly related to artificial intelligence, that gives computer systems the ability to learn without being explicitly programmed. Using learning algorithms, an ML system builds a model from "training data" through supervised, unsupervised or reinforcement learning. Having played a major part in the huge advances made in artificial intelligence over the last few years, ML is a highly significant trend in technology.
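The idea of "learning without being explicitly programmed" can be illustrated with a deliberately tiny example of supervised learning. Nowhere below is the rule "y is twice x" written into the program; the model discovers it from labelled training examples by repeatedly nudging a weight to reduce its prediction error (a simplified sketch of gradient descent, not any particular library's implementation).

```python
# Labelled training data: inputs x with target outputs y (here, y = 2x).
training_data = [(1, 2), (2, 4), (3, 6), (4, 8)]

w = 0.0               # the "model": a single learnable weight
learning_rate = 0.02

# Repeatedly adjust w to shrink the squared prediction error on each example.
for epoch in range(500):
    for x, y in training_data:
        prediction = w * x
        error = prediction - y
        w -= learning_rate * error * x   # gradient step

print(round(w, 3))    # converges close to 2.0: learned, not programmed
```

Real ML models have millions of weights and richer algorithms, but the core loop of "predict, measure error, adjust" is exactly this.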
Nvidia, known primarily for the computer graphics cards popular with gamers and creatives alike, has begun applying its ML expertise to research into how deep learning can teach robots to work alongside humans, supporting their work simply by observing a person carrying out a task. Computer vision, required for applications such as self-driving cars, drones and delivery robots, is being developed using ML, and facial recognition, with security applications including bank verification, is a further use. Companies such as Google, Amazon and Facebook are heavily involved in the development of ML technology.
Blockchain

Blockchain is a technology that creates a public ledger to permanently, transparently and securely record transactions between two parties: essentially a growing list of verifiable records that, thanks to its decentralised, peer-to-peer nature, is resistant to modification by any single party. Best known as the technology that enables cryptocurrencies such as Bitcoin and Ethereum to function, blockchain has obvious applications in the financial industry, but its possibilities go much further. They include cybersecurity, banking (including in countries without a traditional banking infrastructure in place) and the management of supply chains: ensuring the safety of foods, for example, or tracing the source of rare earth metals used in technology such as smartphones to ensure ethical business practices are observed.
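Why a blockchain resists modification can be shown with a minimal sketch (a toy illustration of the core idea only, with none of the real networking, consensus or proof-of-work machinery): each block stores a cryptographic hash of the previous block, so altering any earlier record breaks every later link in the chain.

```python
import hashlib
import json

def block_hash(block):
    # Deterministic SHA-256 fingerprint of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transaction):
    # Each new block records the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transaction": transaction})

def is_valid(chain):
    # Verify every block still points at the true hash of its predecessor.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))          # the untouched chain verifies

chain[0]["transaction"] = "Alice pays Bob 500"   # tamper with history
print(is_valid(chain))          # the modification is immediately detectable
```

In a real blockchain the ledger is replicated across many independent peers, so a tamperer would have to rewrite every subsequent block on a majority of the network simultaneously, which is what makes the record practically immutable.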
Cybersecurity

All of the technologies above will themselves rely on continuous innovation in cybersecurity in order to remain secure and defend against digital attacks, because the hackers are constantly innovating too. Whether accessing or changing personal or sensitive data, extorting money from users or interrupting normal business activity, the threat from hackers is ever increasing. And as our lives become more and more dependent on technology, with the number of connected devices in the world now greater than the number of people, the need for competent cybersecurity looks only set to grow.
Businesses, governments, regulatory authorities, transport networks and utility companies are all heavily dependent on their computer systems and are, as such, at great risk from hackers and cybercriminals. With technologies such as self-driving cars, or internet-connected medical devices such as pacemakers, a compromised device could be physically dangerous or even deadly.
Some of the big companies in IT services, including Cisco, Amazon Web Services and Microsoft, are becoming more focused on cloud-based security as needs evolve, and blockchain technology looks set to play a bigger role in future cybersecurity provision, its decentralised nature offering robust protection of data from potential hackers.
Both the stakes and the potential gains from investing in new technology continue to rise rapidly alongside the pace of development, driving investment and research in the sector, while the new technologies themselves offer fresh opportunities for innovation, in cybersecurity for example. The need, and the opportunities, to invest in research and development are set to keep growing, and the UK government offers a number of tax incentives to encourage investment. For more information, have a look at our R&D portal on our website here.