We're often made to believe that the tech industry releases the cutting-edge products of the future. While to a certain extent this may be true, consumers are sometimes less willing to adopt technological change at first. The recent events at Mobile World Congress in Barcelona serve to remind us that technologies of yesteryear can be re-introduced into the mainstream through a confluence of factors.

Virtual reality typifies this, with the technology first becoming "mainstream" in the 1990s through video games. Sega developed the Sega VR headset in 1991, which was later released as the Sega VR-1 motion simulator, prevalent in 1990s arcades. Today, virtual reality is being heralded as the next frontier of immersive technologies, with increased potential for sharing experiences, learning and enhancing visuals. Currently, most VR headsets require a powerful CPU and GPU or a games console to power the content, still making them a relatively niche product. Expect VR's killer app to be a few years away, until the technology proliferates and integrates with smartphone usage.

Mobile video calls were initially popularised in Japan in the late 90s/early 00s and later spread through the West with the advent of 3G/UMTS networks and increasing connection speeds. Adoption was slow to take off, much like the jittery wireless connections the services ran on, with end users' reluctance evident in issues around pricing, network coverage and usability. The prevalence of Wi-Fi, 4G and smartphones has since greatly aided apps such as Skype, Apple's FaceTime and Google Hangouts. Today, usage extends beyond personal and professional means, with an increasing number of startups leveraging video to offer healthcare, banking services and customer care, amongst others.

Artificial intelligence has been evolving since the very early days of Alan Turing, with the founding stakeholders of the research collaborating at Dartmouth in 1956. Since then, research and development has endured several AI winters, with no fewer than five periods of developmental solitude. AI has perennially been 10–20 years away from breakthroughs orders of magnitude larger than previously realised, and the goalposts have moved consistently: earlier feats such as Deep Blue's victory over Garry Kasparov, once seen as the pinnacle of computing, and more recently DeepMind's victory at Go are now regarded as just some of the many milestones on the road to the consumerisation of this complex field. Evident progress has nonetheless been made in machine learning, deep learning and neural networks. Today, AI has become a buzzword, with a slew of startups promising enhanced user experiences through complex network computations: everything from predictive keyboards, personal assistants and trading to mass medical analysis. Though the technology has reached prevalence once more, it's unclear how much more weak AI, as opposed to complete AI, we will have to endure.

The Internet of Things predicates the increased connectivity of network-enabled devices within physical objects. The philosophical mutterings of the technology have been around for decades, with the spark ignited in the early 2000s through the use of RFID and NFC in devices. IoT experienced a malaise in the subsequent years, though today it is increasingly common within households through devices such as Amazon Alexa, Chromecast, Apple TV and Nest. Much like the early days of the Web, IoT may struggle to achieve the oft-discussed lift-off until there is a unified protocol that allows cross-device interconnection; we continue to wait.

An associated technology, wearables, has been hugely popularised in the last few years by fitness-tracker companies such as Pebble, Jawbone and Fitbit. Pebble were the first to tap into latent consumer demand for watch-based fitness and activity trackers. More recently, Apple have thrown their hat into the ring with the Apple Watch, which is positioned as a complement to the iPhone and priced accordingly. Regardless of the popularity of wearables, they still lack a killer use case, hamstrung by screen size (if any), imprecise data measurements and an inherent dependency on a smartphone for increased functionality.

Smartphones, meanwhile, have heralded the ubiquity of the computing age, with approximately 2 billion handsets globally expected to increase to 6.1 billion by 2020. Users now have more computing power in the palm of their hand than many PCs of the mid-to-late 1990s.