The latest WWDC (Worldwide Developers Conference) aired this week and brought with it lots of exciting new releases!
For those of you who haven’t heard of it before, it is an annual event organised by Apple Inc. that serves as a platform for Apple to showcase its latest software and technologies to developers and the wider Apple community.
As we expected, the main focus of the event was AI, with Apple debuting Apple Intelligence, a new personalised AI system aimed at deeply integrating AI capabilities across its products and services.
Apple Intelligence and ChatGPT
A series of generative AI products and services were revealed on Monday, including Apple Intelligence and a deal with ChatGPT-maker OpenAI. The new tools mark a major shift towards AI for Apple, which until now has integrated fewer AI features into its consumer-facing products than its competitors.
Apple’s new artificial intelligence system involves a range of generative AI tools aimed at creating an automated, personalised experience on its devices. The AI will be integrated throughout the operating systems on its Mac laptops, iPad tablets and iPhones, and will also be able to pull information from and take action within apps.
![Apple mobile apps](https://hedgehoglab.com/wp-content/uploads/2024/06/Apple-mobile-apps.png)
More specifically, Apple is focusing on the following tools:
Writing tools
One example of the new Apple Intelligence in action: if you’re writing an email, you can change the tone to make it sound more friendly, professional or concise. The new writing tools can also help you with proofreading, correcting your grammar, word choice and sentence structure at the click of a button. And if your email contains a lot of information, you can now condense it into a TL;DR covering the main points.
Genmoji
No more limitations on emoji: you can now type into Genmoji what emoji you’d like to create. So if you’d like a blue hedgehog, you’ve now got a blue hedgehog. Apple Intelligence also knows who’s in your photo library, so you can turn your friend into a wizard, for example. Some people on our team are really excited about this feature 😍
Image Playground/Wand
Create incredible images in seconds. Just type a series of descriptive words into the generator and it will produce the image in the middle playground area, ready for you to edit as you like. As mentioned previously, Apple Intelligence understands who the people in your library are, so you can create a personalised image of them for their birthday.
Image Wand can transform a rough sketch you’ve made into a polished image: circle your drawing with Apple Pencil, and it analyses your sketch and creates an image for you.
Siri
The new Siri in iOS 18 is getting smarter with a major AI overhaul that will allow it to function as a conversational AI chatbot, receive written instructions, and take actions within apps based on voice prompts. You can now speak to Siri more naturally, thanks to richer language understanding capabilities, so if you stumble over your words, Siri still understands what you’re getting at.
Its new personal context feature draws from your photos, calendar events, messages and other apps so that you can quickly find the information you need. For example, Siri can now search through the user’s emails, text messages and photo library to find specific information based on the context of the voice request, such as “send photos from the barbecue on Saturday.”
If you don’t want to ask Siri out loud, you can now quickly and quietly type your Siri request.
This naturally made us think about privacy, and Apple has stated that user privacy will be protected – requests sent to ChatGPT will have IP addresses obscured, and OpenAI won’t store the requests.
Introducing Swift Assist
Among the new development tools, we saw the release of Swift Assist.
Swift Assist is a new AI-powered coding assistant integrated into Xcode. It serves as a companion for all coding tasks, allowing developers to explore new frameworks, experiment with ideas, and generate code using natural language prompts. Swift Assist leverages a powerful cloud-based model while keeping code private and secure.
![Hands typing on a laptop](https://hedgehoglab.com/wp-content/uploads/2024/06/Laptop-hands.png)
Let’s hear from our developers:
“I think generative code completion and Swift Assist have definitely excited the community. We’ve seen the convenience and productivity improvements we can get from generative completion in other environments (GitHub Copilot, for example), so it’ll be great to see that in Xcode.”
Owain Brown – Mobile Engineering Manager
“Swift Assist leverages machine learning to provide suggestions to the developers, like error detection and refactoring options which in my opinion will be very helpful not only for the junior devs but also for seniors by extending their options on a given problem.
It features AI-powered code completion trained to provide completions that help developers write code faster. It also introduces refactoring recommendations that help maintain a clean and efficient codebase.”
Philip Konyarov – iOS Developer
New testing framework for Swift
Swift Testing is a new open-source framework from Apple that provides expressive and intuitive APIs for writing tests in Swift. It aims to make testing Swift code easier and more intuitive compared to the existing XCTest framework.
“Swift Testing is not a revolutionary new testing framework, but rather an evolution of Apple’s existing XCTest framework. It provides a more Swift-centric approach to writing tests by leveraging Swift language features and macros to make tests more expressive, readable, and easier to author.”
Owain Brown – Mobile Engineering Manager
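To give a flavour of that more Swift-centric style, here is a minimal sketch of a Swift Testing test using the framework’s `@Test` attribute and `#expect` macro. The `applyDiscount` function is a hypothetical example of ours, not something from Apple’s session:

```swift
import Testing

// A plain function under test – no XCTestCase subclass required.
func applyDiscount(to price: Double, percent: Double) -> Double {
    price * (1 - percent / 100)
}

// Swift Testing marks tests with the @Test macro rather than a "test" name prefix.
@Test func discountIsApplied() {
    // #expect replaces the XCTAssert family and reports the failing
    // expression when the check does not hold.
    #expect(applyDiscount(to: 100, percent: 20) == 80)
}

// Parameterised tests run once per argument value.
@Test(arguments: [0.0, 50.0, 100.0])
func discountNeverGoesNegative(percent: Double) {
    #expect(applyDiscount(to: 100, percent: percent) >= 0)
}
```

Compared with XCTest, there is no class boilerplate, test names are ordinary function names, and parameterised cases replace hand-written loops.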
Embedded Swift
Embedded Swift is a great new addition. It’s a language subset that turns off some features requiring runtime support, such as reflection and `any` existential types, in order to produce binaries suitable for embedded electronics. It aims to extend Swift’s capabilities into embedded systems development.
“This feature allows developers to write Swift code that runs directly on microcontrollers such as Arduino, ESP32 and other resource-constrained devices. It features an optimised runtime for low-power and low-memory devices that ensures efficient performance on devices with limited resources. It seems like it comes with libraries that support many hardware interfaces, such as GPIO, I2C, SPI and UART, that make electronic components ‘talk’ to each other using Swift code. For me it’s a groundbreaking new technology that will help developers interact with embedded technology using modern Swift syntax, and because Swift is open source, this technology will continue to grow with the help of the Swift community.”
Philip Konyarov – iOS Developer
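As a hypothetical sketch of what this might look like, here is a classic LED blink written in the Embedded Swift subset. The `gpio_set_level` and `delay_ms` calls are assumed to come from a board vendor’s C SDK exposed to Swift (for example on an ESP32); they are illustrative names, not part of any standard Swift library:

```swift
// Illustrative Embedded Swift sketch: blink an LED on a microcontroller.
// Assumes gpio_set_level(_:_:) and delay_ms(_:) are C functions supplied
// by the board's SDK and imported into Swift – hypothetical names.

@main
struct Blink {
    static func main() {
        let ledPin: Int32 = 2          // GPIO pin wired to the LED
        while true {
            gpio_set_level(ledPin, 1)  // LED on
            delay_ms(500)
            gpio_set_level(ledPin, 0)  // LED off
            delay_ms(500)
        }
    }
}
```

The point is less the blinking than the shape of the code: no Foundation, no dynamic features, just a `@main` entry point compiled down to a small binary for a resource-constrained device.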
Conclusion
Overall, it was an incredible event, with many exciting new tools revealed that we can utilise across our users’ experience and journey with our products. For all the new updates from the Apple event, click here. If you’d like to hear more from our developers, follow them on LinkedIn by clicking on their names, and follow the official hedgehog lab LinkedIn page to stay up to date with all the developer news and more.