Learn Core ML, Vision, ARKit, Drag & Drop, NFC reading, PDFKit, MusicKit, and more
Note: All finished projects and source code have been updated to Swift 5.0. So even though some lectures are shown using Swift 4.2, just reference the source code for any updates you might need.
What you’ll learn
- Work with Apple’s newest APIs.
Course Content
- Introduction – Welcome –> 1 lecture • 2min.
- What’s new in Swift 4? –> 1 lecture • 13min.
- Quick overview –> 4 lectures • 43min.
- Project 1 – Trade My Tesla –> 5 lectures • 1hr 1min.
- Project 2a – Sightspotter –> 4 lectures • 25min.
- Project 2b – Sightspotter using the Wikipedia API –> 3 lectures • 28min.
- Project 3 – Going Postal –> 5 lectures • 52min.
- Project 4 – Swift Sampler –> 5 lectures • 45min.
- Project 5 – Picture Protector –> 4 lectures • 38min.
- Project 6 – Name That Tune –> 7 lectures • 1hr 17min.
- Project 7a – Image XRay –> 2 lectures • 11min.
- Project 7b – Video XRay –> 4 lectures • 47min.
- Project 8A – iOS Design Techniques –> 4 lectures • 25min.
- Project 8B – Techniques for extending drag and drop –> 2 lectures • 20min.
- Project 8C – Techniques for reading depth in photos –> 2 lectures • 19min.
- Project 8D – Scanning NFC tags –> 2 lectures • 9min.
- Project 8E – Detecting the features of a face with “Vision” –> 2 lectures • 16min.
- Frequent Flyer Club –> 1 lecture • 2min.
Requirements
iOS gives you a whole range of powerful new tools for building intelligent apps, and this course helps you get started with them as quickly as possible: Core ML, Vision, ARKit, and more.
Machine Learning: Take advantage of Core ML to deliver intelligent new apps that can proactively assist your users.
ARKit: Learn how to augment reality with slick interactive graphics thanks to Apple’s new ARKit framework.
Drag & Drop: Let users move data freely inside your app, and between apps too, using powerful new multi-touch gestures.
Plus: NFC reading, PDFKit, Vision, MusicKit, and more!
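To give a flavor of the Core ML work in this course, here is a minimal sketch of loading a compiled model and asking it for a prediction. The model name `TeslaPricer` and its feature names are hypothetical illustrations, not the course’s actual code (Xcode normally generates a typed wrapper class for you instead of this dictionary-based approach):

```swift
import CoreML

// Hypothetical compiled model "TeslaPricer.mlmodelc" bundled with the app,
// taking numeric inputs and producing a single "price" output.
func estimatePrice(mileage: Double, age: Double) -> Double? {
    guard let url = Bundle.main.url(forResource: "TeslaPricer",
                                    withExtension: "mlmodelc"),
          let model = try? MLModel(contentsOf: url) else {
        return nil // model missing from the bundle
    }

    do {
        // Build the input features by hand for illustration.
        let input = try MLDictionaryFeatureProvider(dictionary: [
            "mileage": mileage,
            "age": age
        ])
        let output = try model.prediction(from: input)
        return output.featureValue(for: "price")?.doubleValue
    } catch {
        return nil
    }
}
```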
This course was written by the award-winning Swift programmer Paul Hudson, and was adapted, with his permission and support, from his best-selling book “Advanced iOS: Volume Two” into a lecture-style course. Paul is the author of the Hacking with Swift series of tutorials, one of the most popular Swift coding sites online, and has received high praise from Chris Lattner, the creator of the Swift language, for his outstanding method of teaching. Working together with iOS developer Steve DeStefano, the Hacking with Swift series of programming training videos is simply the fastest way to learn how to code in the Apple ecosystem.
This course incorporates annotations, callouts, diagrams, highlighting, and deep explanations that make complex subject matter much easier to grasp and guide you through the code each step of the way.
Please note: these are intermediate-to-advanced projects, and this course will not teach you the Swift language or how to code. You should already have completed Paul Hudson’s Swift tutorials at HackingWithSwift dot com, or gone through the Hacking with Swift beginner-to-pro course here on Udemy.
Advanced iOS: Volume Two includes seven new projects that teach some of the most powerful features in iOS, plus technique projects that deep-dive into smaller features:
- Project 1: Trade My Tesla – Teach iOS how to evaluate the trade-in price of cars using machine learning.
- Project 2: Sightspotter – Blend augmented reality with Core Location to help tourists find their way around a city.
- Project 3: Going Postal – Create fun postcards using images, text, and colors, all powered by iOS’s new drag and drop APIs.
- Project 4: Swift Sampler – Learn to render PDFs with watermarks and interactive thumbnails, all using PDFKit.
- Project 5: Picture Protector – Build an app that detects faces in photos and lets users interactively blur any they want.
- Project 6: Name That Tune – Use MusicKit to fetch data from Apple Music and build a fast-paced two-player charts game.
- Project 7: Video X-Ray – Let users record a video while the Vision framework silently detects and records all objects in the scene.
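As an example of the drag and drop support that powers a project like Going Postal, a table view can vend draggable items with only a few lines. This is a sketch under assumed names (`PostcardSourceController`, the `phrases` array), not the course’s actual code:

```swift
import UIKit

// Sketch: a table view whose rows can be dragged into this app or another.
class PostcardSourceController: UITableViewController, UITableViewDragDelegate {
    let phrases = ["Wish you were here", "Greetings from Swift"]

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.dragDelegate = self // opt the table view in to drag support
    }

    func tableView(_ tableView: UITableView,
                   itemsForBeginning session: UIDragSession,
                   at indexPath: IndexPath) -> [UIDragItem] {
        // Wrap the row's text in an NSItemProvider so the system
        // can deliver it to whatever accepts the drop.
        let provider = NSItemProvider(object: phrases[indexPath.row] as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}
```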
Along the way you’ll learn all this and more:
- How to use Core ML and the Vision framework to leverage machine learning in your own apps.
- Building both games and apps from scratch using ARKit’s augmented reality technology.
- How to integrate drag and drop into your apps using table views, collection views, and more.
- The all-new PDFKit framework, delivering fast and efficient PDF rendering on iOS.
- Using the new depth-mapping APIs exposed through the iPhone camera.
- Automatically detecting and blurring faces using the Vision framework.
- Reading NFC tags using Core NFC.
- Taking advantage of the new MusicKit APIs to work with Apple Music.
- And more!
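The face detection mentioned above (and used in a project like Picture Protector) comes down to a single Vision request. The following is a sketch, assuming you already have a `UIImage` to analyze; the function name and callback shape are illustrative, not the course’s exact code:

```swift
import UIKit
import Vision

// Sketch: detect face bounding boxes in a UIImage using Vision.
func detectFaces(in image: UIImage, completion: @escaping ([CGRect]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNDetectFaceRectanglesRequest { request, _ in
        // Each observation's boundingBox is in normalized (0...1) coordinates,
        // with the origin at the bottom-left; convert before drawing blur.
        let boxes = (request.results as? [VNFaceObservation])?
            .map { $0.boundingBox } ?? []
        completion(boxes)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```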
If you want to take advantage of some of the most advanced functionality provided in iOS, there’s no faster way than the Advanced iOS series.