Telerik blogs

In this week's podcast, we cover what you should expect when building augmented reality apps on iOS, Android or cross-platform.

On this episode of Eat Sleep Code, TJ VanToll discusses the current state of building iOS, Android, and cross-platform AR applications. TJ shares his insight on what technologies are currently available, how to get started, and what to expect when building mobile AR applications.

You can listen to the entire show and catch past episodes on SoundCloud. Or just click below.

TJ VanToll

TJ VanToll is a frontend developer, author, and a Principal Developer Advocate for Progress. TJ has over a decade of web development experience, including a few years working on the jQuery team. Nowadays, he spends his time helping web developers build mobile apps through projects like NativeScript.

Show Transcripts

If you prefer reading to listening, we've got a complete transcript for you as well. Dig in below:

00:00 Ed Charbeneau: This podcast is part of the Telerik Developer Network, Telerik by Progress.




00:16 EC: Hello, and welcome to Eat Sleep Code, the official Telerik podcast. I'm your host, Ed Charbeneau, and with me today is TJ VanToll III.


00:26 TJ VanToll: Yes. [chuckle]


00:28 EC: We have Ed Charbeneau II.


00:30 TV: Oh man.


00:31 EC: So, we have the royalty of DevRel, right?


00:33 TV: Yeah, you could introduce me as Theodore VanToll. Well, actually Theodore Joseph VanToll III, if I wanna sound especially regal.




00:43 EC: So Mr. TJ, I'll call you TJ for short, you are a co-worker of mine, a fellow Developer Advocate for Progress, Senior Developer... Oh sorry, Principal Developer Advocate for Progress.


00:57 TV: Yes.


00:58 EC: And, I guess, kinda explain to listeners what that entails.


01:02 TV: So, I just work as a Developer Advocate and I primarily focus on NativeScript, which is our tool at Progress here that helps developers build iOS and Android apps using JavaScript. And came to this job just because I have some background as a JavaScript developer. I sort of got started with JavaScript back in JavaScript's dark days and have seen it through to now, where we're doing some pretty cool things with the language.


01:27 EC: The dark days are over?


01:29 TV: Yep, they're over. We're no longer... JavaScript's used for more than form validation nowadays, so...


01:35 EC: I kid, I kid. [chuckle] I love JavaScript, too. So today we're gonna talk about AR development, and we're gonna talk about AR specifically in the context of mobile development. We kinda had a little chat before the show about why we might narrow it down like that, and the gist of it is that there's just a whole lot to talk about.


02:00 TV: Yeah, I mean, AR is an enormous topic. It dates back decades, it's been used in different contexts. There's headsets, there's all sorts of things that people are trying to do with it, but I think the thing you hear about in the news more, sort of one of the latest and greatest things in the AR world, is just AR on iOS and Android. It's just becoming a lot more common, a lot more hip thing for developers to try out and use.


02:28 EC: Now I'd assume most people have heard of AR, but just for those who may have not, what's the elevator pitch?


02:37 TV: So AR is short for Augmented Reality, and really, the term gets applied in a lot of different ways, but the easiest way for me to think about it is just anything digital that's placed into reality. And the example I like to use, because I am an avid Pokemon Go player, is Pokemon Go: it places digital Pokemon, Pokemon that are not really there, into reality, when you view reality through the lens of your phone. So if you hold your phone up, these Pokemon appear there. So that would be one example of AR. But really, there's tons of potential examples, everything from Snapchat being used to filter and sort of augment or change reality on the fly. We'll talk about different apps as well, but just sort of your broad definition is, just anything digital in reality.


03:33 EC: So that can also include things like sound and other senses, but in general, at least in the current state of things, it often refers to something visual. So, that's kinda where we're at, in my opinion, with the technologies. It generally refers to something visual being overlaid, but you could include sound and other things in that augmented reality definition too.


04:00 TV: Yeah, I didn't even think about that.


04:03 EC: So, it's getting to be a very popular topic and I believe that this has been in the making for quite some time, and we're just starting to see the leading edge of this technology pick-up. So what is your opinion on why that's happening right now?


04:20 TV: So, I think we've long known that there are some really good and really cool use cases for AR. The one problem with the technology is that it's fairly processing intensive. If you think about what a technology that wants to create, say, augmented graphics has to do, really it not only has to take some sort of digital image and place it into the real world, but there's a lot of math and a lot of calculations that go into that. You have to be able to sense depth of everything around you, you need special sorts of cameras to be able to detect that sort of thing. There's a lot of positioning involved, especially as the lens you're viewing this thing through moves. So if you're wearing a headset and you shake your head around or you just straight up move around, or you have a phone and you move around, whatever is projecting that image has to be able to recalculate these things on the fly, and you just straight up need a lot of computing power to make that happen.


05:16 TV: And if you look back at older applications of AR, oftentimes these things were driven by big mainframes or massive computers and things like that. And it's only fairly recently with just advances in processing power, computing power, that we've been able to build AR into smaller and smaller devices and within the last handful of years it's gotten small enough that, really, we can run fairly compelling AR apps from our phones, which is also compelling just because we all have phones. So, whereas there's an entirely different conversation we could have about using AR and things like headsets or some of these other devices as well. The truth is, none of these headsets have really come down to a price point that your average consumer can just sort of pick them up pretty easily. And as such, the most common AR apps you hear out on the market today are on mobile devices, just because they have the distribution model in place. Everybody has a phone and not many people have a $3,000 HoloLens device chilling in the corner of their table ready to download the next app that comes out. So I think that's one of the big reasons, I mean, there are others as well. I think that...


06:34 EC: Yeah.


06:34 TV: Yeah. Go ahead.


06:56 EC: One of the ones that is often overlooked is the fact that AI, or machine learning, has picked up so much steam. The computing power you mentioned leads to that as well; the computing power enables the AI that's needed to do things like plane detection.


06:56 TV: Yeah. And like you said, it again comes down to computing power too, because then to drive those algorithms, you just need a lot of processing muscle to be able to make those things happen as well, especially since lots of times these are happening locally on the fly, just taking in feeds from a camera or something.


07:14 EC: Yeah. And we're covering specifically like handheld mobile devices, but it's just worth mentioning the HoloLens actually has a processor that's dedicated just to processing that three-dimensional data, and doing the detection of surfaces around you and stuff. That's one of the reasons it's so expensive but so impressive at the same time.


07:38 TV: Yeah. I think another reason that the headsets have had trouble gaining traction is because there's some UI challenges you face or... Well, it's not even UI, it's more like interaction challenges. Like if I'm wearing something on my head, how do I select something or click something or move something? And you have a whole new suite of problems that you have to solve, whereas if you're on a phone, you're sort of limited which is both a good and a bad thing. But if you are the one making the app, all you've got is a touch screen to work with, so it sort of simplifies the problem space that you're trying to solve, you just have a phone screen to work with and that's it.


08:15 EC: Yeah. At least there's something tangible you can touch, right? I mean...


08:18 TV: Yeah.


08:18 EC: You have touch screen, you can put your finger on, whereas some of the other devices that are out there, you're kinda left waving your hands in the air or using some kind of third party controller type of thing.


08:28 TV: [chuckle] I'm also waiting for... I've yet to see a headset design, and part of this is that these things, they have a certain amount of equipment you have to package into these headsets, they all look kinda horrible, they all look like you're out of a bad, [chuckle] like a really poorly done sci-fi movie from 20 or 30 years ago. The HoloLens is guilty of this, Google Glass made you look just like an absolute idiot. Really, all the other competitors. I'm waiting for someone to actually consult someone that knows something about fashion and design before they try to market these just because they... I don't know, at least in my opinion, they've all fallen short.


09:07 EC: Yeah. I have a talk that I give that revolves around this type of tech, and one of the takeaways from it is the fact that this human condition applies to everything that we do and one of those things, one of the human conditions of this is just looking like an idiot, [chuckle] wearing either these things on your head, or you're standing there waving your hands in the air like a lunatic, nobody else can see what you're looking at, so you just look like a fool just swinging your arms around. Reminds me of a Brian Regan joke where he's talking about walking into a spider web. He's like, "If you're the one walking into a spider web, you're waving your hands around, you know why you're doing it but the people looking at you think you're crazy. [chuckle] 'Cause you're just like walking into free space swinging your arms like a fool."


09:57 TV: Yeah. And really regardless, I think just at a high level, to answer your question of, "Why mobile?": if you're a developer and you're in the shower and you're struck instantly with this great app idea, if you look at your choices now, it's like, "Do I want to build for an OS like iOS or Android that reaches billions of people that can instantly have my app as soon as I put it on the stores, or do I want to go into territory like the HoloLens, where it's somewhat proprietary with a more limited user set?" The answer is, you're always gonna pick the widely distributed platform unless there's a really compelling reason, like there's really something that the HoloLens or some of these other devices does that you absolutely need for what it is you're building. And there might be, especially if you're building something pretty high end, there might be some reason you really need to be on one of these headsets, but for your average app you really don't, and you can build something pretty decent for mobile devices and get it to a whole crap ton of people.


11:00 EC: Yeah. I think that distribution platform is a big deal. If you look at building something that's gonna be a commercial app, I think you have literally no choice but to go with a mobile device. If you're looking at a massive industrial application that you know a manufacturer or something that has really deep pockets that can afford to equip their workforce with some HoloLens devices and invest in some serious development time, you might have a chance to build something there, but for the most part, for the mass audience of developers out there, I think you're gonna be limited to those mobile devices.


11:47 TV: Yep, agreed. Yeah, there are certainly use cases. We at Progress, we're building stuff for those use cases as well, because if you're a business that can invest a lot of money into hardware like that, you do potentially stand to gain. There's some compelling use cases where AR can really help you out, and if you have the budget to spend on hardware, and also development time, you could potentially save a lot of money with a really cool app, but like you said, the barrier to entry is so much higher that it's a niche.


12:21 EC: So what are some good examples then of the lower barrier to entry? What are some commercial things that are out there that we could download and try or things that are coming to the App store?


12:35 TV: Okay, so I'm gonna start with my absolute favorite AR app. And even though it's somewhat of a silly example, there's an app called Flightradar, and it's out on the iOS App Store, so it's out for iPhones. I should know this, but I'm actually not sure if it's available for Android as well. But the idea is quite simple, which a lot of these AR apps really are, pretty simple. The way Flightradar works is, there's an AR mode, and if you hear a plane flying overhead, you can point the app, point your device's camera at the plane, and it'll just show a little tool tip on the plane telling you what plane this is, what flight number, where did it come from, where is it going, which I just find... I don't know, something about it just makes it incredibly amusing to me. And I use the app all the time, partially because I live fairly close to an airport, so I see planes every now and then. I'm just deeply curious.


13:30 EC: Yeah, there's some really cool apps that overlay data like that. One of the ones I think that's coming to market soon is by Google Maps, and it's going to overlay your directional data, where if you're walking down the street trying to find, say, McDonalds or something, it'll give you the left and right turns visually overlaid on top of your camera so you know when to hang a left at the corner, that type of thing.


14:03 TV: Yeah, and especially, I've wanted this for so long 'cause this is one of the things that Google Glass did back in the day, and I think about it all the time. Especially in larger cities where you don't necessarily have the greatest GPS accuracy, or you don't know which way you're facing, that sort of thing, I could see myself for sure using that. So, I'm waiting for that to be live.


14:29 EC: Yeah. I'm waiting for the AR windshield, so we can just have that on our car. Of course, maybe they'll be self-driving by then and you won't have to worry about it.


14:36 TV: I'd like to have a little bit of both. [chuckle]


14:39 EC: Why not? Tech everywhere.


14:41 TV: Exactly.


14:43 EC: What are some super popular ones that are in the store?


14:47 TV: Yeah, I think the most common or the biggest AR app out there is Snapchat. And if you think about it, really, AR doesn't have to be things like placing Pokemon, if we take Pokemon Go as an example, or placing arrows into the real world; it can just be things like the filters Snapchat does. So really anything that changes reality in real time. So just a filter that changes your face; Snapchat has its face swap... What am I looking for? A face swap construct thing where you swap faces with another person. So those are examples of AR as well. Another thing that I found when I was researching this is that there's an absolute ton of makeup apps out on the App Store, and really they're appearing now in Google Play too, where these makeup brands just put their products out on the App Store and you just point the camera at your face and say, "I wanna see what I look like in this lipstick."


15:46 TV: And what I found is that they are quite good, surprisingly good nowadays at making these things fairly accurate and kind of fun. And I've actually been waiting for when is gonna be the... For a Skype app or a Hangouts app, or some sort of video chat app to start to integrate this sort of thing into real time. Like imagine joining a video call and having the option to clean up your face a little bit if you're a little tired or put on some lipstick, put on something to touch up your face, fix your hair a little bit, because from what I've seen on the stores, the technology is there, it's totally possible right now and it works pretty well. I think we're just scratching the surface of this sort of thing becoming more commonplace in our day-to-day lives.


16:34 EC: I won't get too political, but if you were a certain president of a country you wouldn't even have to comb over your hair.


16:43 TV: Exactly.


16:44 EC: If you were using Skype.


16:45 TV: You just need a filter. [laughter]


16:48 EC: We'll end the political discussion there. [chuckle] And we'll talk about how could we maybe build some of these things? Those are all great examples. What are some of the tools at our disposal to make something like that?


17:02 TV: So, for mobile, I mean, AR is not necessarily a new concept. These are things that you've always been able to build, but for the longest time the barrier to entry to AR was quite high, because some of the things we've talked about, just things like detecting planes and calculating coordinates and knowing the positions to place things, were things that you just had to hand-code; there was nothing built into the systems to make this possible. So while there were some apps that have been out on the stores for a very long time, one reason that we're seeing more AR apps on mobile quite recently is that both Apple and Google have been putting out APIs to make developers' lives a little bit easier when it comes to AR. So for iOS, Apple released a series of APIs they call ARKit that shipped with the latest version of iOS back last fall. And then Android released their version, [chuckle] because of course they had to call it something slightly different: Android has ARCore, which debuted around the same time as Apple's, but they didn't call theirs 1.0 until earlier this year. I think it was either February or March, but they've brought it out of beta, and they now consider it production ready. So ARKit for iOS and ARCore for Android. And really what these APIs are is just a series of building blocks to make life a little bit easier for you as a developer that wants to build AR apps.


18:38 TV: So I've spent more time with ARKit, so I know a little bit more about that, but ARCore is fairly similar, and these include things like plane detection, one of the first things you mentioned. So if I'm pointing my camera out into the world, one of the things I might wanna do for building an AR app is know where flat surfaces are, because if I know that, then I can place things on those surfaces, whether that's Pokemon, whether that's furniture. There's a popular app on iOS called IKEA Place that basically just lets you decorate your room with virtual furniture. And to do that, though, I need to know where I can place my furniture, so that if I put a couch down, the couch just doesn't fall off into infinity; it actually knows where to sit, right where my camera is looking.


19:23 EC: I'm glad you mentioned the IKEA app because that's one of the first ones that I've seen come to the market, and it's actually been in there quite a while, and because of that history it has, you can see how the technology has transformed over time. When that first came out, you used to actually have to print what equates to a giant QR code on your printer, [chuckle] and you had to take that QR code and place it where the furniture would theoretically be in the room, and now it does plane detection.


19:58 TV: Yeah, really it's funny, 'cause I use it in demos all the time, 'cause it might as well be a tutorial for ARKit. It sticks to the built-in APIs so well. So the first thing it does is plane detection [chuckle] to get around manual QR printing and such; it does that automatically.


20:15 TV: The second thing, which sounds basic but is actually kind of hard, is it lets you add 3D models into a world, into a scene. So given the IKEA Place example, once you know where a flat surface is, you probably wanna put something on it. And so IKEA will have a library of 3D models for things like their different furniture sets, different artwork, these sorts of things. And what you can then do with ARKit is, there's APIs to say, "Okay, add this model to this plane at this coordinate position," and ARKit will put it down there; there it is, you can see it. What's also hard to implement on your own is, ARKit will also remember where that model is. So given the IKEA Place example, you can put a couch, say, in one room of your house, go walk around to another room, maybe go down the street for a little bit, come back, and lo and behold, that couch will still be there, because ARKit not only knows that plane is there, it knows where the model is, and it'll also keep track of things as you move around and do other things as well, which again is not the easiest thing to hand-code. So you're getting all this behavior now with pretty simple APIs instead of having to figure all of this out yourself.
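To make that concrete, here's a tiny self-contained sketch in plain TypeScript — not real ARKit code (ARKit expresses this with types like ARAnchor and ARPlaneAnchor; every name below is made up for illustration) — of the bookkeeping the framework handles for you: content gets anchored to a world coordinate on a detected plane, and it stays put no matter where the camera goes.

```typescript
// Toy model of world-anchored AR content. Hypothetical names throughout.
type Vec3 = { x: number; y: number; z: number };

class ToyARScene {
  // Detected horizontal planes, keyed by id, each at a given height (y).
  private planes = new Map<string, { y: number }>();
  // Models anchored into the world; camera movement never touches these.
  private anchors = new Map<string, { model: string; position: Vec3 }>();

  detectPlane(id: string, y: number): void {
    this.planes.set(id, { y });
  }

  // Place a model on a plane: it adopts the plane's height so it
  // "sits" on the surface instead of falling off into infinity.
  placeModel(id: string, model: string, planeId: string, x: number, z: number): boolean {
    const plane = this.planes.get(planeId);
    if (!plane) return false; // can't place without a detected surface
    this.anchors.set(id, { model, position: { x, y: plane.y, z } });
    return true;
  }

  // Walk to another room and back: the anchor is still where you left it,
  // because it's stored in world coordinates, not camera coordinates.
  modelPosition(id: string): Vec3 | undefined {
    return this.anchors.get(id)?.position;
  }
}

const scene = new ToyARScene();
scene.detectPlane("floor", 0);
scene.placeModel("couch", "sofa-model", "floor", 0.5, -1.2);
console.log(scene.modelPosition("couch")); // { x: 0.5, y: 0, z: -1.2 }
```

The real frameworks do vastly more work (tracking the camera pose and re-projecting anchors every frame), but the contract is the same: you give them a plane and a coordinate, and they remember it.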


21:35 EC: That's a lot of spatial data to crunch through to try to remember where things are placed. When some of these things first came out, they just kind of floated in thin air in front of you, and wherever you turned they'd float along with you.


21:48 TV: Yeah. There's also a full physics engine built into this thing as well. So for example, if I take a couch and... [chuckle] Although IKEA Place doesn't exactly work this way, I could just take another couch and drop it on top of the existing one, [chuckle] and it knows, 'cause it knows the boundaries of the couch. You can assign different 3D models masses. So you can say this couch weighs 100 pounds and this couch weighs 200 pounds, and then it would know, when one couch hit the other at a certain angle, how it's supposed to handle that collision. Probably the most common use case there is some sort of game. I've heard of people building sort of a real-world Minecraft type thing where you can have shapes of different sizes, of different densities, and you could start placing those around.


22:42 TV: There's companies too working on shared AR experiences. So you can imagine you're playing Minecraft in your driveway, but you're playing with three of your friends and you all have your phones out and you can interact with the models that your friends are dropping as well, and you're looking around sharing the same experience, but all viewing it through the lens of your phone.


23:05 EC: Yeah. That could be interesting for people that work in the car business, selling cars. You could see that car in your driveway, show it to your friends, [chuckle] that type of stuff.


23:19 TV: Yeah. Exactly.


23:20 EC: Opens up a lot of cool opportunities.


23:21 TV: Yeah. You could imagine going to a dealer, and the dealer is having you download something to your phone, and you could go home and see if the car fits in your garage well, how much space you have, like you said, take pictures of it with your friends. The interesting thing, as a Pokemon Go player, it's been funny: the thing people actually do most with the AR mode is just take pictures. So they'll take a Pokemon, place it into the real world, and just try to take pictures of it. And so even there's silly interactions like that, but you could see how it's pragmatic, like you said with the car example. It would be kinda cool, before you drop a bunch of money on a car, to just digitally take it home with you and put it around and try it out, see where it fits and things like that.


24:11 EC: So, we're talking mostly about the ARKit so far as... And some general application usage for that. What if we wanted to build cross-platform? Are there solutions for targeting a wider array of devices and operating systems?


24:33 TV: So like a lot of mobile APIs, ARKit and ARCore are fairly similar. They have their differences, but at the end of the day, both of them let you detect planes, both of them let you add 3D models, both have some sort of collision detection built in. So for lots of developers, if you're building something really advanced, you might want to leverage all of those things. But if you're building something basic, again, like a lot of mobile development, it seems silly to build the same basic sort of experience twice. And so, a lot of popular cross-platform frameworks have developed plug-ins for working with AR, just to sort of lower the learning curve, because ARKit and ARCore are powerful, but they're also not the world's easiest APIs to work with, and there's, again, two sets of them. So you're working with them for iOS and then you have to turn right around and build the exact same thing for Android as well.


25:29 TV: So I'd say the most popular one that I see out there is Unity; Unity has a really good plug-in for working with AR. That's the one Pokemon Go uses. There's some other apps that use it as well. I know Xamarin has one as well. React Native has a plug-in for working with these APIs. But the one I'm most familiar with, for obvious reasons, is the AR plug-in we have for NativeScript. So with our plug-in, it's the same basic concept. Basically with NativeScript, you're writing iOS and Android apps from one code base anyways, so something like AR is a logical extension of that, because you could then drop in some AR capabilities and, again, keep working within one language and one code base for building these sorts of apps.


26:16 EC: And NativeScript, if you haven't heard of NativeScript before, it's XML, JavaScript and CSS that gets compiled into a native iOS or Android experience.


26:31 TV: Yeah, and not to get... [chuckle] We could drop into the code here, which is great for an audio podcast, but at the extreme high level, here's what it would look like if you wanted to add AR capabilities to your NativeScript apps. Like you said, with NativeScript you're defining your UI in markup anyways. So with our AR plug-in we have a markup tag that's literally called AR. So you drop an angle-brackets AR tag in, and that defines the area of the screen where you'd want to show the user's camera, because the camera's how you view AR experiences on mobile. And then, once you have that sort of AR region defined, you can either add things to it, like you can overlay text or different UI elements, just because you might not want the AR experience to take up the entire screen. If you're thinking of a Minecraft type example, you might want certain blocks that you can select or drag in, or some other ways that you might wanna navigate to other parts of your app.


27:33 TV: So you have this AR experience, you've defined it as an area of the screen, and then we also provide a series of APIs that you can use to deal with that. So for example, in the code that sort of backs this UI, you could say things like, "Detect planes" or "Add 3D model," or we have add box, add text, add sphere, some of the basic APIs you'd expect to exist. But the benefit we're really providing for you with NativeScript is that you're only having to write this code once. So under the hood, we've implemented this using ARKit and ARCore, but that's sort of transparent to you. And we've also provided simpler APIs than dealing with the raw abstractions that iOS and Android have. So hopefully we make your lives a little bit easier.
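As a rough sketch of what that markup looks like, here's a hypothetical NativeScript page using an AR plug-in. The tag, attribute, and event names below follow the general shape of the NativeScript AR plug-in's documented surface, but they're illustrative assumptions and may differ between plug-in versions, so check the plug-in's own docs rather than treating this as authoritative:

```xml
<!-- Hypothetical NativeScript page: the <AR> tag claims a region of the
     screen for the camera feed and the AR scene. -->
<Page xmlns="http://schemas.nativescript.org/tns.xsd"
      xmlns:AR="nativescript-ar">
  <GridLayout rows="*, auto">
    <!-- arLoaded and planeDetected are assumed event names; in the
         code-behind handlers you'd call APIs along the lines of
         addModel, addBox, addSphere, or addText to drop content
         onto a detected plane. -->
    <AR:AR row="0"
           arLoaded="{{ onARLoaded }}"
           planeDetected="{{ onPlaneDetected }}" />
    <!-- Regular NativeScript UI can sit alongside or overlay the AR
         region, e.g. buttons to pick which model to place. -->
    <Button row="1" text="Place couch" tap="{{ placeCouch }}" />
  </GridLayout>
</Page>
```

The point of the sketch is the shape of the thing: the AR view is just another UI element in your one cross-platform markup file, and the imperative APIs hang off the event payloads it hands you.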


28:21 TV: And the last thing I should mention too is that, with NativeScript we also try to let developers always have access to the metal, have access to the raw APIs. So if you're coding mostly in a cross-platform way but you see some internet article and it says, "Oh ARKit actually lets you do this advanced thing" and you want to leverage that as part of your applications, you always have the ability to sort of eject out of the cross-platform friendly way and just write your native code as you need to when that situation comes up.


28:54 EC: And it's also worth mentioning that NativeScript is free and open source, and so is the plug-in. So those things aren't gonna cost you any money.


29:03 TV: Nope. And pull requests and new features are welcome. So, if you dive in and you wanna add some new stuff to help make NativeScript developers' lives easier, you are more than welcome to contribute.


29:14 EC: TJ works for coffee and Pokemon. [chuckle]


29:18 TV: Exactly.


29:19 EC: That's how we pay him.


29:20 TV: In that order. [chuckle]


29:24 EC: So other than NativeScript, are there any other cross-platform tools out there that we can use?


29:31 TV: Yeah. Like I said, Unity is a good one, Xamarin, React Native. I imagine that a cross-platform framework of any size probably has some offering. I haven't extensively tested them, so I can't tell you how good they necessarily are. I think there's almost a separate conversation we could have about the fact that iOS and Android basically steal each other's good ideas so often that I like to think cross-platform frameworks are becoming more and more feasible for more and more applications, just because it's crazy how often this happens, where one platform comes out with some series of APIs and the other platform thinks it's a good idea and just implements it, but slightly differently. Just enough to make your life painful if you were working purely native, because you would have to write this twice, but from a cross-platform perspective, it's like, "Hey, I just need a new implementation of these APIs, and our developers can just get on their way."


30:29 EC: They make them different just enough to get past the patent lawyers, that's what they do. [laughter]


30:36 TV: Yeah. And I think too, it's also pretty even, like I don't think Apple... They sort of steal equally from each other, they've both been guilty of stealing each other's good and bad ideas over the years.


30:49 EC: So I like to talk about where I see the future of this stuff going, and the analogy that I like to give is: we started with big data, collecting these massive amounts of data, and then we have AI to rationalize that big massive amount of data that no human could possibly comb through in their lifetime. And now we have AR, this way of visualizing all of that massive amount of data. So if you think about all the data that's spatially relevant, and now you have a lens to look at that data through, I think we're gonna see some really cool stuff happening in the future.


31:35 TV: Yeah, and I think too, as headsets continue to evolve... Because I think most people would agree that the ideal destination for AR is some sort of lens, whether that's a contact lens, or some sort of glasses or headset or something like that, just because there is still a little bit of awkwardness associated with using AR apps on your phone, because you always need to look through that lens in order to see the experience, and that [chuckle] can range from slightly awkward to extremely awkward. So think of your Google Maps directions example. It's cool that I could look at my phone and it would show me an arrow of where I needed to turn, but how much better would it be if I didn't have to take my phone out of my pocket and the arrow was just there in my peripheral vision for me to see? So I think that's where things are going as well, but I think it's gonna be some time before the companies that are producing these things can bring costs down and solve some of the harder technical issues like, "If I'm gonna mount this thing to my face, where is the battery gonna go?" and all those sorts of things. But I think we'll be having a different conversation about these things in really even the near future, five, ten years from now.


33:00 EC: I think in the next few years, we'll see kind of... I don't know if it'll be through channels or apps, or how this will come about, but I think Pokemon Go is a decent example of how things might evolve: you have multiple apps that you open, and each one of those apps has spatial data based on where you are, where people have interacted with things and left digital objects behind, and there are things you're gonna interact with that you can't see without some sort of device-and-app combination.


33:38 TV: Yeah, there's somebody working on an app where basically you can create AR art somewhere in the world and then just leave it there for other people to discover and play with and alter, that sort of thing. And I think there's enormous potential there for things like games. Maybe I wanna have an AR treasure hunt around a city; we've seen people building apps like that too, where I find something, I leave something, maybe I hide something for other people. There are business cases as well. If I'm a museum or some sort of tour organizer, think of how much more compelling you could make a self-guided tour through AR, especially if you bring sound into the equation as well. You could build something really cool, and not have to strain to hear a tour guide shouting. You could just have it all right there with you.


34:34 EC: Yeah, I would like to see some historical tours where you walk to some historical site and see it in its heyday through the augmented lens. That would be really cool.


34:48 TV: Yeah, you could turn it on and off. We did a tour of Boston maybe a year or so ago, and it's becoming more common for these companies to put out audio tours, self-guided tours of areas. But those are always a little bit awkward, because they always have to have steps like, "Okay, you need to find this area, look for this." With AR, the app could just say, "Oh, I know you're here." Or it could even know where you're looking: "You're looking at this. Oh, you might wanna know that this is such and such."


35:27 EC: Yeah, there's definitely some really cool stuff in our future. So if you're listening and you wanna build some of this futuristic stuff, we have some resources for you. I know TJ you've written a couple of articles on how to get started. We can put those in the show notes. What are those articles about? We'll give everybody a kind of heads-up on what they are.


35:46 TV: Yes. Really, if you just Google "NativeScript augmented reality" (we'll include links, but you should be able to find them with some simple searches). And if you are the sort of person that likes dealing with native iOS and native Android raw, if you just Google "ARKit documentation" or "ARCore documentation," you should pretty easily be able to find what you need to get things going. I will give people a little bit of a warning, 'cause this is something that I hit too. AR is a lot of fun to work with at the basics. I think regardless of what platform you use, it's not too hard to get the simple things working, like, "Oh, I detected a plane, I added a box." Those things are quite easy, but there's a huge gap between being able to do that and, say, building the app that you might have in your head, an incredibly compelling AR app. So [chuckle] I know this is something that I struggled with as I was preparing for demos and such. I got the basics running, it was really cool. AR is a really awesome technology to experiment with, and I really encourage everyone listening here to try it.
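The "detect a plane, add a box" basics TJ describes come down to a hit test: cast a ray from the screen tap out into the world and intersect it with a detected plane. ARKit and ARCore handle this for you, but the underlying math is simple enough to sketch in plain JavaScript; the names below are illustrative, not any framework's actual API:

```javascript
// Ray-plane intersection: the core idea behind an AR "hit test".
// ray:   { origin: {x,y,z}, dir: {x,y,z} }      (dir need not be normalized)
// plane: { point: {x,y,z}, normal: {x,y,z} }    (e.g. a detected floor plane)
// Returns the 3D point where the ray meets the plane, or null on a miss.
function hitTest(ray, plane) {
  const dot = (a, b) => a.x * b.x + a.y * b.y + a.z * b.z;
  const denom = dot(ray.dir, plane.normal);
  if (Math.abs(denom) < 1e-8) return null; // ray is parallel to the plane
  const diff = {
    x: plane.point.x - ray.origin.x,
    y: plane.point.y - ray.origin.y,
    z: plane.point.z - ray.origin.z,
  };
  const t = dot(diff, plane.normal) / denom;
  if (t < 0) return null; // the plane is behind the camera
  return {
    x: ray.origin.x + t * ray.dir.x,
    y: ray.origin.y + t * ray.dir.y,
    z: ray.origin.z + t * ray.dir.z,
  };
}
```

In a real app, the framework derives the ray from the camera pose and the tap point, and supplies the plane from its plane detection; the returned point is where you would anchor your box.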


36:54 TV: But [chuckle] then I wanted to take this really simple thing I had and turn it into sort of a real app, and I kept hitting my head against the wall dealing with some of these more advanced things. 'Cause even though ARKit and ARCore sort of lower the barrier to AR, if you wanna build something really advanced, that learning gap is still there. And I think that's one of the reasons why we're not seeing quite as many really powerful AR apps on the stores quite yet, but...


37:27 EC: Math is hard. [chuckle]


37:28 TV: It is. But even things like... I had this app idea where I would be able to label things around the house, and just had this great idea, but then once I got into the positioning logic it was like, "Oh that's right, I have to concern myself with the angle I put the text at and the exact coordinate locations where I detect these things." And I'm learning why lots of times these AR apps have entire companies behind them. Snapchat, for instance, has bought a ton of companies that have specialized for years in building some of these things to make it possible, and I'm starting to see why. It's a fun space, the potential is enormous, but the learning curve is there too. So you have to be willing to put in the time if you're looking to do something really advanced.
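The "angle I put the text at" problem is typically solved by billboarding: rotating a label around the vertical axis so its face points back at the camera. A minimal sketch of just that piece of math in plain JavaScript, assuming a y-up coordinate system where an unrotated label faces the +z direction (the function name is made up for illustration):

```javascript
// Yaw (rotation about the vertical y-axis, in radians) that turns a label
// so its face points toward the camera. Positions are {x, y, z} objects.
// Height differences are ignored: we only rotate about y, so the label
// stays upright rather than tilting toward the camera.
function billboardYaw(camera, label) {
  const dx = camera.x - label.x;
  const dz = camera.z - label.z;
  return Math.atan2(dx, dz);
}
```

On iOS, SceneKit ships this behavior as SCNBillboardConstraint, which is one example of frameworks papering over exactly this kind of positioning math for you.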


38:15 EC: Yeah, I'm thinking people with a background in game engine design, or even just game engine usage, would excel at something like this.


38:24 TV: Yeah, 'cause if you're like me, and you only know the basics of things like that, you'll be able to have a little bit of fun. I mean, I certainly don't regret the time I've spent, because I think it's a really interesting set of APIs to play with, but you're not gonna see me striking it rich in the next few months. I'm not quitting and retiring to a Caribbean island 'cause of the next great AR app that I build from the ground up.


38:48 EC: So, I'm gonna coin this right now, that was TJ's augmented reality check.


38:53 TV: Yes, exactly [chuckle]


38:56 EC: [laughter] Manage your expectations, but it's cool stuff to play with, so get out there and have some fun.


38:58 TV: Yes. Play, but play with realistic expectations. [laughter]


39:05 EC: Don't quit your day job [chuckle] and seek out decent capital.


39:09 TV: Yes, this has been an edition of TJ's financial tips. [chuckle]


39:15 EC: Don't storm into your boss's office tomorrow, [chuckle] quit, and cut the microphone.


39:22 EC: So we'll put those resources up on our website under the Developer Central topic, and we'll also link to that from SoundCloud. You can find us at SoundCloud/Eat Sleep Code Podcast, so we'll have the resources up for you there. TJ, any last shout-outs? Your Twitter account, anywhere we can find you, that sort of thing.


39:51 TV: Twitter is probably the best place. I'm tjvantoll pretty much everywhere on the internet, so on Twitter that's @tjvantoll: T-J V-A-N-T-O-L-L. That should be about it.


About the Author

Ed Charbeneau

Ed Charbeneau is a web enthusiast, speaker, writer, design admirer, and Developer Advocate for Telerik. He has designed and developed web based applications for business, manufacturing, systems integration as well as customer facing websites. Ed enjoys geeking out to cool new tech, brainstorming about future technology, and admiring great design. Ed's latest projects can be found on GitHub.
