[Applause]
>> I fully understand that this is a weird
question to ask in a city known for bicycles,
but show of hands, who here drives a car?
Oh, wow, that's actually a lot of you. That
was, like, 50% of the crowd just said that.
Second question: Who, here, has ever heard
of Apple CarPlay? Almost everyone. So, for
the ones who don't know what this is, CarPlay
is a feature that lets you use your iPhone
on your car's console and then use all of
the iPhone's capabilities totally hands free
while you're driving. And so, according to
Apple's marketing site, CarPlay is the ultimate
copilot. It's the smarter, safer way to use
your iPhone in the car. You can get directions.
You can make calls, you can send and receive
messages and you can listen to music all in
a way that allows you to stay focused on the
road.
So, my boyfriend recently just bought a car
and he chose his particular model because
it came with CarPlay. And he was super excited
about it so we decided to go on a little road
trip down the coast of California to test
it out. And I ended up documenting some of
the more interesting moments of using it.
So, I made a little movie for us. It's called "Here's what we expected and here's what we got."
[Laughter]
Video:
>> Using voice commands, Siri's ready to help
you at any time, without being distracted
or losing sight of what's important in your
car.
>> Zoom the map out.
>> Ask Siri to play your favorite song. Play
Little Ones, Catch the Movement.
>> Catch the movement, catch the movement.
>> Play Sufjan Stevens, The Black Hawk War.
>> The Black Hawk War; or, How to Demolish
an Entire Civilization and Still Feel Good
About Yourself in the Morning; or, We Apologize
For the Inconvenience, But You're Going to
Have to Leave Now; or, I Have Fought the Big
Knives and Will Continue to Fight the Sufjan
Stevens, now playing.
>> Send and receive messages.
>> You have 19 messages sent to the group,
Disney Parks [Indiscernible]. Today, Elaine
said to the group: "Attachment, one image."
Today, Louie said to the group: "Disliked
an image." Today, Louie said to the group:
"Laughed at a movie." Today, Marco said to
the group: "Https:// [Indiscernible]." Would
you like to reply to the group?
[End of video]
[Laughter]
>> He spent a lot of money on that. Oh, that's real. Yeah. It's totally real. So, it's pretty clear what went wrong here, right? It's very obvious that no one, or at least it seemed like no one, at Apple had stress-tested CarPlay's designs in real-world conditions, because if they had, they would have realized that when you go on a road trip, for the most part, your cell service is totally gone. Nor do most playlists consist of perfectly short song titles.
And, the reality of driving is that when I'm
a driver, I actually really need to see the
map a lot more than I need to see a giant
animation of Siri thinking. So, this kind
of train of thought has been talked about
a lot today and I really want to amplify this
point. Which is that creating things that
stand up to reality is really freaking hard.
Because it's not just about using real data in your designs. Our responsibility, as product designers, or just designers in general, is to make sure that our products work for as many people as possible, in all different scenarios, in all different environments.
And, honestly, that is almost an impossible
task because we can't predict everything that
will go right or wrong. And, a lot of us don't
get the pleasure of having giant user research
groups or AB testers or data analysts to help
us out with this. Although I did hear that
there were a bunch of Facebook designers here,
so maybe you're not one of these.
But for the majority of us that wasn't meant to trash Facebook. I like you guys. You guys are great.
[Laughter]
But for the rest of us, you know, all we can
really do is just to make sure that in our
own process, we are taking the right approach
to vet our work against the stresses of real
life. And, I wanted to kind of share some
of the approaches that I've come across and
the teams that I've worked on. So, I'm currently
a design manager at a company called Lyft.
For people who are not from the United States
here, we are a transportation and ride sharing
company. The team that I manage is called
core design and we take care of Lyft's design
system. And so, we have a pretty high standard
of quality for what we make. Everything we
make needs to be designed with the utmost
usability, covering all edge cases and reaching as universal an audience as possible.
And what enables us to do this is a realistic
understanding of how things like ergonomics,
focus and attention, accessibility and safety
impact both our passengers and our drivers.
So, like, May Li said, my previous job was
a designer on Apple's Prototyping team. And
our job there was to figure out how people
will use and react to new technologies in
their lives. And I think, like, one of the
biggest takeaways from working on that team,
and also from working at Lyft, was that prototypes
and when I say prototypes, I mean the really
crappy, ugly, quick prototypes are one of the single best methods for understanding how our designs are going to fare in the real
world. So, hopefully prototyping isn't news
to any of you because we are at a Framer conference
so I'm not going to go ahead and explain what
it is. But I did want to talk about fidelity
and about intention.
So, this is my kind of diagram of all of the
different extremes that prototypes can be.
You can have low fi, you can have high fi.
You can have prototypes that are really aware
of your contexts and prototypes that have
no idea what's going on. And, I think, like,
the point I'm trying to make
[Laughter]
is that we don't want to be here.
[Laughter]
[Applause]
We want to be here. And, like, one of the things that I always try to communicate is that you can keep increasing and increasing the fidelity of a prototype but never learn anything more about the design problem you're solving, and that's where this sort of danger zone happens.
So, you can imagine maybe you're, like, jamming
on this awesome, new digital watch interface
on your computer, but then if it never makes
it on to the wrist for you to actually try
out, then you're not learning anything.
So, where we want to be is here: making super low-fi prototypes that are put into the context in which they'll be used, and then iterating and iterating and iterating on them so you can make higher-fidelity prototypes, and then eventually those become your demos, the things you present to executives or whatever, to get buy-in or funding for your idea.
And so, here, I'm going to show some strategies
we've been using at Lyft to add context into
our early designs that are definitely on the
lower fidelity side. The first strategy is
sort of just, take your designs off the computer.
So, an example of this is when we were trying
to understand passenger and driver ergonomics
when they're viewing our app, one of our designers,
Lindsay, she quickly mocked up this prototype with selfie sticks she bought off of Amazon to simulate the typical viewing distances for drivers and for passengers.
And this was really freaking awesome because
we were able to kind of carry these around
in the design studio and test them out and
we would put different components that we were making for the design system on them, to test out, like: oh, is it tappable in this area? Can I view it? Is the font that I've chosen legible?
And so, here is a non-interactive prototype. What we were trying to settle was this big debate about whether all of our buttons should be pink, because they're Lyft, or purple, because they're sort of Lyft. And so, to answer
that question, what we did is we brought all
of the screens on to the devices that we would
actually be using. So, here they were on Android
devices, and I don't know if you can tell nope, not with that color calibration. But the amount of pink happening on that Android model, in particular, was incredibly shocking, and we didn't want drivers to really have to deal with that on a day-to-day basis, so we ended up going with purple.
And then in our very recent redesign of our
passenger app, we prototyped on devices so
we could test out the reachability of important
elements. And something that we decided on,
based off of this, was to kind of bias all
of the UI to the bottom of the screen. And
so, the reason we did that is so that if someone is, say, like, just walking down the street, they're in a lot of busy traffic, they're carrying, like, a bunch of bags, they're able to request a ride from Lyft with just one hand, without having to move their thumb up and down the phone.
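A crude way to sanity-check a bottom-biased layout like that is to flag any tap target whose center falls outside a thumb radius swept from the bottom corner of the screen. The screen size and reach radius below are rough illustrative assumptions in iOS points, not Lyft's measured values:

```python
import math

# Illustrative numbers in iOS points, not measured values.
SCREEN_W, SCREEN_H = 375, 812   # an iPhone-class screen
THUMB_REACH = 560               # assumed comfortable one-handed reach

def one_hand_reachable(x: float, y: float) -> bool:
    """True if (x, y) is within thumb reach of the bottom-right corner."""
    return math.hypot(x - SCREEN_W, y - SCREEN_H) <= THUMB_REACH

# A request button near the bottom of the screen passes this check;
# a control parked in the top corner forces a grip shift.
```

With numbers like these, a button centered near the bottom of the screen comes back reachable while anything in the top corners does not, which is the tradeoff the bottom-biased redesign is making.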
And, one last example here: in the earlier stages of self-driving cars, when we didn't actually have the car, we made this scrappy prototype out of, like, foam core and some office chairs, and we used it so you could really kind of immerse yourself in what the experience would be like if you're surrounded by, like, three of these screens. What is kind of, like, my, you know, my vision arc here? How tappable are the elements on all of these displays?
And then, another tactic that we used is to
act it out. So, a lot of the interactions
we have with our app happen outside in the
real world, so we tend to leave the office
to kind of grab insights.
This one was a really awesome example where, in order to mimic the situation of a passenger just trying to find their driver, we just had two people: one acted like the car, and the other one just had a clipboard that was their app, or their phone, and this way they were just trying to find each other at a busy intersection. And it was a really great way to build empathy about, you know, what the context of being on a busy, heavily trafficked street would be like, while trying to have, like, a goal in mind.
And so, the thing I really loved about this is that it ended up informing how we designed and prioritized the content that we would put in our app, without actually having to jump onto any type of computer. And we've also done this with self-driving, as well. So, we invited a bunch of
people to take a fake autonomous vehicle and
we did that so we can find out what their
reaction would be to certain information on
the screen versus what they were seeing out
in the real world. And, the prototype that we made, it was literally an iPad that we attached to the headrest with a bunch of saran wrap, so it took, like, two seconds to make.
And then the last thing we tried to do is use realistic data. We don't always have the real data to use, so we try our best to simulate conditions. For our self-driving explorations, for example, we didn't actually have the real sensor data for what the car would see, so what we did instead (that's very loud) is we ended up driving a car around San Francisco and getting, like, recordings of all the different objects that a car could sense in the environment, and this was just to immerse ourselves in: what are all the objects we have to pay attention to? Like bikers, pedestrians, stop signs, or other cars?
And this footage ended up informing a ton
of rounds of designs that we were making to
figure out how best to prioritize and visualize
objects around the car. As you can see, even in some of the basic, like, Unity prototypes, we didn't try to do anything special. We just used very basic rectangles to box out what objects were where.
And from that, we kind of, like, iterated
further and further and further into a higher
fidelity and then came up with this as one
of our final-ish prototypes that we used to
demo our confidence in this idea.
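Those rectangle prototypes boil down to a prioritization question: of everything the car can sense, what do you draw first and biggest? Here's a hypothetical sketch of that kind of scoring; the object classes, risk weights, and scoring rule are illustrative assumptions, not Lyft's actual algorithm:

```python
import math
from dataclasses import dataclass

@dataclass
class SensedObject:
    kind: str       # e.g. "pedestrian", "cyclist", "car", "stop_sign"
    x_m: float      # position relative to the car, in meters
    y_m: float

# Illustrative risk weights -- not Lyft's real values.
RISK_WEIGHT = {"pedestrian": 3.0, "cyclist": 2.5, "car": 1.5, "stop_sign": 1.0}

def priority(obj: SensedObject) -> float:
    """Higher score = render sooner and bigger. Nearer and riskier wins."""
    distance = math.hypot(obj.x_m, obj.y_m)
    return RISK_WEIGHT.get(obj.kind, 1.0) / max(distance, 1.0)

def render_order(objects: list[SensedObject]) -> list[SensedObject]:
    """Sort sensed objects by how prominently they should be drawn."""
    return sorted(objects, key=priority, reverse=True)
```

Even a toy rule like this surfaces the right questions: at equal distance a pedestrian outranks a car, and a nearby object outranks a distant one, which is exactly what those early rectangle passes were trying to get a feel for.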
So, those were some examples of how we use
contextual prototypes to remove the tunnel
vision that can happen when you're just on
your computer.
Where we jump out of low fidelity and really
get into the code is when we're trying to
make design tools that help us bring even
more context into our designs.
So, Kevin and I don't know where he is. I
know he's not going to raise his hand. So,
Kevin is here somewhere, in this stage, and
he's on our team. He made a set of tools for
us that let us programmatically generate colors
that work with both our brand and our accessible
color contrast values on the web and so I
thought this was such a fantastic prototype
tool, because you get to play with generating, like, multitudes of colors and then being able to see if they're accessible on white and if they're accessible on black. And we are actually thinking about open-sourcing this, probably next week. Yeah.
[Applause]
That's claps for Kevin.
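The generator itself isn't public yet, but the accessibility half of a tool like that rests on the standard WCAG 2.x contrast math, which is short enough to sketch:

```python
def relative_luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB hex color, per WCAG 2.x."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearize(c: float) -> float:
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearize(r), linearize(g), linearize(b)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio, from 1:1 up to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)


def accessible_on(color: str, threshold: float = 4.5) -> dict:
    """Does this color pass AA contrast for normal text on white and on black?"""
    return {
        "on_white": contrast_ratio(color, "#ffffff") >= threshold,
        "on_black": contrast_ratio(color, "#000000") >= threshold,
    }
```

The 4.5:1 threshold is the WCAG AA requirement for normal-size text. For example, a deep purple like `#352384` (a made-up hex, not Lyft's actual brand value) passes on white but fails on black, which is exactly the kind of tradeoff the tool lets you see across many candidate colors at once.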
So, he also made us a tool to figure out the
optimal font size and legibility of any given
typeface that you type into it. So,
basically, you type in Helvetica, it gives
you what the font size should be for our passengers
and what the font size should be for our drivers
and this was so incredibly useful because
we were, at the time, searching for a secondary
typeface to use in our app and this sort of
validated, like, Proxima Nova, which is what
we ended up choosing.
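The tool's internal model isn't public, but a common basis for this kind of calculation is visual angle from legibility research: a character should subtend at least some minimum angle at the eye (guidance in the ISO 9241 tradition is often quoted around 20 arcminutes). The viewing distances and threshold below are illustrative assumptions, not Lyft's values:

```python
import math

def min_character_height_mm(viewing_distance_mm: float,
                            min_angle_deg: float = 20 / 60) -> float:
    """Smallest character height (mm) that subtends min_angle_deg at the eye."""
    return 2 * viewing_distance_mm * math.tan(math.radians(min_angle_deg) / 2)

def mm_to_points(mm: float) -> float:
    # 1 typographic point = 0.3528 mm; mapping a physical size onto an
    # on-screen point size would additionally depend on the device's density.
    return mm / 0.3528

# Illustrative distances: a passenger holding a phone vs. a driver
# glancing at a dash-mounted one.
passenger_pt = mm_to_points(min_character_height_mm(350))  # ~35 cm away
driver_pt = mm_to_points(min_character_height_mm(700))     # ~70 cm away
```

Because the minimum height scales linearly with distance, a driver viewing from twice as far away needs text roughly twice as large, which is why a tool like this spits out separate sizes for passengers and drivers.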
And then something our engineering team created
for us was this prototyping tool that allows
us to tactilely interact with and design screens using our design system components right on an iPad. And so, that's kind of, like, really awesome because it's letting us get a more realistic feel for our UI, and you're able to see things like what the loading state is and how the keyboard interacts. So, it's pretty rad. Also, Framer X did not exist at that point, so this is why we had to do that.
And so this is sort of where, like, when we're
talking about the future of design and the
future of design tools, this is what gets
me absolutely excited because this is what
I want. I want tools that are bringing the
designs we make, the problems we are facing,
closer and closer to the actual reality that
they're going to live in.
And, before I go, I just have one, little,
kind of PSA to share: Please share your ugly
first iteration prototypes and please share
your tools. We make it really hard on ourselves
by only showing kind of, like, the best looking,
most time consuming work that we've produced
and it's really the stuff that we learn from
other other experiments, other, you know,
low fidelity prototypes, that can actually
benefit other people to hear from.
So, thank you.
[Applause]
