[00:00:05.06] [Introductory remarks, partially inaudible.] Kathy, I think you're probably the youngest distinguished alum we've had, so congratulations for that.

I remember the first time: I was the advisor for her Imagine Cup project, but even before that, she contacted me while she was a student at Georgia Tech about a software class we weren't going to offer. She said, "I can make it happen, I don't know how." We talked it through, and she actually organized a group of French students to do projects in the class. It was pretty amazing. I think that story says a lot about Kathy and her determination to just make things happen, even when she's told something isn't possible.

Her list of accomplishments is really long, so I'll just share a few highlights. She's one of the founders of the United States Digital Service; she was a Mozilla Fellow; she's at the Harvard Kennedy School, where she teaches product management and society; and she's also with the Berkman Klein Center's Ethical Tech Working Group. And she's a Georgia Tech graduate, a great example of what we're now calling public interest technology, which connects to what she's talking about today. OK, let's welcome Kathy back.

It feels so good to be back. Thank you so much for having me; this is wild. So, in the spirit of Thanksgiving [inaudible], if there's a word that sums up my time here at Georgia Tech, it's a deep sense of gratitude. I'm going to tie that back to the work in responsible computing and public interest technology and what that means, but I really do have this deep sense of gratitude for what Georgia Tech gave me, one part being an in-state student in Georgia.

How many folks in here are undergrad computer science students? Co-ops? That many. How many are graduate students? How many are CS students? So the others are HCI folks? OK, that's helpful. Also: how many of you have worked at a big tech company, however you would define "big tech"? How many of you have worked in government?
[00:03:05.11] [Audience: "Does a university count as government?"] You mean, does a university count as government? Is that a joke because of the bureaucracy, or because it's a research institution? OK: how many of you have worked in a federal or state government that is not an academic institution? Where did you all work? Which branch? Thank you for your service. My brother was ROTC here at Tech, and he's now an active-duty Marine. Where else did you work?

Actually, since I have you answering questions: what do you think about tech and ethics? I'll give you folks a few moments to think about this. Tech and ethics: what pops into your mind first about what that even means? I would love for a few people to just say it out loud. Yes?

[Audience: "Data privacy." "Surveillance capitalism." "Data collection."]

Privacy, data collection, surveillance. Equity, thank you for that; equity and justice. Fairness. Combating disparities.

What about public interest: what does that mean? There's no uniform definition for any of us, so anything you say is probably something someone else has also thought of. What do you think of when you think of public interest, or public interest tech? I'll just stare at this side of the room until someone raises a hand.

[Audience: "Service." "Service to the public."]

Service to the public. And utilities: there's actually a lot of conversation around utilities and technology and equity and justice as well. Anyone over here? OK.

So, I have a lot of gratitude for this place. Not only did it leave me with this amazing computer science degree, which on its own is internationally recognized and deemed one of the best programs in the world, but while I was here I pretty much stumbled into research as a freshman undergrad. I was in this building and met Lena Brashear, who let me do research with her as a 17-year-old undergrad.
[00:06:07.06] That opened the door to so many other things. There were countless study abroad programs; if you wanted to work across departments and figure out, say, biomedical engineering plus computer science or anything else, there were people here who made it possible to do any of that work.

And, as I was telling a few folks, I don't know if it was the public nature of Georgia Tech, but I always felt there was a sense of community here, of working with the community while you're a student, and it took going away and seeing what other institutions were like, especially some of the private institutions, to really recognize how special it was to be at Georgia Tech: to not only study computer science but have fingers in all these other fields and areas, with this deep sense of community around it.

Then there's what was also normal to me at Georgia Tech, which I didn't realize while I was here: the people around me doing deep research. Amy, I think at one point I ran into you at some IBM thing in the Bay Area, and I think you were pregnant. That mattered, because I was surrounded by these brilliant professors who had kids, who had family leave, and who were also doing great research, advising companies, doing all these things. That was my normal. And that's going to tie back to some of this work and what it really means, because going into the tech industry, going to work at a place like Google, I realized what I had experienced at Georgia Tech: the number of women in faculty positions who were themselves at work. My first year here I met Monica Sweat. People like Barb Ericson would have lunch with me. Gillian Hayes, who's now the dean over at UCI; Maureen Biggers: all these people who taught me computer science plus
bringing your whole self to work, plus thinking about society and communities in different kinds of ways. Not a lot of computer science programs do that, both in practice, in what the curriculum looks like, and in what the faculty members are interested in. I bring this up because I have a deep sense of gratitude to Georgia Tech for giving that to me, but also because I think Georgia Tech is in a very unique position to really lead in this era of public interest tech and ethics. There's so much foundational work that's been done here, with both the HCI program and the GVU Center, and even other parts of the computing degree program, even if it's not everyone, that has been community-focused or has been leading in ways other places aren't yet. So thank you for being those examples for me, and for allowing me to do that kind of research.

I'll give you a little background on what brought me into this work. I hope today will be split into parts: what brought me into this work and the landscape of the space of ethics and public interest tech, and then what I see as the next things on the horizon, areas where many of you, whether faculty or students, can really have a big play.

So: I was an undergrad in CS, then went on to grad school at Georgia Tech. Someone earlier said something like, "it seems like you've had kind of an interesting career that went all around," and I guess I was afforded the opportunity to jump around all these things because of the degree you all taught me, which is awesome.

I started out, actually, as a co-op at the Georgia Tech Research Institute, working on flight simulation software for the Department of Defense, and realized I didn't enjoy that environment very much. But it taught me a lot about what it was like to be a government subcontractor of sorts, and I tucked that away in the back of my head for a little while. Then came the most typical route: going to Google and being a product manager there for a while. Then going on to grad school, where I think another pivotal point happened. How many of you have heard of the Imagine Cup?
[00:10:40.03] The Imagine Cup is a play on the World Cup, but for nerds. Instead of going around the world holding soccer competitions, Microsoft runs these tech competitions with people around the world: there's a local competition in your state and your country, and then you represent your country at the international level. So I was in grad school, and Mark and I were in a computer lab one day and decided, let's enter the competition. We built, back in 2007 or '08, I don't remember the year, a sentiment analysis system for online communities, which at that time seemed interesting and novel. Now we know there's a lot of implication around sentiment analysis and AI algorithms in online communities. Twitter was just starting out. We ended up winning first place, representing the US, and Keith took time out of his day to be one of our advisors for that Imagine Cup. (A rough sketch of what a system like that might have looked like appears below.)

At that time, that was as close as I had gotten to anything that seemed like a "computing for good," public interest thing. And I remember the term "computing for good" was starting to become a thing. I don't know if this is true, but I think the term was even controversial, because people would say: why do you think you're doing computing for good? I'm doing computing for good too, because I'm going to change the world, and I'm going to connect people, and I'm going to spread information to people who don't have it. That moment stuck with me: yes, we can have a computing-for-good group, which is really important, but what do we do when the whole body of engineers also believes that everyone is doing good? I'll put a pin in that; it's good to know there's a backlash against using this term, and to ask what it really means.

So the Imagine Cup was exposure to a bunch of people who entered the competition because they wanted to build software to do everything, from a vending machine that reads your temperature so a doctor in a remote area can assess you (that was a Brazilian team, which got second place), on down. It was also exposure to all these teams from around the world who were able to think about solving tech problems in a deeply localized kind of way.
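To make "a sentiment analysis system for online communities" concrete, here is a minimal sketch of the kind of lexicon-based scorer that was common around 2007. Everything in it (word lists, scoring, sample posts) is invented for illustration; it is not the actual Imagine Cup system.

```python
# Minimal lexicon-based sentiment scorer in the spirit of 2007-era systems.
# Word lists and sample posts are invented; this is not the Imagine Cup entry.

POSITIVE = {"good", "great", "love", "helpful", "awesome", "thanks"}
NEGATIVE = {"bad", "terrible", "hate", "broken", "awful", "spam"}

def sentiment(post: str) -> float:
    """Score a post in [-1.0, +1.0] from counts of charged words."""
    words = post.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

# Community "health" as an average over posts: the kind of summary number
# whose downstream implications the talk warns about.
posts = [
    "thanks everyone, this forum is so helpful",
    "this thread is terrible and full of spam",
]
for p in posts:
    print(f"{sentiment(p):+.2f}  {p}")
print(f"community average: {sum(map(sentiment, posts)) / len(posts):+.2f}")
```

Even a toy like this surfaces the questions raised later in the talk: whoever chooses the word lists is deciding what counts as a "negative" voice for an entire community.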
[00:13:07.23] And that was a lesson: all of us sitting in Silicon Valley can't think up all the world's problems and just build things, assuming we can solve problems in India and Brazil and Myanmar and Vietnam and China and all these other places, because the solutions there are so different. That also stuck with me, and now we see these big tech companies facing really difficult ethical problems where they have done a really poor job localizing. Think, for example, of Facebook entering a market like Myanmar without deeply understanding the political climate of a place like Myanmar: understanding the rapid growth of technology, understanding how information spreads, understanding deep-rooted racism, and knowing how the spread of information, in a vacuum of information, can cause certain human rights problems. So that experience was a lot of exposure to technologists from around the world thinking of localized ways to solve problems. I'll put a pin in that too: part two of the talk is about deeply understanding local regions as we build technology, while recognizing that most of my colleagues and peers, in our training, just don't learn about that very often.

So that was the Imagine Cup. Then I went to Google, and pretty much had, back in, let's see, 2007 or 2008, the most typical experience, which was: you land at Google, everyone tells you you're changing the world, "don't be evil" was still the motto at the time, and every single TGIF, which was the weekly town hall we had, featured someone like a product manager on a stage, jumping with excitement, telling the whole company about something like: we have this new feature, it can be on your phone, you can walk into a building at any point in time, and it can tell you exactly how long it'll take you to get home without you even asking, because you stepped into the building at 5 o'clock and it knows you have to pick up your kids at 5:30, and it'll give you a notification, and it's all so seamless, and it's so awesome, and we're going to change the world. That was pretty much every single town hall meeting, and it stopped there. There was never a "but what does that really mean for someone's privacy?" Or: what if someone used it in this other way? What if you don't really want people knowing where you are? There was never that kind of conversation, and I never saw any issue with that either, because no one talked about it at these companies. I don't know, for those of you who have worked at big companies, whether this jives with how you've experienced
being there, but up until probably three or four years ago it was still very much "look at all the ways we're going to change the world, because we have these skill sets, and it's going to be awesome, and tech is great," without necessarily thinking about some of the negative side effects. Although, I actually think there have definitely been researchers at Georgia Tech who had been thinking about that for a long time, even if it wasn't phrased exactly as ethics or responsibility or anything like that.

So that was the experience at Google for a period of time. Then came the Google break: I went to IBM, and this is another beast. When I think of the ethics and responsibility conversation, on one side we have the bias and fairness issues: the examples of echo chambers on Facebook, Google labeling humans as gorillas, discrimination in ad-serving policies, and so on. And then there's another side of these conversations, where we've now seen many employees at big tech companies walk out; if you're familiar with the walkouts at big tech companies, they've been pretty widely covered. That other side is: let's say we've built the technology to the best of our ability, and now we have a piece of tech. The issue in the Google and Facebook scenario is that engineers aren't really trained to think about some of these problems: when we're building technologies for different countries, we don't think about localization; when we have data sets that aren't comprehensive enough, we get bias, and so on. But then we also have another world, the IBM type. How many of you are familiar with IBM and the Holocaust? There's a book called IBM and the Holocaust, and the gist of it is that IBM, as a contractor, as a company, did what they do best, which is sell to organizations, including governments, around the world, and their technology was used during the Holocaust. The ethical question there is: if you're an engineer on that team, if you work at that company, how is your technology being used? It's a different kind of ethical conversation, right? It's less "what are you building?" and more "now you've got a thing, and it has dual uses, lots of different kinds of uses." So in the cases where we've seen walkouts, we have Google and Project Maven, that is, Google and the Department of Defense; we have Salesforce and ICE; and we have these other cases. And I think the thing to reflect on, what I got from the experience at IBM
as a contractor for the government, was thinking about where and how we should deploy technology, but also understanding the complexities of it. On one hand you have people who say, "don't work with any government." On the other hand, and I'll get to this as well, I worked in the federal government for years, and if you're thinking about public interest tech: do you want a world where there is no technology in our government? So there's also an argument that we need some of the best technologists working on our military programs, our education programs, our veterans' programs as well. It's a complicated issue that I don't think anyone has gotten right. I know Georgia Tech is doing some work with public policy and computing, and has done work in this space for some time now, but no one has it solved. It's easy to reduce it down to "don't work for government" or "work for government," but it's a really complex question of how we use technology to benefit the public good in the government sense. If any of you are working for a government contractor, it's a fascinating lesson in all the complexities of why the foundational technology, at least in the United States, is completely broken: from how we do contracting for technology, to the types of people who get deployed (the MITRE person here probably has a lot of experience with this), to the types of technology and how they get deployed inside government, to how governments manage programs and what it's like to be an engineer in a government setting, whether at the city, state, or federal level.

So that was IBM. Then I came back to Google for a little bit, and, just for fun (ask me if you want to learn more about this), spent a year and a half working in the People Operations organization. I highly recommend being a tech person working inside the HR system, because you get to learn the inner workings of how a company hires. I now deeply believe that in any organization, whether government or private sector, the hiring of the people you take in is probably the most critical part of the organization.

And I got to see, again, how excited the team was about tracking every move of employees: we can do analytics to find out who might leave in a few years and then throw more stock options at them; we want a list of all the people and their probability of leaving; we want all these numbers. (A toy sketch of what that kind of attrition scoring looks like follows below.)
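For concreteness, here is a hypothetical, minimal version of the "probability of leaving" analytics described above, using a logistic regression. The feature names, training data, labels, and IDs are all invented for illustration; nothing here reflects any company's actual system.

```python
# Hypothetical attrition scoring: the "who might leave" analytics described
# in the talk. All features, data, labels, and employee IDs are invented.

from sklearn.linear_model import LogisticRegression

# Columns: [tenure_years, months_since_last_promotion, salary_percentile]
X_train = [
    [1, 6, 40], [2, 24, 35], [5, 3, 80],
    [7, 36, 50], [3, 18, 45], [6, 2, 90],
]
y_train = [1, 1, 0, 1, 1, 0]  # 1 = left within two years (invented labels)

model = LogisticRegression().fit(X_train, y_train)

# Rank current employees by predicted probability of leaving: exactly the
# list the team wanted, built without asking whether it should exist.
current = {"emp_0417": [2, 20, 42], "emp_0932": [6, 4, 85]}
for emp_id, features in current.items():
    p_leave = model.predict_proba([features])[0, 1]
    print(f"{emp_id}: P(leave) = {p_leave:.2f}")
```

The point of the sketch is not the model: a few lines of scikit-learn are enough to produce a ranked list of human beings, which is exactly why the "should we?" conversation she describes matters.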
[00:20:48.20] Just watch how much, in the name of good, a company hired full teams of engineers and researchers and data scientists simply to analyze its own people, while, again, never having the additional conversation about what that really means. We're starting to see that change now, with people shining spotlights not just on hiring algorithms but on hiring practices, on how companies treat their employees, and on what they track about their employees. There's more awareness now, but that was another great lesson about being inside an organization with a lot of excitement around a piece of technology and very little questioning around it. And I think there's something here, and this is more for all of you, about the way we train our computer scientists and researchers, the people who end up going to industry, and what you do when you're in that kind of situation. That's Google, and I think industry is shifting a bit, with employees for the most part recognizing this more, but I think this is actually the crux of some of the issues we're seeing in the tech industry: there's a lot of excitement about what technology can do, and we go ahead and do it without necessarily having a conversation about whether or not we should.

So then came the pivot away from Google and the private-sector world. Back in 2013: how many of you have heard of healthcare.gov in the United States? You laugh. What did you hear about it? Can a few folks summarize what happened with healthcare.gov?

[Audience responses.]

Six. Six people enrolled on the first day; true story. What else do you know about healthcare.gov?

Actually, let me backtrack. When you launch any system, what should you put in place to make sure it works and to figure out points of failure? What do you normally do? You test. How do you test it? What do you have in place? Usability testing, walkthroughs, testing at all, sometimes just some testing. Would you put tracking on the system, monitoring of any kind?

So, healthcare.gov. There were, I believe, over 60 contractors in total, which came down to one main contractor and a bunch of subcontractors under them. And when the main contractor was asked, "the website: how do you know if it's working?"
[00:23:52.00] The answer was: "We open a browser, we go to healthcare.gov, and we see if it's up, and then we turn on the news and see how many people are talking about it."

That's terrifying. This was, regardless of where your politics are, one of the largest policy initiatives of an administration, and it was going to fail because of technology: because a government couldn't figure out how to buy technology, couldn't figure out how to manage the tech team, and didn't test. I remember, on the last day before President Obama transitioned out, he was with us talking about healthcare, and he said: I remember that meeting with my chief of staff and deputy chief of staff, and I asked everyone in the room, are we greenlit, is everything good? And everyone said yes, everything is green; all the contractors had done all their testing and everything was greenlit. But they never actually tested the whole thing together, because no one thought to do that. And that is the nature of the way we buy things in the federal government. Those of you who've worked in the federal government will know this; it's a pretty common story for pretty much every single contract, whether it's your visa process or your green card process, or signing up for benefits at the state level, or signing up for healthcare. They just didn't test it, and they didn't have any tracking.

So a team of my former coworkers came out and said, we're just going to spend some time on this, because we know how to keep a really big system up, the big system being Google Search. Google Search goes down all the time; you don't feel it go down, because there's a team of DevOps and SRE folks and others who are really, really good at keeping sites up. (A bare-bones version of the kind of uptime check healthcare.gov launched without is sketched below.)
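To make the contrast concrete ("we open a browser and turn on the news" versus actual monitoring), here is a bare-bones sketch of an external health check of the kind site-reliability teams run continuously. The URL and thresholds are placeholders; a real setup would run from many locations on a schedule and feed dashboards and alerting.

```python
# Bare-bones uptime probe: the minimal "tracking on the system" that
# healthcare.gov launched without. URL and thresholds are placeholders.

import time
import urllib.request

URL = "https://example.gov/healthcheck"  # hypothetical status endpoint
TIMEOUT_S = 5     # seconds to wait before declaring the request failed
SLOW_S = 2.0      # latency above this is worth investigating

def probe(url: str) -> tuple[bool, float]:
    """Issue one request; return (is_up, latency_in_seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_S) as resp:
            up = 200 <= resp.status < 300
    except Exception:  # DNS failure, timeout, HTTP 4xx/5xx, etc.
        up = False
    return up, time.monotonic() - start

if __name__ == "__main__":
    up, latency = probe(URL)
    if not up:
        print(f"DOWN after {latency:.2f}s: page the on-call, not the news")
    elif latency > SLOW_S:
        print(f"UP but slow ({latency:.2f}s): investigate")
    else:
        print(f"UP ({latency:.2f}s)")
```

Even this much answers "how do you know it's working?" with data instead of headlines, which is the gap the Google Search SRE-style teams came in to close.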
[00:26:02.09] That was the start of this movement in the federal government of the United States, one aspect of what I'd call public interest tech: engineers who knew how to engineer or design or product-manage saying, "my government is on fire; I have to go do something, because my government is on fire and I have the skill set." The Department of Defense actually has no software engineering title in its job database. Our government doesn't have roles for engineers and designers and product managers, because that's not historically what governments do; they buy everything from outside. But when you only buy things, and you have no expertise in how to manage the project, you end up with things that fail over and over again, because you don't know what to buy and you don't know how to manage it once you've bought it. I'll get to why this is so important and relevant to all of you as well.

One of the former secretaries (a "secretary" in the United States government is just the head of an agency; the term can be misleading) joked that if we ask someone to build a concrete ship, that ship won't be very useful and may be more likely to sink, or at least won't be used very often, but someone will build it, because we're paying them a lot of money to do it. We actually have a fleet of concrete ships in the military. If we say we have $10,000,000 when the job takes $3,000,000, they will find a way to make it cost $10,000,000. That is just how things work in the United States: you ask for something, people will build it exactly the way you asked, and it might not work, but they will build it. And until recently there was no one in-house who could say, "that's not actually how technology works; maybe you should revisit this," because there just wasn't anyone around.

So that's one part of the story of how USDS got started. There were people, probably for four or five years leading up to the healthcare.gov scenario (this is a much bigger conversation about movement-building in general), inside the Obama administration who had been trying to build up some kind of tech or digital service in government. They kept trying: a fellowship program, other small programs. They kept trying, they learned how the government worked, and they were prepped and ready for a giant fiasco to happen. And this giant fiasco happened, and as with any major disaster, you get a clear runway: not to do whatever you want, but a clear runway to bring people in really fast, because it had to be done.
[00:28:23.15] So we pulled in people who deeply knew the bureaucracy of government, people who had been in government for a long time, career bureaucrats, plus a bunch of private-sector people from Silicon Valley and other tech companies who understood technology, to, let's not just say "save," healthcare.gov. We hate saying we saved it, which is the headline you'll see in Time and Fast Company and Wired, because no engineer would say that thing was saved: it is still a total cluster of mangled software held together, and every single year we have a team back out there during open enrollment because it might fall apart again. Even though it's not that old, it's built on a pretty fragile system.

And this is a movement now. We grew that to about 200 people in the federal government, then states started copying the model, and other countries now have the same model. We have these get-togethers, call it "civic tech nerds unite," where the US and the UK and New Zealand and Australia and Taiwan and Estonia and Canada and a bunch of other countries all come together, and we're all fighting the exact same problems with our governments: they don't know how to buy technology, and no one is out to do evil, they are just really terrible at hiring people to do technology.

So we're in this moment now, as you've seen with broken tech systems and with the congressional hearings, the hearings with all the tech companies. Even if we get regulation, which I think we should to some extent, we are not equipped to really enforce that regulation, because we don't have the people around who understand the nuances. If you write something like "the system must not be biased" into some rule, what does that even mean at the technical level? People in our government don't know enough. (One toy example of making "biased" measurable is sketched below.) And that's where schools that have public policy programs, with faculty engaged across both sides, can really make a significant difference. No one is leading in that right now. Harvard, where I teach a class, is trying, but it's not leading in it either. Princeton has some great programs, Ed Felten, but for the most part this area of thinking about the direction of innovation is ripe for figuring out how to get the right people in.
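As one toy example of pinning down what "the system is biased" could mean in an enforceable rule, here is a sketch of a demographic-parity check: comparing favorable-outcome rates across groups. The data is invented, and demographic parity is only one of several competing fairness definitions a regulator would have to choose among.

```python
# Toy audit: one measurable reading of "the system is biased."
# Decisions and group labels are invented for illustration.

def positive_rate(decisions: list[bool]) -> float:
    """Fraction of decisions that were favorable."""
    return sum(decisions) / len(decisions)

# Hypothetical automated decisions (True = approved), split by group.
group_a = [True, True, False, True, True, True, False, True]
group_b = [True, False, False, True, False, False, True, False]

rate_a, rate_b = positive_rate(group_a), positive_rate(group_b)
print(f"approval rate, group A: {rate_a:.2f}")   # 0.75
print(f"approval rate, group B: {rate_b:.2f}")   # 0.38
print(f"demographic parity gap: {rate_a - rate_b:+.2f}")

# A rule-maker still has to pick the metric and threshold, e.g. a
# "four-fifths"-style ratio test like the one used in US employment law:
print(f"ratio B/A: {rate_b / rate_a:.2f}  (flag if below 0.80?)")
```

The code is trivial; the hard part, which is her point, is that someone in government has to know to ask for a definition like this, choose among competing ones, and understand what the numbers do and don't show.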
[00:30:51.10] It's the space of technology and policy, of figuring out how to regulate something, plus the space of bringing technology into the civic sector so that we don't have more healthcare.gov scenarios. There are so many versions of that story on a much smaller scale. There's an example from immigration: a contractor was tasked with building a system that makes it easier to figure out how to apply for citizenship for a family member. How many of you have had to do that, or to apply for citizenship in general? It's complicated; sometimes people wait up to a decade. It's really, really complicated. So a contractor was hired to do that, and after, I think, six or so years, they completely failed. And then they won the bid to fix the failure. That is just how the US government works. So the big takeaway is that we need more people who understand technology in our government, everywhere, not just in the US but all over the world, because we're no longer in a world where government just does government things. Government and tech are now woven together, whether they like it or not.

So that's the USDS side: we have to have people who understand technology working in all parts of our government. But then there's also the very complex side, going back to the tech and ethics conversation. It's actually quite frustrating for me to be in some of my tech circles sometimes, where people say things like, "we have to boycott this company because it can't be allowed to work for the government at all." And to some extent, yes:
some of the things that some of these government agencies are doing are really, really terrible, and we should think about how not to enable them. These are issues we at USDS had to deal with as well: which projects we take on and which we don't. I bring this up because I think it's critical for academic institutions to talk about it. There's a weird line between politics and not-politics, but at the end of the day, when we build technology it affects people, and it does become political if, for example, your technology sits behind a data system that tracks humans for a particular purpose. This is a conversation we're going to have to teach our students to have. It might not be telling them which side to take, but it's a conversation we have to figure out how to get our students to engage in really meaningfully. Part of the issue we're seeing with some of these companies is that we're not equipped to have those conversations, or even to know which other fields and experts to bring into the room to help us have them.

Which is why, at the Grace Hopper conference this year, when somebody saw that Palantir was a sponsor, a whole group got really, really upset and said, "we have to boycott Palantir because of ICE, period." There was an uproar, and then Grace Hopper pulled Palantir from the list of sponsors, and people congratulated themselves and were so excited, while the person who led the Grace Hopper conference worked for Microsoft, and Google was still a sponsor. I feel so strongly that we have to figure this out, not just in our companies but with our students. These are complicated issues with lots of nuance, involving politics and society and ethics, that our technologists are often not equipped to handle, and therefore it's easy to declare "this is bad" and "this is good" while ignoring the complexity of the situation. Because if you boycott Palantir but don't have a framework for how you deal with issues of sponsorship and whose funding you take, you're still taking funding from other organizations that are problematic for the exact same reason you boycotted this one; you're just able to ignore it, because you don't have a framework for recognizing it in a broader stance, and you don't have a decision-making process for it. And I think, at least in the tech space, many CS programs don't
teach us to do that, or even to know which other fields to bring in. Is it the political scientists? The social scientists? The race and gender scholars? The community members? What is equipping us to have these conversations in our organizations, and, for those of you who have worked at the big tech companies, when we're making decisions about where to deploy AI, what to sell, and to whom? Whom are we serving? It might be different in the HCI sense, because that's what you all do, at least within computing, but in the computer science degree we don't talk about that enough. So those are the multiple, not just two, sides of the civic tech space. There's the side that says, yes, I believe that as technologists we have to go work in government, because if you care about public service, the government is, in most cases, the biggest public service of any place you could go, so we have to care about that. But also, if you're at a private-sector company, and companies are having these discussions about what ethics means and where they stand, there has to be some kind of framework to help employees understand what that really means. And it starts early, in our education, and I don't think we have that enough.

Which now leads me to something else I'm working on, called the Responsible Computer Science Challenge, which Ellen is part of. This initiative was started by the Omidyar Network. How many folks know of the Omidyar Network? Pierre Omidyar was one of the founders of eBay; he has since moved to Hawaii with his family and has an enormous fund that he puts into what he calls impact investing (you can go look at their portfolio to see what impact investing means to them). They decided a few years ago that they really wanted to tackle how we educate our future technologists, whatever that means, and over a few iterations they landed on this: the core CS people aren't necessarily paying attention enough, so we will put a lot of money into computer science undergraduate education, very narrowly, in the US, recognizing that we're going to miss the international perspective, and that we're going to miss the iSchools, which have been doing this work forever.
[00:37:52.10] And recognizing the many people already here: recognizing new programs, but also the people who have been doing this work for a long time. For example, I know at least here at Georgia Tech (correct me if I'm wrong, Ellen) that you, Amy Bruckman, even Mark Guzdial, and others have been doing work in what we now call responsible CS, or ethics in CS, for over a decade.

And there's this recognition that, for whatever reason, the rest of computer science just says: unless it's algorithms or data structures or theory, it's not real. Which is ridiculous, and I felt it when I was here: not just that those other subfields aren't real, but that other disciplines aren't real. At least at the undergraduate level, you hear all the time that if you're in the College of Computing, that's real, and all the other majors are fluff, and that's why they have time to go to football games. Which is also absolutely ridiculous, but we create this hierarchy of majors, right? It definitely exists here at Georgia Tech.

Somebody pointed me to, are you familiar with, C.P. Snow's "The Two Cultures"? It's an essay written a while ago. How would you summarize the two cultures?

[Audience response.]

Yeah. There's a line in there, I'm paraphrasing, where the physicists say, "how do you not know the laws of thermodynamics? You must be an idiot," and the literary people say, "how have you not read Shakespeare? You must be an idiot." It's this idea that there are these different cultures and we don't bridge them. And I think especially in computer science: if you are not the computer scientist, if you haven't done the coding and the algorithms, then you're just not smart, period. That's so critical, because I remember feeling that at Georgia Tech, but then we go into tech companies, where the undergraduate computer science students are easily making six figures, the most valued employees in money terms, told over and over again, "we need more of you." And something from working in the HR department will stay with me forever: you would hear people size someone up by asking, "well, are they an engineer?" Even in the hiring system there was a field for whether you were in engineering or not.
[00:40:42.05] That's critical, because it goes back to who has power in an organization. When you've decided that yours is an engineering organization and that strictly the engineer has the most power, then everything else, whether it's usability or anthropology or ethnography or social science or philosophy, is second, and all those perspectives are second to the technical infrastructure of your system. I think that plays a really, really deep role in some of the broken technology systems we've built: they're built to scale, built to reach as many people as possible, but not necessarily built to understand which populations are left out, or the complicated political climates of the countries we deploy to, or even this country. So part of the recognition behind responsible computer science is that we have to rethink how we train our computer scientists, and we have to put money and weight behind recognizing that computer science needs this interdisciplinary component, integrating ethics and responsibility throughout the computer science curriculum. Part of that is pulling in the deans of computer science programs to say: we also have an industry letter, and such letters matter, that a bunch of industry partners have signed, saying we really care about this, we're putting money behind this, we're paying your researchers to care about this. We're going to bring them together and hope it makes some kind of dent in the space, and builds the recognition that ethics and responsibility are deeply critical to all of computing today and can't be ignored anymore.

Yes, please.

[Audience comment, partially inaudible, about how user researchers are treated.]

Did you feel that way? Because, yes, I know: if anyone has ever been a UX researcher at Google, you are second class, and you're treated that way.

[Extended audience discussion, largely inaudible.]
[00:43:34.12] Yeah, I know: Mozilla also works with the tech companies and gets money from other tech companies, and it's an internal conversation. I'm with the Mozilla Foundation, which (and disclosing this matters) is a nonprofit that gets money from lots of different organizations, and where we get our money is a constant conversation. One of the questions, too, is: should Omidyar, a founder of eBay, deeply rooted in tech, be funding responsible computer science, and do we then get into whether they're washing their responsibility? That's a much bigger conversation we should have at some point, maybe not today, about power and money and funding in general. I'm part of the Berkman Klein Center too, as I was telling some folks (I'll get back on track in a little bit), which is considered the cousin of the MIT Media Lab: we're all in the same area, we share all the same people. Michael can attest that our whole environment was dominated by power, money, funding, and Epstein for probably all of September and October, and all of that is related to where money lives: who gets the funding, who does the research, who gets access to the rooms, who makes the decisions.

For this particular challenge, the hope (and again, we're all trying this out) is that universities, including Georgia Tech with Ellen's work, think about what it means to integrate ethics and responsibility. For some folks that means taking the algorithms class and, while we're talking about the edges and nodes of some graph, also weaving privacy into that class, so that you're not only thinking about the technical part but also about the broad picture of society. But also, to your point, recognizing that some of us who go into engineering and computer science think a certain way, we like quantifiable things: where is the medium where you can take that mindset, because there are benefits to thinking that way too, and apply other ways of thinking to it, so that when you're building something with a very binary mindset, you can also bring your social responsibility into it somehow? We're all just trying to figure out what that looks like.

A big part of that, too, is Hilary Cohen, who was integral to it. Have any of you seen the new Stanford course? There's a new Stanford class, where they're also experimenting, that has a computer science professor, a political scientist, and a moral philosopher. They're all men.
[00:46:12.10] Hilary Cohen is the main researcher on that project, and her big thing was: we can't just say we want to live in a world where we train computer scientists to be better, and then they still rule the world and every other discipline is still second class. That's ridiculous. So in addition to training our computer scientists to be better, how do we shift the culture away from these two cultures, of "we as computer scientists know best, and everyone else, maybe your opinion will matter later"? How do we shift to a culture where, every time we're making a decision, we have all these perspectives in the field with us, in the room with us?

[00:46:54.10] I'll give you one example, and then I'll wrap up with some next steps on where I think this field is going. I run something called the ethical tech working group at the Berkman Center. What that means is that, off the record, I get all sorts of industry people coming by saying: we have this really gnarly problem, what do you think? So there's a ride-sharing app. A team, hypothetically, works for a ride-sharing company, and passengers, and maybe even drivers sometimes, are complaining, indicating that they don't feel safe being dropped off in certain neighborhoods. Of course, sometimes you're being dropped off in a place you know, because that's where you're going, but if you're new to a city you might be dropped off somewhere where you don't know what the neighborhood is like, and you're afraid of being unsafe. And the team said: we have a brilliant idea. We're going to take Google Street View images, run some algorithms on them, come up with a safety score, and when you're dropped off, we'll tell you the safety score.

[00:47:47.23] What are some initial reactions to that? [Audience responses.] Right. Actually, I heard quite a few of those: roughly, that the imagery is available, and that there's a precedent of people using Google Street View for things like this. [Further audience responses, partially inaudible.]
[00:48:12.03] So, what even goes into a score? If you have only a team of engineers determining what goes into the score, that's probably problematic. So we were in this room with some race and gender scholars, a couple of lawyers, a few computer scientists, and an anthropologist, and immediately, within ten minutes, someone asked: have you heard of the broken windows theory? Most computer scientists have not. It's the theory that you can judge the safety of a neighborhood by how many broken windows it has, and it has long since been debunked as terrible. And you think: let's consider that, you can't judge a neighborhood by just looking at it. Between the lawyers and everyone else in the room, within ten minutes they were breaking down why this might be problematic, and finally this woman, Jasmine McNealy, who's a law professor at the University of Florida, asked: have you thought of safety for whom? And this engineering person from the ride-sharing company said: what do you mean, safety for whom?

[00:49:06.19] I think that question alone completely shifted how they thought about the problem, because some people might feel safe being dropped off in Peachtree City and some people won't; some people will feel safe being dropped off in another neighborhood where a lot of people won't. So how are you going to determine safety for whom? They had never really thought of that. So, sure, you can train our engineers with better curricula to think better, but there's also such a benefit to having these different perspectives in the room as the normal way of working, not even brought in as consultants: they're just around you all the time, so when you come up with an idea they can challenge it, with the deep knowledge they bring, because that's their craft; these other people's roles involve constantly thinking about these societal problems. So as we think about this, part of the responsible CS challenge is, yes, retrain our CS people, but also: how do we shift computer science so that these other perspectives are deeply valued? We're thinking about that with industry too, about how teams are structured, because Mozilla is in this weird spot of being part foundation, part funder, part industry, since it also makes products like Firefox. So how do we shift industry to really change what that looks like?
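[Editor's note: the "safety for whom" question can be shown in a few lines. This is a minimal sketch with invented numbers, not the team's actual system; it just makes visible that a single global safety score is an average over groups who disagree, so it is accurate for no one.]

```python
# Invented per-group perceptions of one drop-off point (0 = unsafe, 1 = safe).
perceived_safety = {
    "group_a": 0.9,  # feels safe here
    "group_b": 0.3,  # does not
    "group_c": 0.6,
}

# The "brilliant idea": collapse everything into one number.
global_score = sum(perceived_safety.values()) / len(perceived_safety)
print(f"single safety score: {global_score:.2f}")  # 0.60, true for no one

# McNealy's question, made computational: safety for whom?
for group, score in perceived_safety.items():
    print(f"{group}: {score:.2f} ({score - global_score:+.2f} vs. global)")
```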
[00:50:21.06] Yes. So, right now a bunch of companies have roles open for some kind of responsibility function, so you can probably find them. They're all trying to hire new chief ethics officers, or more responsible engineers, or people for their new responsible-innovation boards. So if you're interested in this field, the direct answer is: we don't have enough students graduating with this skill set, while companies need more of them, so we're all trying to figure this out together, both from the academic side and from the industry side. They basically want people who do not exist yet, so if this is your area of interest, you will certainly find a home somewhere. There's also a big movement around figuring out how we get technologists to work in our government in a responsible way, to make sure our government services do not fail the citizens. And with that, I'll take questions on all of this.

[00:51:23.00] [Applause. Audience question, inaudible.]

[00:51:56.11] I think that's where, from what I've seen advising the tech companies, and I'm part of some of these conversations... I hate the term "it depends," but it depends. At some of these companies, I won't name them, there's a certain level of management, usually the director level, not the VP or senior VP or CEO, because sometimes, and you have evidence of this in some of their public conversations, they don't want this, but there are people at the director or VP level who deeply want those employees on their teams. So if you interview for those teams: they're looking to change their interview questions. No one has this right now; they're hiring people to help them change their interview questions so they can hire for these skills and think about these issues. And then, yes, there are still other teams that 100 percent don't want people like that, so there's no uniformity right now.

[00:52:50.09] At some point they're going to hit an issue, because I think the lever we have in computer science is that our skill sets are valuable, and if people stop wanting to go work for a company, then it will have to change how it hires. We're not quite there yet, because people still want to go work for Facebook and Google, and I also recognize the privilege involved in deciding you don't want to work for Facebook or Google. So we're right in the middle of it changing, but it really depends on the team. And on the students too, yeah.
[00:53:24.12] [Audience question, largely inaudible: roughly, that engineers agree in principle but still build the technology.]

[00:54:21.20] Yeah, so these are my theories, and I don't know if any of this will work. I do think, as a narrative point, if you were to survey students, and we've seen this in some of the surveys these universities have done, on their level of commitment to responsibility, to some version of not wanting to do something terrible, arguably even ten years ago people said "we only want to do good, that's why we're in computer science." But to your point, it comes down to the unintended consequences. A big thing I hear from the engineers and students is: in our day to day, yeah, we get that bias is bad. We get that having our platform propagate bad news that eventually feeds a genocide is bad. We get that IBM and the Holocaust is bad; you keep telling us that's bad. But what does that look like when I'm an engineer, day to day, building software and making decisions: when I'm picking the language to use, when I'm picking what kind of data structures, when I'm picking where to store my data? When I'm making all of those decisions, where does this work come in? I think part of it is helping our students think critically at all of those steps, at a low enough level.

[00:55:36.05] Plus, we have things in tech like postmortems: we have ways of analyzing problems, we look at case studies, and there's a postmortem for pretty much any code or tech failure at a company. So how do we create the same thing here, even though I know it's much harder, because it's more qualitative? It's not like "at this point in time, Google Search went down, therefore we have to figure out what happened"; we're thinking about much broader societal problems. But perhaps we can think of ways to do postmortems, or something similar, where it's a normal cadence to review what went wrong, and then just not do it again, and get rewarded for it. There's the story of the person who brought down Google Search and still got their bonus, and they love to tell that story, because it's a thing that happens.
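[Editor's note: as a sketch of what a societal postmortem might borrow from the engineering kind, here is one possible template, an invention for illustration rather than any company's actual practice, expressed as a plain data structure so the fields are explicit and reviewable on a normal cadence.]

```python
# Hypothetical template: an engineering-style, blameless postmortem
# adapted to societal failures. Field names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class ResponsibilityPostmortem:
    title: str
    what_happened: str             # factual, blameless timeline
    who_was_harmed: list[str]      # affected populations, not just "users"
    decisions_involved: list[str]  # the day-to-day choices that added up
    action_items: list[str]        # owned and dated, like outage fixes
    lessons: list[str] = field(default_factory=list)

pm = ResponsibilityPostmortem(
    title="Drop-off safety score shelved after review",
    what_happened="Team proposed scoring neighborhoods from street imagery.",
    who_was_harmed=["residents of low-scored areas (potential redlining)"],
    decisions_involved=["choice of imagery as proxy", "what counts as 'safe'"],
    action_items=["add a 'safety for whom?' review gate before scoring ships"],
    lessons=["a single global score hid disagreement between rider groups"],
)
print(pm.title)
```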
[00:56:19.23] And that says you can raise an issue and still get a bonus, but obviously, with the walkout, that's not entirely true: there are certain issues you can raise and get a bonus for, namely the tech issues, and other issues you can raise where you might be pushed out of the organization. So how do we change organizations so that students and people can speak up? I think we need all of those parts. Yes, you there.

[00:56:54.22] [Audience question, largely inaudible.]

[00:57:19.09] Yeah, I 100 percent see room for government in this space. I'll give an example; it's not perfect. There's actually a joke that the moment GDPR dropped, all the lawyers were so excited, because now they knew they were in demand, and they're all in these jobs at companies fighting over who gets to be chief privacy officer, which is terrifying, I guess. But I think the dream world, and for a lot of folks in companies it would be a dream world, would be a governing body that deeply understood technology in the same way that the engineers in these companies understand technology, and that put into law the same kinds of safeguards they would adopt if they were to self-govern. We don't have that right now. If we had people in government who understood why an algorithm discriminates, versus just "that algorithm is biased," and could do something about it; who understood that what drives that is not just the data input, there are parts of the algorithm where you're making decisions, your parameters too; who understood the nuances of what that looks like, and then made regulation based on that, and enforced that regulation: I do think government should have that role. But if we were to do that today, we're not equipped for it, and that's terrifying, because we'd end up with a bunch of rules. I'll share an example of what that can look like. How many of you know about the Paperwork Reduction Act? There's a thing called the Paperwork Reduction Act in the United States. I'll probably do a terrible job, I'm not a lawyer or a policy person, but the general idea is that the government can't keep asking you for the same information over and over and over again.
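[Editor's note: the point that discrimination can enter through parameter decisions, not only through the data, also fits in a few lines. This is a minimal sketch with invented scores: the data never changes, only the decision threshold does, and the approval gap between two groups moves with that one parameter choice.]

```python
# Invented model scores for two groups; the data below is fixed.
scores = {
    "group_a": [0.62, 0.70, 0.75, 0.81, 0.90],
    "group_b": [0.55, 0.61, 0.66, 0.72, 0.88],
}

def approval_rate(vals, threshold):
    """Fraction of scores at or above the decision threshold."""
    return sum(v >= threshold for v in vals) / len(vals)

# Only the threshold, a parameter someone chose, varies.
for threshold in (0.60, 0.70, 0.80):
    ra = approval_rate(scores["group_a"], threshold)
    rb = approval_rate(scores["group_b"], threshold)
    print(f"threshold {threshold:.2f}: group_a {ra:.0%}, "
          f"group_b {rb:.0%}, gap {ra - rb:+.0%}")
```

[With these numbers the gap doubles at a threshold of 0.70, which is exactly the kind of nuance a regulator would need in order to see past "the data was biased."]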
[00:59:00.00] When we first came into the United States Digital Service and were in all these different agencies building out different tools, we were told: you can't go talk to users, to people, because it's illegal. And we asked: why is it illegal, why can't I go talk to the users? Because of the Paperwork Reduction Act; it's illegal. So what happened was, there's a law called the Paperwork Reduction Act, and over time a bunch of people who didn't really understand what it meant just chalked it up to "you can't go talk to people at all," including the users of your technology. So you can't go interview veterans, even if you want to; you can't go interview students who might be using the FAFSA form. And that is the risk: any time a government that does not understand technology, or anything else it's governing, makes a rule, we end up with things like that.

[00:59:46.21] [Audience question, largely inaudible.]

[01:00:16.11] I'll share, and I would also love Ellen's thoughts too, because I was part of that. There are some really interesting approaches. Ellen actually brought up learning outcomes with a group a while ago, because, as she said, there's all this stuff going on, but what are the actual learning outcomes? I think it's going to take us a while to even find out whether the learning outcomes are realized. I'll use one as an example, and I really like this example for its content: it took core computer science work and wove in how the nodes in a graph connect to privacy, and I think those are the things that make you really internalize what this looks like. But there are other examples where people are pairing social scientists and philosophers and having them come into the classroom, which is more of a parachuting-in model. And Ellen, as you know, is doing a role-playing game. Are there others that stand out to you that you want to share?

[01:01:14.11] [Response, largely inaudible.]
[01:02:04.21] I'll share one more and then I'll wrap, because it gets back to what I mentioned earlier about localization. An instructor over at the University at Buffalo basically said: my students are not the ones who graduate and go to Silicon Valley; they stay local, they want to stay in Buffalo, and we're not a top recruiting school for Google. So the example he uses ties the material to a local cable company in the area and some of the ethical issues that come up with the software the cable company deploys to all of its customers. It becomes very personal to the students, and they all feel it, because they live in the area and have had to deal with the cable company in some manner. Then he brings them down to the level of the software deployed at the cable company, and I think that also lets a lot of students really internalize what this feels and looks like, which I think is really powerful. And with that, I'm sorry, we're over time. Thanks.