Geeking Out with Adriana Villela

The One Where We Geek Out on Being a Tech Journalist with Jennifer Riggins

Episode Summary

Adriana geeks out with freelance tech journalist Jennifer Riggins about being a tech journalist, AI, diversity, equity, and inclusion (DEI), and Platform Engineering. Jennifer keeps it real on the use of AI as a job aid, policy-making around the responsible use of AI, and how AI can help lower the environmental impact of data centers while having an environmental impact of its own. She also reminds us that, while DEI may not have been at the forefront in 2023, it is still very much an issue in 2024 that needs to be discussed and addressed. Finally, Jennifer shares her thoughts on Platform Engineering: the importance of building platforms that software engineers will want to use, and of using Platform Engineering to define a Yellow Brick Road that gives developers guardrails so they can develop quickly, safely, and effectively.

Episode Notes

About our guest:

Jennifer Riggins is a culture-side-of-tech storyteller, journalist, writer, and event and podcast host, helping to share the stories where culture and technology collide and to translate the impact of the tech we are building. She has been a working writer since 2003 and is currently based in London.

Find our guest on:

Find us on:

Show Links:

Additional Links:

Transcript:

ADRIANA: Hey, y'all, welcome to Geeking Out, the podcast about all geeky aspects of software delivery. DevOps, Observability, reliability, and everything in between. I'm your host Adriana Villela. Coming to you from Toronto, Canada, and geeking out with me today is Jennifer Riggins. Welcome, Jennifer.

JENNIFER: Hi, thank you so much for having me on.

ADRIANA: I'm super excited to have you join me. And where are you calling from today?

JENNIFER: London.

ADRIANA: Awesome. What I'll do is we'll start with some lightning round questions and then I'll get you to talk a little bit about yourself and then we'll go from there. Sound good?

JENNIFER: Great, yeah, sure.

ADRIANA: All right, let's do this. Okay, first question. Are you a lefty or a righty?

JENNIFER: Righty.

ADRIANA: All right, do you prefer iPhone or Android?

JENNIFER: iPhone. Just because it's what I have and it's seamless. It's not a moral choice, but it's a convenience choice.

ADRIANA: That's fair. Next question. Do you prefer Mac, Linux or Windows?

JENNIFER: Mac. Same. Convenience.

ADRIANA: Convenience is always very important. Okay, next question. As a tech journalist, do you lean towards Dev or Ops?

JENNIFER: Oh, Dev. Well, no, that's hard. No, I would say either side. Yeah, because Platform Engineering is all about bridging that gap, isn't it?

ADRIANA: Yeah, that's very true. Exactly. Okay, next question. Do you prefer to consume content through video or text?

JENNIFER: Text for sure. Or audio more than anything. Podcast.

ADRIANA: Yeah, I love me a good podcast. I have like way too many in my queue that I have to get through. Okay, final question. What is your superpower?

JENNIFER: Connecting people, introducing different people that can help people figure out their next step or their next job, or who they should just know. People. Yeah.

ADRIANA: Awesome. I love that. I think it's so important. I think people really underestimate the power of connection. All right, so we are on to the main event, the meaty bits, if you will. So why don't you share with our audience what you do with The New Stack?

JENNIFER: Okay. I have been a working writer since uni. I am not a trained journalist. I went for political science and I've been in the tech niche for 12 or 13 years. That includes both the marketing side and the journalism side. I'm just a naturally good writer and good at explaining complex topics so that everyone understands, which is good because I'm a geek by association, I am nerdy by nature, but I am not technical. So it helps me then help other people understand, because everyone should be involved in understanding the future and how it's being built, especially as it gets more pervasive in our bodies, in our homes and our cars, and then AI thinking on our behalf, et cetera.

JENNIFER: And I have been writing for various outlets as a freelancer, but with The New Stack for over eight years now, so pretty much since their first year. And also I write for LeadDev and other blogs, and then I have software customers, things like that, helping them do their case studies or explain things. I am not interested in funding, not interested in who's appointed CEO, not interested in crypto, not interested in technology precisely. I'm much more interested in the cultural impact of technology and what it's done. So I won't typically write about a new feature unless there's something extraordinary about it. But I will write about how that feature, once it's used, impacts people's lives, or more feature-driven thought leadership, things like that.

ADRIANA: Cool. That's awesome. So you mentioned that in university you did not come from a journalism background. So how did you find yourself writing for a living? Like you said, it came naturally. What gave you the first opportunity?

JENNIFER: I've always been a natural writer, but I'm good at writing on that side. "Soy de letras," as you would say in Spanish; I'm a letters person, not a math person, is how you would say it in English. And I was actually editor of my school newspaper and all, at university, so I was always involved in some way in writing and in helping other people write better, things like that. So it's just a natural thing for me. I've always been able to fall back on writing.

ADRIANA: And then how did you find yourself, like writing about technology then?

JENNIFER: What else is there to write about? I think my first role was through Elance, or whatever it's called now, one of those freelance websites like Upwork, and from there it spiraled. It's something I'm good at, explaining complicated concepts.

ADRIANA: I think there's not enough emphasis on really being able to distill things in a very approachable manner, right? Especially a lot of docs out there, technical docs are so...

JENNIFER: Complicated and incomplete at the same time.

I think it's the most important thing. Critical thinking and being able to talk across that chasm between technology and business will be the greatest skill set, and it is so important, especially in this time of AI, because you need to be able to distinguish the bullshit that the AI is giving. We know what, 52% of code generated by ChatGPT is wrong, but ChatGPT is very convincing, because it was trained by tech bros, who have a great sense of confidence and can sell bullshit. So it doesn't have to tell you when it's wrong. So in this time when we're entering AI and all this productivity mentality and everything, we need to be able to understand, to be suspicious of what is working or not. And we also need to understand the business impact. So either side of it, whether it's business needing to understand that wildly expensive cost center of engineering and cloud, or engineering being able to explain and feel connected to that business impact and to understand. So everyone's going to have to explain it themselves. And Kelsey Hightower said at Civo Navigate, an event...he said, we have this weird, maybe it's a corporate throwback, where in tech we're like, I have this great idea, but I'm not done my slides yet, I'm not done my PowerPoint presentation yet. We'll wait to talk about it.

But that's not how things work. People are storytellers. People need to be able to have conversations, even if it's expressing yourself in writing. I don't think it's necessarily very inclusive at all that everyone has to speak on stage, but one-on-one conversations are still going to be a very important thing. And being able to write, even in Slack, and be concise...that's not my strong suit, because I write very long features and things like that. But being able to express yourself in a way that everyone understands, because especially with AI, as we get into this interstitial age of prompt engineering over the next maybe two years, it's going to be the subject matter experts that are really important. So you won't necessarily need a coder for everything. But if it's like building management or security in a building, maybe you need someone that actually has experience in that, who can work and partner with the developer to build something that's actually useful in AI.

ADRIANA: Yeah.

JENNIFER: So they need to talk to each other. And the people that may be deciding, especially with a chat bot, customer support and all, may have zero coding capabilities. So you need to be able to talk and communicate with them. And that's where the benefit from AI will come about. And it's honestly where we're going right now.

ADRIANA: Yeah, I think the interesting thing is AI, in a way, keeps us on our toes because you almost have to be smarter than the AI to be able to pick out the bullshit, right? Because the minute you start trusting the AI and what it produces, that's what gets you in trouble, right?

JENNIFER: Absolutely. And it's just different. We forget ChatGPT specifically is a large research project. It's not a tool. You are part of a research project. The tool is when you pay for, like, a private version of any of the AI tools that are trained on your context, your documentation, your processes. That's where the value comes. So if it's free, you should probably distrust it.

JENNIFER: And also think about how bad that is for the earth.

ADRIANA: Yeah, absolutely. I totally agree. Now, in the same vein as ChatGPT, I've heard initiatives from various companies where they want to replace a chunk of their written content with AI-generated content. What are your thoughts around that?

JENNIFER: Okay, so in the world of documentation and things, I think it's very interesting. I think that is...documentation writers are super important, but there's also a lot of companies relying on developers to create docs. And in the 12, 13 years I've been in the industry, I started out a lot in the API space. The number one complaint was that there was not enough documentation. And yes, the number one thing developers don't want to do is write documentation. So having documentation embedded next to the code and somewhat AI-generated I think is very valuable. Human-generated media is another thing. There was a rumor that 95% of media will be generated by AI by 2025 and all.

I think we're having a real backlash about that. I know AI can't do what I can do, and I don't use it that much. I don't really use it. But my understanding, when other people use it and all, is that it's for the low-value content. Can it have a proper conversation with someone, to distill from someone that maybe isn't expressing themselves as easily because maybe they've got a very technical mindset? It can't have that conversation and draw out of them the true value of their product and then translate it.

ADRIANA: Yeah.

JENNIFER: Could it be useful if someone wrote an article themselves and then wanted to spew out a bunch of social posts or something from that article? It could probably be very interesting for that. You just have to be very suspicious and controlling. You have to be anyway. But when you go through all of that, I don't feel my job is going to be in trouble. The people whose jobs are going to be in trouble are people whose lives live in Excel. Things that can and should be automated. The point is that we work on real problems. Boring, low-level coding problems will be automated, like repetitions.

Creative work should get more creative, more problem-solving. But then the boring stuff...I don't know what I could automate that I'd love to automate. Like invoicing, because I tend to procrastinate on that because, again, soy de letras. I'm not good at math, but then I don't trust the systems enough to throw that private information in there.

ADRIANA: Yeah.

JENNIFER: Also, we cannot forget that there's this unbelievable inequality that's being caused by data centers. It is causing a huge environmental impact. In west London alone, affordable housing cannot be built. There can be no new affordable housing in one of the largest cities in the world, one of the alpha cities, because too much power is being taken by data centers.

ADRIANA: Wow.

JENNIFER: To cool them down, et cetera. They're super polluting. Like, it's really bad. Note that I said affordable housing. So rich people who are leaving these plots empty and funneling money, because London's like a huge money laundering area, those are still being built and left empty. But people that truly need homes cannot get homes in west London because of, specifically, data center power. So I think we need to think about how we're impacting the environment. There's very interesting things going on with FinOps and optimizing your Kubernetes clusters, not getting into this habit of buying double the amount of cloud just in case.

And this is where AI is very interesting too, because AI can be a solution to help. It's always better to have the tool manage it than a human manage it, because if a human is responsible, they're always going to give more, just in case. They'll never give less, but they'll always give more. So that's where AI can be a solution or part of the solution. But we should be putting far more pressure on anything we're paying for. As customers, we should be putting pressure on vendors so that they are using data centers that are sustainable.
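
For readers who want a concrete flavor of that right-sizing idea, here is a minimal sketch, assuming a reachable Kubernetes cluster and the official kubernetes Python client. The deployment name, namespace, and recommended values are hypothetical stand-ins for what a FinOps or autoscaling tool would actually produce; the point is adjusting requests to what is needed rather than doubling "just in case."

```python
# A minimal right-sizing sketch (hypothetical names and values), assuming a cluster
# reachable via your local kubeconfig and the official `kubernetes` Python client.
from kubernetes import client, config

# Hypothetical recommendation; in practice this would come from usage metrics,
# e.g. a VPA recommender or a cost tool, not a hard-coded dict.
RECOMMENDED = {"cpu": "250m", "memory": "256Mi"}

def right_size(name: str, namespace: str, container: str) -> None:
    config.load_kube_config()          # uses your local kubeconfig credentials
    apps = client.AppsV1Api()

    # Show what the Deployment currently asks for.
    dep = apps.read_namespaced_deployment(name, namespace)
    for c in dep.spec.template.spec.containers:
        if c.name == container:
            print(f"current requests for {container}: {c.resources.requests}")

    # Strategic-merge patch: only the requests for the named container change.
    patch = {
        "spec": {"template": {"spec": {"containers": [
            {"name": container, "resources": {"requests": RECOMMENDED}}
        ]}}}
    }
    apps.patch_namespaced_deployment(name, namespace, patch)
    print(f"patched {namespace}/{name} requests to {RECOMMENDED}")

if __name__ == "__main__":
    right_size("checkout", "shop", "checkout")  # hypothetical deployment/namespace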

ADRIANA: Yeah, I think we have to sort of move away from this mentality, as you alluded to earlier, of just more and more and more throw more at it, because it's like infinite resources. First of all, it costs money. If that doesn't deter you, which it should, then think about the environmental impact, which is just absolutely mind blowing.

JENNIFER: And then that leads to another impact that disproportionately negatively affects people from underrepresented groups. Whether it's pollution in Virginia, where there's a very underprivileged, very impoverished community that directly...they have hearing problems, they have asthma problems, these are all problems. So yeah, I think in everything we do as tech storytellers, we need to consider the implication beyond the stereotypical developer, and we need to help them think about who will most likely be harmed by this and who will be more likely to be excluded.

ADRIANA: Yeah, I completely agree. When you're writing an article, what inspires you? How do you decide what to write about?

JENNIFER: It's 50/50 now, because I've been writing so much about developer productivity and Platform Engineering. Before, it was DEI, but no one cares in 2023 about DEI. See the numbers. Sadly, diversity, equity, and inclusion is not a priority, so you have to do it surreptitiously, like by who you interview and stuff. You can't just write directly about it. I get reached out to a lot. I also see people's talks or use LinkedIn a lot. So there's all that.

ADRIANA: And then the other thing I want to ask. You said that you do a lot of writing on Platform Engineering. What got you interested in Platform Engineering in the first place?

JENNIFER: Oh, it's really a simplistic thing. I've been writing about and working in the Agile and DevOps space for a really long time. I write about the culture side of tech, and like I said, in 2023, I see it in the data, I see it in traffic and all. Tech isn't even trying to pretend they care about diversity, equity and inclusion anymore. But you know what? Look at it: women, and that's probably the most privileged minority or minoritized group in tech. While women make up between about 22 and 24% of the industry, they were 69% of layoffs. Black startups are not getting funding. I mean, it went from abysmal to a 0.0002 abysmal percentage.

ADRIANA: Wow.

JENNIFER: People like Elon Musk and DHH from Basecamp, they've made it cool publicly to not give a fuck about diversity, equity, inclusion. That means before it was informative...sorry...that means, before it was performative, but now they're not even trying to be performative. So there's that. And there's been a ton of cuts and layoffs. I see those cuts because there are two things. There's the last hired, first fired. So if they only started caring about diversity in the last two years, well, those people are going to be the first cut. They also tend to be in roles like DEI, which were cut across the board.

Accessibility, cut across the board. Marketing, at least perennially, is cut when there are cutbacks, but those tend to be more people from minoritized groups. But on the other hand, what's 2023 been about? A lot about tech layoffs, which means a lot of trying to do more with less. And then on top of that, the code is just getting more and more complex. The cognitive load is more and more extreme. And I think while we...we, not me.

But the tech industry in general doesn't seem to care about diversity, equity, inclusion, accessibility as much anymore, sadly. It does still understand, and I don't know that we can go back...they've tried to return to office so many times, and guess what? People are not happy, they're not productive, they're going to leave. Yes, the upper hand is more with the employers right now, but it is still an employee's market across the board. And there's all these things where companies are realizing what statistics and data and journalism have said for years, that happy workers are more productive. And that doesn't mean massages and ping pong tables or foosball tables. That means actually finding purpose in your work, having visibility, not having, even logically from a nutty corporate standpoint, so many distractions, and all the meetings blew up. So there's all of that. So there's this push for developer productivity because budgets are tighter, people need to make more money, staff is still bigger than it was a year, maybe two years ago. There was this irresponsible, cannibalistic growth for a while there, and this is kind of a correction, but the code has grown in the meantime too.

The cloud native landscape is obscenely complex. So there's this idea that we need to work on developer productivity, which is where Platform Engineering comes in. Instead of being a platform like we've had for...since code has existed. Like, Cisco was making platforms back in the '70s. It was: you do this, you control this. Which for some security stuff is not a bad idea; role-based access control and all that should not be optional. But the majority of the idea of Platform Engineering is that your customers are your developers and you are building a platform as a product, where you are getting feedback from them constantly and you're building just what they need to get better. And then also it comes back to that whole docs problem. What is a huge problem? What is breaking that developer flow, that getting in the zone? It's not being able to find things: googling it, going to Stack Overflow, asking a question on Reddit. Instead you've got this...we haven't even mentioned Copilot yet, but I think that for the developer audience it has the most potential, because it's in where 85% of repos are...in GitHub. So it's about them not context-switching as much, and meetings actually having value, not just having Agile ceremonies.

And then Covid just led to this multiplication of meetings for meetings' sake. So Abby Bangser from Syntasso has my favorite definition of what Platform Engineering is, which is that it's almost like a physical platform you're supporting people on that takes care of the non-differentiating but not unimportant work. So with DevOps, we went through this idea that you build, you test, you maintain, you do all of that, all the way to the cloud, all the way to release and all. But cloud is not differentiating to the average programmer, or specifically to their audience, which would tend to be external users or customers. Security: very important, not differentiating. Testing: very important, not differentiating. Repetitive work should now just be automated, so it doesn't matter anyway.

And it's about...Spotify calls it the Golden Path. I like calling it the Yellow Brick Road, because if your developers wander off, they may go into a poppy field and go down a Reddit rabbit hole. But if Dorothy and them had stayed on the Yellow Brick Road, they would have been a lot faster. If Gandalf had given them the eagles from the start, the book would have been a lot shorter. So why don't we do that? Guess what? If you had asked what Frodo would like...oh, that's a new nerdy euphemism, metaphor, I'm coming up with right now. But I think it works. It would have been a lot shorter movie, a lot shorter movie series, book series, and probably a lot more people wouldn't have died.

So just ask your developers what is frustrating them and then start there.

ADRIANA: Yeah, exactly. And there are so many things that frustrate developers.

JENNIFER: And [inaudible] and searchability are always at the top of that list. They want to know who does what in a company, which again, comes down to collaboration and knowing people across the business. It's a positive thing to learn.

ADRIANA: Yeah, absolutely. And there's another one. I think it came about from a question that you asked on one of the socials, which was something around, what are some of the developer frustrations? And I was thinking back to so many jobs where I started off...and onboarding and setting up a new environment on your machine is like the most fucking irritating experience ever. It's like, why do we have to keep doing the same thing over and over and over again? Why don't we have a streamlined process for setting up our dev environments when we start a new job?

JENNIFER: Why would...yeah, why would you even need to? Why is setting up an environment useful for you to be doing? It's not helping the customer, it's not driving value. So Spotify, being, like, one of them, you know, they created Backstage and open sourced it because they thought it was that important to standardize it in the community, which I like. But by using Backstage, they got their developer onboarding time, which I believe they count as time to ten pull requests...like, that is when you consider someone productive...from 110 days to 20 days. Because you just get people up and running. You give them what they need. You wouldn't give them a laptop and have them install Windows or install Linux or install whatever you want on your laptop. Give them the tool.
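
As an aside, that "time to tenth pull request" onboarding metric is easy to approximate yourself. Below is a rough sketch using the public GitHub search API via requests; the org name, username, and start date are hypothetical, and a real platform team would likely use authenticated queries and pull the start date from an HR system.

```python
# Rough sketch: days from a start date to an author's tenth merged PR in an org,
# via the GitHub search API. Org, author, and date below are hypothetical.
from datetime import datetime, timezone

import requests

def days_to_tenth_pr(org: str, author: str, start: str, token: str | None = None) -> float | None:
    query = f"type:pr is:merged org:{org} author:{author} created:>={start}"
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"

    resp = requests.get(
        "https://api.github.com/search/issues",
        params={"q": query, "sort": "created", "order": "asc", "per_page": 10},
        headers=headers,
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json()["items"]
    if len(items) < 10:
        return None  # not "onboarded" yet, by this definition

    # closed_at is used as an approximation of the merge time.
    tenth = datetime.fromisoformat(items[9]["closed_at"].replace("Z", "+00:00"))
    started = datetime.fromisoformat(start).replace(tzinfo=timezone.utc)
    return (tenth - started).total_seconds() / 86400

if __name__ == "__main__":
    print(days_to_tenth_pr("example-org", "new-hire", "2023-09-01"))  # hypothetical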

ADRIANA: Yeah.

JENNIFER: So just do that for all of the cloud. And then you still give them the option. There will still be your 5% that want to engineer their way around a problem. And that's why you build it with APIs and you let people do their own thing. But maybe you don't need to support their work. They're at their own risk. They're in that poppy field, they're doing their own thing. But you'll support that 95%, and that's okay.
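
To make that "paved path with an escape hatch" idea concrete, here is a minimal sketch of a self-service platform API. Everything in it (the endpoint, field names, and defaults) is hypothetical rather than any particular platform's real interface; the point is golden-path defaults plus an explicit, unsupported opt-out for the 5%.

```python
# A minimal golden-path sketch with FastAPI: sane defaults for the 95%,
# an explicit (and unsupported) escape hatch for the rest. All names are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class EnvironmentRequest(BaseModel):
    team: str
    runtime: str = "python3.12"      # golden-path default
    cpu: str = "500m"                # right-sized defaults, not "double just in case"
    memory: str = "512Mi"
    off_road: bool = False           # the escape hatch: allowed, but unsupported

@app.post("/environments")
def create_environment(req: EnvironmentRequest) -> dict:
    supported = not req.off_road
    # In a real platform this is where you'd call your provisioning backend
    # (Terraform, Crossplane, an internal operator, etc.).
    return {
        "team": req.team,
        "runtime": req.runtime,
        "resources": {"cpu": req.cpu, "memory": req.memory},
        "supported_by_platform_team": supported,
        "note": None if supported else "You're in the poppy field: best effort only.",
    }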

ADRIANA: Yeah. I really love your analogy of the Yellow Brick Road, because it really is all about like, these are your guardrails. It's there to protect you from yourself. Because we like to deviate. Sometimes we're not necessarily aware that that's not a great thing to do.

JENNIFER: And you can still deviate. That's why you, as a Platform Engineer have to make something they want to use. And again, it comes all the way back to that tech storytelling, those early wins, the examples. Just the proof of good work is you need to make something they want to use. And then you have your customers who happen to be internal, probably more annoying, but you have a much tighter feedback loop. So you're going to get more direct feedback all the time. It's a good thing. It can just be probably a bit awkward for some people.

Also, there's the problem that Platform Engineers are engineers, so they think they know best, which is not the point. And you just build something that they want to use, make it easy for them to stay on the path. So even the guardrails, I picture that car cannot really go past those guardrails. Follow the lines.

ADRIANA: Yeah, it's like this is the path with some flexibility in mind, but you only have...

JENNIFER: Fall off the cliff, and that is all you.

ADRIANA: I think that's a perfect analogy. I love that. And the final thing that I wanted to touch upon, and you brought it up a few times, and I think it's actually a very important subject, is DEI, which, as you pointed out...the conversation around it has changed a lot, but the problem still remains. And it's kind of interesting because...

I've had a number of conversations with people over the years, and after you pointed it out, I'm like, yeah, I guess it's kind of unfashionable to like, oh, let's have the panels of underrepresented groups talking about being underrepresented. Then it's like, well, as you said, we have to do it in a sneaky manner. But I think we do have to call it out for what it is because you go to tech conferences and I was a speaker at Observability Day, the co-located event for KubeCon North America, and there were three of us female speakers for all of Observability Day. And I was like, what the hell?

JENNIFER: Could probably guess two of them just by knowing the handful of females or women that have access to that space and who are doing amazing work. But yeah, we don't need VIP bathrooms at tech events, we need representation. It's the only time we would be very happy to queue at bathrooms. Please, tech events.

But like anything...when we're talking about open source, 3 out of what, 20 speakers or something for the co-located day, it's actually not a bad percentage for open source, because open source is around 4% women and non-binary people. Because it's toxic, because it's based on free work, which we do the brunt of anyway.

ADRIANA: So true.

JENNIFER: Women and people of color are far more likely to be doing free voluntary work already, and they don't have time for it. But then you lose the benefits of public code samples, of working with companies that are actually really big companies, like a Google or a Spotify or Atlassian or Amazon Web Services, all these companies that support and provide a lot of open source. But then if you can't go to these events, you can't work on these projects, because you can't do free work. So in open source it's always going to be worse, and that's a huge problem. I believe that open source should be free code, but I don't believe in free labor, and I think that's a huge problem.

ADRIANA: Yeah, absolutely.

JENNIFER: You are a company benefiting from an open source project. You should be investing.

Either find a way to sponsor that project, or hire a staffer that contributes to that project as their deal, as their job, and also focus on both technical and nontechnical contributions. Because again, we're back to documentation. We're back to the other big barrier to entry in open source diversity, which is that everything's in English. So you need people to translate. That's another use case from AI that will probably be very valuable in 18 months.

ADRIANA: Yes, we take it for granted that we're English speakers, so we're like, yeah, of course, no problem. But I do remember, I think it was someone at KubeCon who was saying that they felt so shy about contributing to stuff because English wasn't their native language, and they're incredibly smart, but they just didn't feel confident contributing to open source. And it just...oh, my God.

JENNIFER: Even in other languages, you need to know English too, to be a translator, because it's the de facto language to translate to. But for example, Kubernetes documentation translation, which Divya Mohan runs with someone else...I forget their name, sorry...she has organized it for years, and it's across like 18 languages, or will be soon. Zero are African languages. Zero.

ADRIANA: Oh, wow.

JENNIFER: Only about 2%, maybe 3%, depending on what you see, of open source contributors and users are from Africa, which is about 19% of the world population and likely the geographic area that would most benefit from free and open and secure software, because typically open source is also more secure, more eyeballs, more people involved, et cetera. So it would benefit everyone, like, at an exponential GDP level, but because it's just in English...

ADRIANA: Yeah. And it occurs to me also that even our programming languages...the syntax is in English!

JENNIFER: And doesn't seem like that's going to change. Yeah, no, that is where AI, I think, will be interesting.

ADRIANA: Yeah, it'll be definitely very interesting to see where it goes. Now, as we wrap things up, do you have any final thoughts on where you see this industry, our tech industry, going in the next, say, year?

JENNIFER: That's it. It's a year, year and a half tops, because we're in this transition period where AI is still nascent, but it will very quickly advance and it will be much more useful because it will be context-specific. And I hope it won't be companies like...Telefónica in Spain fired, like, a huge chunk of its customer support reps because it's like, we can just use a chat AI. It's not great. I'm an HSBC customer, and I'm always like, give me a human, give me a human.

ADRIANA: Yes.

JENNIFER: It's not working. The Moby whatever, the chatbot thing...it's not for me. I know a lot of people would rather talk to a bot, definitely, than stay on hold, but it's just not there yet. So we need humans in the loop now more than ever who have that subject matter expertise. We're not there yet, but we then need real humans in the loop feeding back into the AI, whatever it is, explaining to it, because people are still really nascent. But that's also part of the problem.

A lot of companies...this came up in my Spanish class. I started taking a Spanish class for the first time, at the YMCA, and that was our topic, ChatGPT. And I'm like, no, I don't use it. Other people are like, "Yeah, I use it for this and this." But then the Spanish teacher, who kind of identifies as a Luddite, he says he pays for ChatGPT because then he gets the license, then he gets the right to his own content that he could one day sell. And I was like, "I didn't think about that." I thought about it more because a lot of companies don't have generative AI policies yet, which is ridiculous.

Look what happened to Samsung. We're recording this in early December; I think in September, a coder didn't think about it and put like a whole code base into the public, free ChatGPT, feeding a bunch of private information in. And now Samsung's like, no more, no more generative AI, we're done.

ADRIANA: Yeah.

JENNIFER: [inaudible] behind. Instead, every company needs...like law firms, people are using it for stuff at consultancies. But if you don't tell people, like, do not put private information in here, do not put IP in here...or just pay the $20 a month for ChatGPT. I think it's five a month for Copilot, and it's just a much better experience anyway. So pay for your tools and advise people how to use them. I think that's just super important, because I think it's clear that AI is just going to be a part of our lives.

ADRIANA: It is, yeah. And we have to be more mindful of how we're integrating it in our lives.

JENNIFER: Because what is it? Copilot went GA early June [2023]. It's early December now...maybe mid June. By the time of the Octoverse Report, which I think was early November, late October, 92% of developers in the US were using generative AI.

ADRIANA: Damn.

JENNIFER: Developers are testing it out. Like, you can't take this away. They are finding value from it. Yeah, you can't take this away anymore, but you really have to have a policy. And it's shocking how few do, given privacy law in California or GDPR in Europe. I'm shocked we haven't had a big problem. I'm shocked it hasn't been big yet.

ADRIANA: Yeah, it's been sort of...as companies realize that it's important, they'll implement it into their policies, but there's like, no...

JENNIFER: [inaudible] And putting really wild stuff in. I know someone in the journalist space who is much more technologically advanced than I am and not a native English speaker. So they had, about a very nascent new technology, written like a really deep-dive article, evaluating it, explaining it, a tutorial. They had thrown it into public ChatGPT to clean it up. Then they delivered it to the client. Three weeks later, their exact article showed up on one of those clickbait sites.

ADRIANA: Oh my God.

JENNIFER: They can't contact an editor, because...they can't contact a human being, because it's a fake human being, because it's like a clickbait site. But that site had found that this new technology was trending, and they had trained that site on it. They had trained ChatGPT on it. And then it just spit out their article.

ADRIANA: Damn.

JENNIFER: Don't put stuff that's not published or public in a public AI, whether it's Bard, Bing, or ChatGPT; you don't know what's going to happen. Pay for it. If you want to play around with it, maybe. But even playing for fun, it still has an environmental impact that no one seems to care about.

ADRIANA: Yeah, I'm so glad that you're bringing that up, because the more we talk about it, I hope the more it gets into people's brains that we cannot take for granted the things that we use. I mean, even Google, right? The fact that you're googling stuff, I mean, there are servers running things somewhere.

JENNIFER: Google tends towards green energy more than the largest one, AWS. Leslie Miley, who was speaking as himself but does work at Microsoft, gave this wonderful keynote at QCon, just a really impactful talk. And he analogized the growth in AI to the US's, and maybe one of the world's, largest infrastructure projects, which was the interstate road system, which specifically created redlines, which strategically kept people of color from being able to use buses to enter New York City and work. Still to this day, in San Francisco or that area, the Bay Area, where we have all this, which I assume is the most inequitable place in the world, kids are three times more likely to have severe asthma by six years old because of where these roads were built. So this idea...and it's happening again with the access to electricity, the access to data, the pollution, the access to clean water, because that's what's used...water is being used to cool data centers, and it's happening along the same lines and stuff. It has this ability to create this great inequity, and without diverse people and thought on your teams, people aren't considering it. And we know, again, one of those statistics, just like happy developers are more productive ones, more diverse teams are more innovative and profitable, but we've got our masks over our eyes again and we're not thinking. And that's where we are.

So sorry to end on a bummer of a note, but let's think of the...I always come back to: there's a wonderful Agile practice called Consequence Scanning, from Emily Webber and Sam Brown. And I just recommend doing a consequence scan sometimes. It's just simple questions, like: if this scaled, who wouldn't be able to use it? What are the good intentions we weren't thinking about? And what are some negative intentions or consequences that could happen because of this tool? This is one of those things with open source even more, because if you're being truly open source, your code could be used for, I don't know, making another Kiwi Farms or another hate site. A hate farm. That's the consequence of open source. You need to think early on, "Okay, what if someone used this for evil?"

ADRIANA: Yeah.

JENNIFER: Negative consequences or what are the environmental consequences?

ADRIANA: Absolutely. And I think that's really great food for thought. And I hope folks who are listening to this really take it to heart. And next time they use a tool like ChatGPT, they think about the environmental impact, or even when they're using resources on the cloud, they think about these things, because it's so important and we've only got the one planet and time is ticking.

JENNIFER: And don't trust the news. Like, these jobs like mine as a tech storyteller are not going away. We need more people. We need more people explaining in different ways, in different languages and different jargon, so everyone understands what is being built and why and what the consequences are. Because a lot of people are just using...

ADRIANA: Yeah, absolutely. Well, thank you so much, Jennifer, for geeking out with me today. Y'all don't forget to subscribe and be sure to check the show notes for additional resources and to connect with us and our guests on social media. Until next time...

JENNIFER: Peace out and geek out, y'all.

ADRIANA: Geeking Out is hosted and produced by me, Adriana Villela. I also compose and perform the theme music on my trusty clarinet. Geeking Out is also produced by my daughter, Hannah Maxwell, who incidentally designed all of the cool graphics. Be sure to follow us on all the socials by going to bento.me/geekingout.

Episode Transcription

ADRIANA: Hey, y'all, welcome to Geeking Out, the podcast about all geeky aspects of software delivery. DevOps, Observability, reliability, and everything in between. I'm your host Adriana Villela. Coming to you from Toronto, Canada, and geeking out with me today is Jennifer Riggins. Welcome, Jennifer.

JENNIFER: Hi, thank you so much for having me on.

ADRIANA: I'm super excited to have you join me. And where are you calling from today?

JENNIFER: London.

ADRIANA: Awesome. What I'll do is we'll start with some lightning round questions and then I'll get you to talk a little bit about yourself and then we'll go from there. Sound good?

JENNIFER: Great, yeah, sure.

ADRIANA: All right, let's do this. Okay, first question. Are you a lefty or a righty?

JENNIFER: Righty.

ADRIANA: All right, do you prefer iPhone or Android?

JENNIFER: iPhone. Just because it's what I have and it's seamless. It's not a moral choice, but it's a convenience choice.

ADRIANA: That's fair. Next question. Do you prefer Mac, Linux or Windows?

JENNIFER: Mac. Same. Convenience.

ADRIANA: Convenience is always very important. Okay, next question. As a tech journalist, do you lean towards Dev or Ops?

JENNIFER: Oh, Dev. Well, no, that's hard. No, I would say either side. Yeah, because Platform Engineering is all about bridging that gap, isn't it?

ADRIANA: Yeah, that's very true. Exactly. Okay, next question. Do you prefer to consume content through video or text?

JENNIFER: Text for sure. Or audio more than anything. Podcast.

ADRIANA: Yeah, I love me a good podcast. I have like way too many in my queue that I have to get through. Okay, final question. What is your superpower?

JENNIFER: Connecting people, introducing different people that can help people figure out their next step or their next job or people should just know. People. Yeah.

ADRIANA: Awesome. I love that. I think it's so important. I think people really underestimate the power of connection. All right, so we are on to the main event, the meaty bits, if you will. So why don't you share with our audience what you do with TheNewStack?

JENNIFER: Okay. I have been a working writer since uni. I am not a trained journalist. I went for political science and I've been in the tech niche for 12 or 13 years. That includes both the marketing side and journalism side. I'm just a naturally good writer and good at explaining complex topics so that everyone understands, which is good because I'm geek by association, I am nerdy by nature, but I am not technical. So it helps me then help other people understand because everyone should be involved in understanding the future and how it's being built, especially as it gets more pervasive in our bodies. In our homes and our cars and then AI thinking for our behalf, on our behalf, et cetera.

JENNIFER: And I have been writing for various as a freelancer, but with The New Stack for over eight years now, so pretty much their first year. And also I write for LeadDev and other blogs and then have software customers, things like that, helping them do their case studies or explain. I am not interested in funding, not interested in who's appointed CEO, not interested in crypto, not interested in technology precisely. I'm much more interested in the cultural impact of technology and what it's done. So I won't typically write about a new feature unless something extraordinary is about it. But I will write about once that feature is used and how it impacts people's lives, or more feature-driven like thought leadership, things like that.

ADRIANA: Cool. That's awesome. So you mentioned that in university you did not come from a journalism background. So how did you find yourself writing for a living? Like you said, it came naturally. What gave you the first opportunity?

JENNIFER: I've always been a natural writer, but I'm good at writing in that side. "Soy de letras," as you would say in Spanish. Math is how you would say it in English. And I was actually editor of my school newspaper and all, at university, so I was always involved in some way in writing and in helping other people write better things like that. So it's just a natural thing for me. I've always been able to fall back on writing.

ADRIANA: And then how did you find yourself, like writing about technology then?

JENNIFER: What else is there to write about? I think role was through Elance, or whatever it's called now. One of those Upwork, one of those freelance websites, and from there it spiraled. Something I'm good at explaining complicated concepts.

ADRIANA: I think there's not enough emphasis on really being able to distill things in a very approachable manner, right? Especially a lot of docs out there, technical docs are so.

JENNIFER: Complicated and incomplete at the same time.

I think it's the most important thing. Critical thinking and being able to talk across that chasm or chasm between technology and business will be the greatest skill set and is so important, especially in this time of AI, because you need to be able to distinguish the bullshit that the AI we know is giving what, 52% of code generated by Chat GPT is wrong, but Chat GPT is very convincing because it was trained by tech bros, which have great sense of confidence and to sell bullshit. So it doesn't have to tell you when it's wrong. So in this time when we're entering AI and all this productivity mentality and everything, we need to be able to understand, be suspicious of what is working or not. And we also need to understand the business impact. So either side of it, whether it's business needing to understand that wildly expensive cost center of engineering and cloud, or engineering being able to explain and feel connected to that business impact and to understand, so everyone's going to have to explain to themselves. And Kelsey Hightower said at Civo Navigate, an event...he said, we have this weird, maybe it's a corporate throwback, where in tech we're like, I have this great idea, but I'm not done my slides yet, I'm not done my PowerPoint presentation yet. We'll wait to talk about it.

But that's not how things work. People are storytellers. People need to be able to have conversations, even if it's expressing yourself in writing. I don't think it's necessarily very inclusive at all that everyone has to speak on stage or speak, but one-on-one conversations is still going to be a very important thing. And being able to write, even in Slack and be concise, so that's not my strong suit because I write very long features and things like that. But being able to express yourself in a way that everyone understands, because especially with AI, as we get into this interstitial age of prompt engineering, the next maybe two years, it's going to be the subject matter experts that are really important. So you won't need necessarily for everything, a coder. But if it's like building management or security in a building, maybe you need someone that actually has experience in that, who can work and partner with the developer to build something that's actually useful in AI.

ADRIANA: Yeah.

JENNIFER: So they need to talk to each other. And the people that may be deciding, especially with a chat bot, customer support and all, may have zero coding capabilities. So you need to be able to talk and communicate with them. And that's where the benefit from AI will come about. And it's honestly where we're going right now.

ADRIANA: Yeah, I think the interesting thing is AI, in a way, keeps us on our toes because you almost have to be smarter than the AI to be able to pick out the bullshit, right? Because the minute you start trusting the AI and what it produces, that's what gets you in trouble, right?

JENNIFER: Absolutely. And it's just different. We forget Chat GPT specifically is a large research project. It's not a tool. You are part of a research project. The tool is when you pay for like a private version of any of the AI tools that are trained on your context, your documentation, your processes. That's where the value comes. So if it's free, you should probably distrust it.

JENNIFER: And also think about how bad that is for the earth.

ADRIANA: Yeah, absolutely. I totally agree. Now, on the same vein of Chat GPT, I've heard initiatives from various companies where they want to replace a chunk of their written content with AI-generated content. What are your thoughts around that?

JENNIFER: Okay, so in the world of documentation and things, I think it's very interesting. I think that is...documentation writers are super important, but there's also a lot of companies relying on developers to create docs. And in the 12, 13 years I've been in the industry, I started out a lot in the API space. Number one complaint was that there was not enough documentation. Yes, the number one thing developers don't want to do is write documentation. So having documentation embedded next to the code and somewhat AI-generated I think is very valuable. Human-generated media, things like that. There was a rumor 95% of media will be generated by AI by 2025 and all.

I think we're having a real backlash about that. I know AI can't do what I can do, and I don't use it that much. I don't really use it. But my understanding, when other people use it and all, it's for the low value content. Have a proper conversation with someone to distill from someone that maybe isn't as easily expressing themselves because maybe they've got a very technical mindset. It can't have that conversation and draw out of them the true value of their product and then translate it?

ADRIANA: Yeah.

JENNIFER: Could it be useful if someone wrote an article themselves and then wanted to from that article spew out a bunch of social posts or something? It could probably be very interesting for that. Just very suspicious and controlling. You have to be anyway. But when you go through all of that, I don't feel my job is going to be in trouble. The people whose jobs are going to be in trouble are people whose lives live in Excel. Things that can and should be automated. The point is that we work on real problems. Boring, low level-coding problems will be automated, like repetitions.

Creative work should get more creative, more problem solving. But then the boring stuff, I don't know what I could automate. I'd love to automate. Like invoicing, because I tend to procrastinate that because again, soy des letras. I'm not good at math, but then I don't trust the systems to throw that private information in there.

ADRIANA: Yeah.

JENNIFER: Also, we cannot forget that there's this unbelievable inequality that's being caused by data centers. It is causing a huge environmental impact. In west London alone, affordable housing cannot be built. There can be no new affordable housing in one of the largest cities in the world, one of the alpha cities, because too much power is being taken by data centers.

ADRIANA: Wow.

JENNIFER: To cool them down, et cetera. They're super polluting. Like, it's really bad. Note that I said affordable housing. So rich people who are leaving these plots empty and funneling money, because London's like a huge money laundering area, those are still being built and left empty. But people that truly need homes cannot get homes in west London because, specifically data center power. So I think we need to think about how we're impacting the environment. There's very interesting things going on for FinOps and optimizing your Kubernetes clusters, not getting in this habit of being double the amount of cloud just in case, but having things.

And this is where AI is very interesting too, because AI can be a solution to help. It's always better to have the tool manage it than a human manage that, because if a human is responsible, they're always going to give more, just in case. They'll never give less, but they'll always more. So that's where AI can be a solution or part of the solution. But we should be putting far more pressure on anything we're paying for. We should be putting pressure as a customer that they are putting on data centers that are sustainable.

ADRIANA: Yeah, I think we have to sort of move away from this mentality, as you alluded to earlier, of just more and more and more throw more at it, because it's like infinite resources. First of all, it costs money. If that doesn't deter you, which it should, then think about the environmental impact, which is just absolutely mind blowing.

JENNIFER: And then that leads to another impact that disproportionately negatively affects people from underrepresented groups. Whether it's pollution in Virginia, which has a very underprivileged community, very impoverished community in Virginia that are directly...have hearing problems, have asthma problems, these are all problems. So yeah, I think we need to consider, in everything we do as tech storytellers, we need to consider the implication beyond the stereotypical developer, but we need to help them think about who will most likely be harmed by this and who will be more likely to be excluded or what being near.

ADRIANA: Yeah, I completely agree. When you're writing an article, what inspires you? How do you decide what to write about?

JENNIFER: It's 50/50 now because I've been writing so much about developer productivity and Platform Engineering, and, before DEI, but no one cares in 2023 about DEI. See the numbers. Sadly, diversity, equity, and inclusion is not a priority, so you have to do it surreptitiously, like by who you interview and stuff. Can't just write directly about it. I get reached out to a lot. I also see people's talks or use LinkedIn a lot. So there's all that.

ADRIANA: And then the other thing I want to ask. You said that you do a lot of writing on Platform Engineering. What got you interested in Platform Engineering in the first place?

JENNIFER: Oh, it's really a simplistic thing. I've been writing about and working in the Agile and DevOps space for a really long time. I write about culture side of tech, and like I said, in 2023, I see it in the data, I see it in traffic and all. Tech isn't even trying to pretend they care about diversity, equity and inclusion anymore. But you know what? Look at it while women, and that's probably the most privileged, minority or minoritized group in tech. While women make up about between 22 and 24% of the industry, there were 69% of layoffs. Black startups are not getting funding. I mean, it went from abysmal to 0.0002 abysmal percentage.

ADRIANA: Wow.

JENNIFER: People like Elon Musk and DHH from BaseCamp, they've made it cool publicly to not give a fuck about diversity, equity, inclusion. That means before it was informative...sorry...that means, before it was performative, but now they're not even trying to be performative. So there's that. And there's been a ton of cuts and layoffs. I see those cuts because there's two things. There's the last hired, first fired. So if they only started caring about diversity in the last two years, well, those people are going to be first cut. They also tend to be in roles like DEI, which were cut across the board.

Accessibility cut across the board. Marketing, at least perennially, is cut when there's cutbacks, but tend to be more people from minoritized groups. But on the other hand, what's 2023 been about? A lot about tech layoffs, which means a lot of trying to do more with less. And then on top of that, the code is just getting more and more complex. The cognitive load is more and more extreme. And I think while we...we, not me.

But the tech industry in general, doesn't seem to care about diversity, equity, inclusion, accessibility as much anymore, sadly, it does still understand, and I don't know that we can go back to, they've tried to return to office so many times and guess what? People are not happy, they're not productive, they're going to leave. Yes, the hand is more of an employer's market, but is still an employee's market across the board. And there's all these things where companies are realizing what statistics and data and journalism has said for years, that happy workers are more productive. And that doesn't mean massages and ping pong tables or foosball tables. That means actually finding purpose in your work, having visibility, not having even logically, from a nutty corporate standpoint, not having so many distractions and all the meetings blew up. So there's all of that. So there's this push for developer productivity because budgets are tighter, people need to make more money, staff is still bigger than it was a year, maybe two years ago. There was this irresponsible, cannibalistic growth for a while there, and it's kind of a correction, but the code has grown in the meantime too.

The cloud native landscape is obscenely complex. So there's this idea we need to work on developer productivity, which is where Platform Engineering comes in. Instead of being a platform that we've had for... since codes exist. Like Cisco was making platforms back in the '70s. It was, you do this, you control this, which for some security stuff is not a bad idea for role-based access control and all that should not be optional. But the majority of the idea of Platform Engineering is that your customers are your developers and you are building a platform as a product where you are getting feedback from them constantly and you're building just what they need to get better. And then also it comes back to that whole docs problem. What is a huge problem? Who is breaking that developer flow, that getting in the zone is not being able to find things, googling it, going to Stack Overflow, asking a question on Reddit. Instead you've got this...we haven't even mentioned Copilot yet, but I think that for the developer audience has the most potential, because it's in with where 85% of repos are...in GitHub. So it's about them not context-switching as much and meetings actually having value, not having Agile.

And then Covid just led to this multiplication of meetings for meeting. So Abby Bangser from Syntasso has my favorite definition of what Platform Engineering is, which it's almost like a physical platform you're supporting people on that takes care of the not differential but not unimportant work. So with DevOps, we went through this idea that you build, you test, you maintain, you do all of that, all the way to the cloud, all the way to release and all. But cloud is not differential to the average programmer, specifically to their audience, which would tend to be external users or customers. Security, very important, not differential testing. Very important, not differential repetitive work. Now it just should just be automated. So it doesn't matter anyway.

And it's about...Spotify calls it Golden Pathway. I like calling it the Yellow Brick Road because if your developers wander off, they may go in a poppy field and go down a Reddit rabbit hole. But if Dorothy and them had stayed on the Yellow Brick Road, they would have been a lot faster. If Gandalf had given the eagles from the start, the book would have been a lot shorter. So why don't we do that? Guess what? If you had asked what Frodo would like? Oh, that's a new nerdy euphemism I'm coming up with right now, metaphor. But I think it works. Would have been a lot shorter movie, a lot shorter movie series, book series, and probably a lot more people wouldn't have died.

So just ask your developers what is frustrating them and then start there.

ADRIANA: Yeah, exactly. And there are so many things that frustrate developers.

JENNIFER: And [inaudible] and searchability are always at the top of that list. They want to know who does what in a company, which again, comes down to collaboration and knowing people across the business. It's a positive thing to learn.

ADRIANA: Yeah, absolutely. And there's another one. I think it came from a question that you asked on one of the socials, something like, what are some developer frustrations? And I was thinking back to so many jobs I've started...onboarding and setting up a new environment on your machine is like the most fucking irritating experience ever. It's like, why do we have to keep doing the same thing over and over and over again? Why don't we have a streamlined process for setting up our dev environments when we start a new job?

JENNIFER: Why would...yeah, why would you even need to? Why is setting up an environment a useful thing for you to be doing? It's not helping the customer, it's not driving value. So Spotify, you know, they created Backstage and open sourced it because they thought it was that important to standardize it in the community, which I like. And by using Backstage themselves, they got their developer onboarding time, which I believe they measure as time to ten pull requests, like that's when you consider someone productive, from 110 days down to 20 days, because you just get people up and running. You give them what they need. You wouldn't hand them a laptop and have them install Windows or Linux or whatever themselves. Give them the tool.
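That "ten pull requests" figure is an onboarding metric: how long it takes a new engineer to get their tenth pull request merged. A minimal sketch of the arithmetic, assuming a made-up data shape (a start date plus merged-PR dates per engineer) rather than anything Spotify or Backstage actually exposes:

```python
# Hypothetical "time to tenth merged pull request" onboarding metric.
# The data shape (start date plus merged-PR dates) is invented for this sketch.
from datetime import date, timedelta


def days_to_nth_pr(start: date, merged_pr_dates: list[date], n: int = 10) -> int | None:
    """Days from an engineer's start date to their nth merged PR, or None if not reached yet."""
    merged = sorted(d for d in merged_pr_dates if d >= start)
    if len(merged) < n:
        return None
    return (merged[n - 1] - start).days


# Example: a new hire merging a PR every 11 days reaches PR #10 on day 110;
# the platform's job is to pull that number down toward 20.
start = date(2023, 1, 2)
merged = [start + timedelta(days=11 * i) for i in range(1, 11)]
print(days_to_nth_pr(start, merged))  # 110
```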

ADRIANA: Yeah.

JENNIFER: So just do that for all of the cloud. And you still give them the option: there will always be your 5% who want to engineer their way around a problem. That's why you build it with APIs and let people do their own thing. But maybe you don't need to support that work; they're at their own risk. They're off in the poppy field doing their own thing. But you'll support the 95%, and that's okay.
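One way to read that 95/5 split: the platform's API ships paved-road defaults for the 95%, plus an explicit escape hatch that is allowed but visibly unsupported for the 5%. A sketch with invented names and options, not any real platform's interface:

```python
# Hypothetical platform deployment API: paved-road defaults for the 95%,
# an explicit but unsupported escape hatch for the 5%. All names are invented.
from dataclasses import dataclass, field


@dataclass
class DeploymentSpec:
    service: str
    replicas: int = 2                                      # paved-road default
    base_image: str = "platform/python:3.12"               # golden-path base image
    custom_overrides: dict = field(default_factory=dict)   # the escape hatch


def deploy(spec: DeploymentSpec) -> None:
    if spec.custom_overrides:
        # Still possible, but the team owns the risk (and the pager) from here on.
        print(f"WARNING: {spec.service} uses unsupported overrides: {spec.custom_overrides}")
    print(f"Deploying {spec.service}: {spec.replicas} replicas from {spec.base_image}")


deploy(DeploymentSpec(service="checkout-service"))                          # the 95%
deploy(DeploymentSpec(service="ml-batch", custom_overrides={"gpu": True}))  # the 5%
```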

ADRIANA: Yeah. I really love your analogy of the Yellow Brick Road, because it really is all about, like, these are your guardrails. They're there to protect you from yourself. Because we like to deviate, and sometimes we're not necessarily aware that that's not a great thing to do.

JENNIFER: And you can still deviate. That's why you, as a Platform Engineer, have to make something they want to use. And again, it comes all the way back to that tech storytelling: the early wins, the examples, the proof of good work. You need to make something they want to use. Then you have customers who happen to be internal, probably more annoying, but with a much tighter feedback loop, so you're going to get more direct feedback all the time. That's a good thing. It can just be a bit awkward for some people.

Also, there's the problem that Platform Engineers are engineers, so they think they know best, which is not the point. You just build something that developers want to use and make it easy for them to stay on the path. With guardrails, I picture a car that cannot really get past those guardrails. Follow the lines.

ADRIANA: Yeah, it's like this is the path with some flexibility in mind, but you only have...

JENNIFER: Fall off the cliff, and that is all you.

ADRIANA: I think that's a perfect analogy. I love that. And the final thing I wanted to touch upon, and you brought it up a few times, and I think it's actually a very important subject, is DEI. As you pointed out, the conversation around it has changed a lot, but the problem still remains. And it's kind of interesting because...

I've had a number of conversations with people over the years, and after you pointed it out, I'm like, yeah, I guess it's kind of unfashionable now to say, oh, let's have panels of underrepresented groups talking about being underrepresented. Then it's like, well, as you said, we have to do it in a sneaky manner. But I think we do have to call it out for what it is, because you go to tech conferences and...I was a speaker at Observability Day, the co-located event at KubeCon North America, and there were three of us female speakers for all of Observability Day. And I was like, what the hell?

JENNIFER: I could probably guess two of them just by knowing the handful of women who have access to that space and who are doing amazing work. But yeah, we don't need VIP bathrooms at tech events, we need representation. It's the one time we would be very happy to queue for the bathroom. Please, tech events.

But like anything...when we're talking about open source, three out of, what, 20 speakers or something for a co-located day is actually not a bad percentage for open source, because open source is only around 4% women and non-binary people. Because it's toxic, because it's based on free work, which we do the brunt of anyway.

ADRIANA: So true.

JENNIFER: Women and people of color are far more likely to already be doing free voluntary work, and they don't have time for more of it. But then you lose the benefits: the public code samples, working with the really big companies, like a Google or a Spotify or an Atlassian or Amazon Web Services, all these companies that support a lot of open source. If you can't go to these events and you can't work on these projects because you can't do free work, you lose all of that. So open source is always going to be worse on this. I believe open source should be free code, but I don't believe in free labor, and I think that's a huge problem.

ADRIANA: Yeah, absolutely.

JENNIFER: If you are a company benefiting from an open source project, you should be investing in it.

Either find a way to sponsor that project or hire a staffer who contributes to that project as their job, and focus on both technical and non-technical contributions. Because again, we're back to documentation, and we're back to the other big barrier to entry for diversity in open source: everything's in English. So you need people to translate. That's another use case where, in probably 18 months, AI will be very valuable.

ADRIANA: Yes, we take it for granted that we're English speakers, so we're like, yeah, of course, no problem. But I remember, I think it was someone at KubeCon who was saying that they felt so shy about contributing to stuff because English wasn't their native language. They're incredibly smart, but they just didn't feel confident contributing to open source. And it just...oh, my God.

JENNIFER: Even in other languages, you need to know English too to be a translator, because it's the de facto language. But for example, Kubernetes documentation translation, which Divya Mohan has organized for years with someone else, I forget their name, sorry, covers something like 18 languages, or will soon. Zero are African languages. Zero.

ADRIANA: Oh, wow.

JENNIFER: Only about 2%, maybe 3%, depending on what you look at, of open source contributors and users are from Africa, which has about 19% of the world's population and is likely the geographic area that would most benefit from free, open, and secure software, because typically open source is also more secure: more eyeballs, more people involved, et cetera. So it would benefit everyone, like, at an exponential GDP level, but because it's just in English...

ADRIANA: Yeah. And it occurs to me also that even our programming languages...the syntax is in English!

JENNIFER: And it doesn't seem like that's going to change. Yeah, no, that is where AI, I think, will be interesting.

ADRIANA: Yeah, it'll definitely be very interesting to see where it goes. Now, as we wrap things up, do you have any final thoughts on where you see this industry, our tech industry, going in the next, say, year?

JENNIFER: That's it, a year, a year and a half tops, because we're in this transition period where AI is still nascent. But it will advance very quickly, and it will be much more useful because it will be context-specific. And I hope it won't be companies like...Telefónica in Spain fired, like, a huge chunk of its customer support reps because it was like, we can just use a chat AI. It's not great. I'm an HSBC customer, and I'm always like, give me a human, give me a human.

ADRIANA: Yes.

JENNIFER: It's not working. The Moby whatever, the chatbot thing...it's not for me. I know a lot of people would rather talk to a bot than stay on hold, definitely, but it's just not there yet. So we need humans in the loop now more than ever, humans who have that subject matter expertise. We're not there yet, so we need real humans in the loop feeding back into the AI, whatever it is, explaining things to it, because it's all still really nascent. But that's also part of the problem.

A lot of companies...this came up in my Spanish class. I started taking a Spanish class for the first time, at the YMCA, and that was our topic: ChatGPT. And I'm like, no, I don't use it. Other people are like, "Yeah, I use it for this and this." But then the Spanish teacher, who kind of identifies as a Luddite, he says he pays for ChatGPT because then he gets the license, he gets the rights to his own content that he could one day sell. And I was like, "I didn't think about that." And it got me thinking more, because a lot of companies don't have generative AI policies yet, which is ridiculous.

Look what happened to Samsung. We're recording this in early December; I think it was in September that a coder didn't think about it and pasted, like, a whole code base into the public, free ChatGPT, feeding in a bunch of private information. And now Samsung's like, no more, no more generative AI, we're done.

ADRIANA: Yeah.

JENNIFER: [inaudible] behind. Instead, every company needs one, like law firms; people are using it for stuff at consultancies. But you have to tell people: do not put private information in here, do not put IP in here, or just pay the $20 a month for ChatGPT. I think it's five a month for Copilot, and it's just a much better experience anyway. So pay for your tools and advise people how to use them. I think that's just super important, because it's clear that AI is going to be a part of our lives.
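A generative AI policy doesn't have to stay on paper; a team can put a simple guard in front of whatever model it pays for. This is only a hypothetical sketch: the patterns, function names, and the send_to_model placeholder are invented, and real data-loss prevention needs far more than a few regexes:

```python
# Hypothetical guard in front of a public LLM: refuse prompts that look like they
# contain secrets or internal material before anything leaves the company.
# Patterns and names are invented for illustration; real scanning needs much more.
import re

BLOCKLIST_PATTERNS = [
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),            # private keys
    re.compile(r"AKIA[0-9A-Z]{16}"),                                     # AWS-style access key IDs
    re.compile(r"(?i)\b(?:password|api[_-]?key|secret)\s*[:=]\s*\S+"),   # credential assignments
    re.compile(r"(?i)\bconfidential\b|\binternal use only\b"),           # document markings
]


def check_prompt(prompt: str) -> list[str]:
    """Return the patterns a prompt matched; an empty list means it looks safe to send."""
    return [p.pattern for p in BLOCKLIST_PATTERNS if p.search(prompt)]


def send_to_model(prompt: str) -> str:
    """Placeholder for whichever paid, licensed model the company has approved."""
    return "(model response would go here)"


def safe_send(prompt: str) -> str:
    violations = check_prompt(prompt)
    if violations:
        raise ValueError(f"Blocked by AI usage policy; matched: {violations}")
    return send_to_model(prompt)


print(safe_send("Summarize this public blog post about platform engineering."))
# safe_send("api_key = sk-123456")  # would raise ValueError
```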

ADRIANA: It is, yeah. And we have to be more mindful of how we're integrating it into our lives.

JENNIFER: Because what is it? Copilot went GA in early June [2023], and it's early December now...maybe it was mid-June. By the time of the Octoverse Report, which I think was late October or early November, 92% of developers in the US were using generative AI.

ADRIANA: Damn.

JENNIFER: We're testing it out. Like, you can't take this away. They are finding value from it. Yeah, you can't take this away anymore, but you really have to have a policy. And it's shocking how few companies do, with privacy law in California or GDPR in Europe. I'm shocked we haven't had a big problem yet. I'm shocked it hasn't blown up yet.

ADRIANA: Yeah, it's been sort of...as companies realize that it's important, they'll implement it into their policies, but there's like, no...

JENNIFER: [inaudible] And people are putting really wild stuff in. Someone I know in the journalism space, who is much more technologically advanced than I am and not a native English speaker, had written a really deep-dive article on a very nascent new technology, evaluating it, explaining it, a tutorial. They had thrown it into the public ChatGPT to clean it up, then they delivered it to the client. Three weeks later, their exact article showed up on one of those clickbait sites.

ADRIANA: Oh my God.

JENNIFER: They couldn't contact an editor, because...they couldn't contact a human being, because it's a fake human being, because it's a clickbait site. But that site had found that this new technology was trending, and they had trained on it. ChatGPT had been trained on it. And then it just spat out their article.

ADRIANA: Damn.

JENNIFER: Don't put stuff that's not published or public into a public AI, whether it's Bard, Bing, or ChatGPT; you don't know what's going to happen. Pay for it. If you want to play around with it, maybe. But even playing for fun still has an environmental impact that no one seems to care about.

ADRIANA: Yeah, I'm so glad you're bringing that up, because the more we talk about it, the more I hope it gets into people's brains that we cannot take the things we use for granted. I mean, even Google, right? The fact that you're googling stuff...there are servers running things somewhere.

JENNIFER: Google tends towards green energy more than the largest one, AWS, does. Leslie Miley, who was speaking as himself but does work at Microsoft, gave this wonderful keynote at QCon, just a really impactful talk. He analogized the growth in AI to the US interstate highway system, maybe one of the world's largest infrastructure projects, which created redlining and strategically kept people of color from being able to use buses to get into New York City and work. And still to this day, in San Francisco, or that area, the Bay Area, which I assume is the most inequitable place in the world, kids are three times more likely to have severe asthma by six years old because of where those roads were built. And it's happening again with access to electricity, access to data, the pollution, access to clean water, because that's what's used...water is being used to cool data centers, and it's happening along the same lines. It has this ability to create great inequity, and without diverse people and diverse thought on your teams, people aren't considering it. And we know, again, one of those statistics: just like happy developers are more productive, more diverse teams are more innovative and profitable. But we've got our masks over our eyes again and we're not thinking. And that's where we are.

So sorry to end on a bummer of a note, but I always come back to this: there's a wonderful Agile practice called Consequence Scanning, from Emily Webber and Sam Brown. I recommend just doing a consequence scan sometimes. It's just simple questions like: if this scaled, who wouldn't be able to use it? What are the positive consequences we weren't thinking about? And what are some negative intentions or consequences that could happen because of this tool? This matters even more with open source, because if you're being truly open source, your code could be used for, I don't know, making another Kiwi Farms or another hate site, a hate farm. That's a consequence of open source. You need to think early on, "Okay, what if someone used this for evil?"

ADRIANA: Yeah.

JENNIFER: What are the negative consequences, or what are the environmental consequences?

ADRIANA: Absolutely. I think that's really great food for thought, and I hope folks who are listening really take it to heart. Next time they use a tool like ChatGPT, or even when they're using resources in the cloud, I hope they think about the environmental impact, because it's so important. We've only got the one planet, and time is ticking.

JENNIFER: And don't trust the news: jobs like mine as a tech storyteller are not going away. We need more people, more people explaining things in different ways, in different languages and different jargon, so everyone understands what is being built, and why, and what the consequences are. Because right now a lot of people are just using it.

ADRIANA: Yeah, absolutely. Well, thank you so much, Jennifer, for geeking out with me today. Y'all don't forget to subscribe and be sure to check the show notes for additional resources and to connect with us and our guests on social media. Until next time...

JENNIFER: Peace out and geek out, y'all.

ADRIANA: Geeking Out is hosted and produced by me, Adriana Villela. I also compose and perform the theme music on my trusty clarinet. Geeking Out is also produced by my daughter, Hannah Maxwell, who, incidentally, designed all of the cool graphics. Be sure to follow us on all the socials by going to bento.me/geekingout.