
A
Hello, everybody. This is Marshall Poe. I'm the founder and editor of the New Books Network. And if you're listening to this, you know that the NBN is the largest academic podcast network in the world. We reach a worldwide audience of 2 million people. You may have a podcast, or you may be thinking about starting one. As you probably know, the challenges are basically of two kinds. One is technical: there are things you have to know in order to get your podcast produced and distributed. The second, and this is the biggest problem, is that you need to get an audience. Building an audience is the hardest thing to do in podcasting today. With this in mind, we at the NBN have started a service called NBN Productions. We help you create your podcast, produce it, distribute it, and host it. Most importantly, we distribute your podcast to the NBN audience. We've done this many times with many academic podcasts, and we would like to help you. If you would be interested in talking to us about how we can help you with your podcast, please contact us. Just go to the front page of the New Books Network and you will see a link to NBN Productions. Click that, fill out the form, and we can talk. Welcome to the New Books Network.
B
Welcome to the New Books Network. I'm your host, Michael Lamagna. Today I'm joined by the author of Algorithmic Saga: Understanding Media, Culture, and Transformation in the AI Age, published in 2025 by ETH Mindscape Publishing. It is clear that as a society we are becoming dependent on digital technology. The digital media we engage with every day is transforming our lives. From the moment we wake up until the time we finish doomscrolling at night, we are constantly engaged with digital content. Digital media is a powerful tool, giving us access to information at our fingertips and the ability to connect with people around the world. However, there is a downside to the current digital media environment. Add to the discussion how artificial intelligence will exacerbate the problems of the current digital media landscape, and having a better understanding of how to effectively navigate this environment becomes essential. Joining me to discuss this book is the author, Dr. Muhammad Atiq, a fellow of the Higher Education Academy. Welcome to the podcast, Muhammad.
C
Thank you so much. Thank you for having me.
B
Before we discuss your book Algorithmic Saga, I was hoping you could tell us a little bit about yourself, your background, and your career path.
C
Sure. First, thank you for having me. I really like the New Books Network. I am from Pakistan, where I did my bachelor's and master's degrees in journalism and mass communication. Then I practiced journalism as a current affairs news producer. I also spent time at Houston Public Media around 11 years ago. Then I traveled to China, where I lived for a long time, and right now I'm in New Zealand. So I have worked in and practiced journalism and studied mass communication. The publications, the reading, the authorship — when you do doctoral studies, you are keen to read authors like Marshall McLuhan from Canada and so many authors from the United States who have worked on this. And the media changed rapidly in the last 20 years. It started from newspapers, then broadcasting, then TV came, social media came, YouTubers came, and podcasts, over the last 20 or 25 years, a couple of decades. That is what drove me fully to write something. I have traveled between continents and worked in different places, watching and practicing how the media shapes things and how quickly we are influenced. Now the age of AI is going to change everything that changed over the last 20 or 25 years. People are afraid of it, a little astonished by it, and also curious to know what is going on, where we will stop, or whether it is a constant change in the technology, and how society will take this change, positively or negatively. So this is my small, curious journey through media studies, journalism, and communications.
B
So you can definitely see what sparked your interest in understanding digital media and the journey it took. So let's jump right into it and think about the digital media landscape, and really the algorithms and the roles that they play. What role does an algorithm play in shaping the news that we read and see every day?
C
Yeah, this is really a good question. We need to understand how marketing companies and advertising work now. We have machines, we have algorithms that help with advertising. For example, whenever we download any application, or open Facebook, Twitter, or any other social media or mobile application, from the marketing point of view the companies need our data to market their products. This is the thing we need to understand: for this, they need to train the algorithms. So they take our information — our age group, our demographics, where we belong, which language we use, which colors we like, and what we are scrolling through on the apps and websites, everything. This is all our information. And consciously or unconsciously, we give it to third parties by agreeing to the rules and terms they present before we download the app, which say they will take our data. We happily give it to them; we do not want to read 20 pages of terms and conditions. So our data gets used by those advertising companies. Then, for example, when a new phone launches, you will see that in different countries the same company offers very different colors. How did they find out that this country is more likely to go for a gold color, and that country is more likely to go with pink or blue or white or black? It is all our data, our likes and dislikes, from when we use our phones, social media, and applications, whatever we are looking at. The algorithms take our history, store it, and sell it to third parties. Even before all this, banks and telephone companies were using such methods: this person has X amount of value, so he can be approached. Are you looking for a good house, or an expensive villa, or an expensive car?
Before, there was telephone marketing, and companies reached people that way. Now the applications and the websites reach their customers through a very soft channel, a soft communication, without you realizing it. But this is also good for us in a way, because it matches what we are looking for. The algorithms then show us very similar applications or similar opportunities. For example, if I'm looking at some books on Amazon, books in the same category will pop up for me after a few visits. The same with a phone I am looking at on Amazon or eBay or Target or somewhere: after that, my applications will show it to me, just like the Facebook algorithm shows us things. This is what I call the algorithm saga: the algorithms play with our choices, and then the recommendations come. From the psychological point of view, whatever you see, you believe; seeing is believing. Whatever we repeatedly see, we incline toward that product or service. We want to click it, and we want to scroll down. Nowadays one person's daily screen time on social media is stretching to seven or eight hours, whether it's a kid, a teen, or a mature adult. We are giving so much time to the screens. This is a rapid change that technology, the algorithms, and machine learning are driving for marketing and publicity. So, consciously or unconsciously, our data is going to be used, and we are happy to go along with it. Here the question comes: is this data going to manipulate our thinking? Is it going to change our reality? Because, you know, ChatGPT came out in the last two or three years, and I have realized people are not thinking; they have finished thinking. They just want to type the question, see what answer comes, and believe it.
They believe the machine and the recommendation are the best, and actually they do look like the best. Being a teacher, I also feel like I'm just a facilitator in the class; it seems there is no more need for a teacher or professor when ChatGPT and other AI tools are there, and actually they are very good. But the problem is that we are not thinking, not thinking deeply: why am I reading this, what is the question, should I go with my own conscience, my own choice, or with the recommendation coming from the AI tools, from ChatGPT or whatever writing tool is there? The dangerous thing is that human beings are getting close to not thinking at all. They are putting everything on the machine: the machine recommends, and I just ask the questions. I'm going on vacation, so the AI tools recommend which clothes I should wear for where I'm going. And it looks like it's 70 to 80% accurate, whether I'm traveling somewhere, or reading something, or deciding which car I should buy in a new country. I even asked for recommendations about New Zealand: this is a kind of island country, tropical weather, more rain here. I asked ChatGPT for recommendations, and its answers were quite correct. It said this is a country isolated from the other continents — the same things I had been reading about this country. These are the kinds of recommendations. But then the question comes about the algorithms: their biases, whether they are right or wrong, and how good their recommendations really are for you. So then critical thinking comes in. The question that comes with this change in technology and algorithms is whether we need to think first, then react, and only then go for the recommendation.
Even some of my friends, including friends in their early teens, are asking ChatGPT about their spouses, their girlfriends and boyfriends. That's really an amazing thing: the machine has some standard criteria and recommends, yes, you should go with this boyfriend or girlfriend. So even our emotional matters are being recommended by the algorithm; the machine is suggesting whether I should leave her or leave him. Those kinds of recommendations are really astonishing, interesting, and curious, in terms of how we are going to see and be in this rapidly changing world — in our emotional settings, official settings, relationships, study, work, and businesses. And somehow they are really great choices; a single mind cannot think through all of this. Whenever ChatGPT's algorithms generate something for you, I feel it is very dense content for a normal, average mind. In a one- or two-page story, you can only remember one or two characters at a time, and this is too much. How can my mind consume all the information it is giving me? And yet I am influenced. But I think whenever new technology, new things come, there is saturation at first. Then after some time we get used to it, and we come back to the basics: okay, we should give some time to our relationships, our friends and family, our bodies, our place in this universe. After the saturation of a technology in our market, in our life, then we think clearly — but not at the moment. At the moment we just want to jump on the fashion; we want to adopt it, to be there, not to miss something. These are the main things going on in our lives, with the machine recommending everything: relationships, business, work settings, traveling, everything.
So I appreciate all of it. Now the new model, GPT-5, has just come from ChatGPT, and they say the wonders are here. I'm curious, and I'm afraid as well, about what is going on. So this is all happening to us.
B
Yeah. One of the points you made was spot on, and that's the idea that we still need to think critically about everything that we're consuming online. Because even though that algorithm is tailored to us, we need to move beyond it and really ask: do I need this product? Should I take that relationship advice? Should I be consuming this news? But one of the problems with the modern digital landscape is that these algorithms are excluding important information. So how can we work to avoid the confirmation bias that's developing right now, where we're fed a steady diet of information we constantly agree with, and make sure that we are thinking critically and consuming multiple perspectives, so that we don't become a society that is so fractured?
C
Yeah, so the question becomes confirmation bias. The machine is recommending you something. I heard that Denmark, a Scandinavian country, has made a law that teenagers under the age of 16 should not use AI tools or take help from them in their academics. I really appreciate that law, if it is implemented at the national or city level, and some other countries are thinking about this too. Confirmation bias means that recommendations are constantly coming to you. Whenever I search on different platforms, recommendations come, and in the US context some of those recommendations are biased. Amazon and many other companies have acknowledged that their algorithms carry biases, even around skin tone, in recommendations involving white and non-white people. So the biases are there. As human beings, we have biases too: in my eating, in my ideology, in my everyday choices. Being human, we live with biases, and they are natural. But when the machine tells you something backed by huge data — that millions of people like to wear the white shirt, or millions of people buy and recommend that specific phone — there is a company behind that recommendation, and I am likely to believe it, because my mind is not critical enough to question the machine with its huge, scientific-looking survey of data. The recommendation even comes for my spouse and relationships: I put in the data and the machine tells me, no, I don't think that spouse is favorable for you in the long run. That kind of thing is tricky. So the confirmation bias is there; the machine and its algorithmic turns are there.
And now many researchers and many companies, in the US and outside it, acknowledge that yes, we do have a problem with our algorithms: the data can be biased, especially around skin tone and colors, and around whatever information was stored earlier, so it can be inclined one way. So then the question comes: how critical are we? Yes, we need some legislation, especially from the teenage point of view, about whatever content they are consuming and how much of it is healthy for them. From the parental point of view: yes, social media accounts now ask whether you are over the age of 18 and when you were born, so the companies are doing something. But as parents we need to be a little careful about the use of technology and how much we use it, to protect our mental health. Whenever we are glued to heavy screen use, with millions of messages and advertisements popping up, our minds run into another phenomenon: information overload, and the fear of missing out. The fear of missing out: everybody is buying things, somebody is having very good vacations, and I am stuck with a very boring assignment. This creates the feeling of fear of missing out, because every second the best of the best is put in front of you, from Instagram, from YouTube advertisements, from TikTok. They want to show you the best shot. Nobody will show you, "I have a pimple on my face, what should I do?" Everybody will show you a glowing face, the best of the best: eating, having vacations, traveling here and there, buying a big villa, earning millions of dollars, and so on. So as human beings we feel that fear of missing out.
And this information overload is again touching the dopamine in our minds. When we get many clicks and likes on our posts, we feel happy. But this happiness is not normal. In a day, you can be happy for a few moments; then you get serious about your job, your work, your cooking, your traveling. But every click, like, and comment is punching your dopamine, and you feel pleasure. Then the next hour you see something astonishing — hunger, or violence on the screen, something not good — and your mood is badly disturbed at that moment. So every second your mind is playing happy, sad, happy, sad, happy, sad. Psychologists and mental health specialists have research on this: heavy screen time and constant engagement with the machine is not healthy for your body, and especially not for your mind, from a psychological point of view. So then the responsibility comes: how much time should I give to the screen, and what is my attachment? If you remember, 20 years ago you would often forget your phone in a restaurant or on a public bus, because it was not part of your life, two or three decades ago. You would lose it and only later realize, oh, I left my mobile in the bus or the restaurant. But now losing the phone is not common. Even before fully waking up, the first thing we want to see is our phone. We don't want to see our spouse in the bed; we want to see our phone first. That is the phenomenon. We don't look first for our child at home; we look for where our phone is. So this is a drastic, dramatic change: on waking up, we are more concerned about where the phone is than where our children are.
These are the dramatic changes that come with the technology we are playing with, and we need to recall ourselves. The phenomenon is what we usually call going back to the basics. We want to give some time to our relationships. We want to give time to our work and our business, our bread and butter. We want to have good sleep as well, and some space for ourselves, like exercise or yoga. We need to make some balance in our lives: seeing our children playing with us, some physical activity, and after one hour of work, some playing or talking. Because if we are more keen to talk with the machine, what happens in the end? Our skill set gets better, but critically, our relationships with human beings get poorer. We don't know how to talk with our next-door neighbor; we don't even know his or her name. So what is going on? Our skill set gets better with the technology — this is very obvious: whenever you give more time to any skill, work, ability, or business, it will get better and better. But if you give less time to human beings, your relationships with human beings will definitely get poorer. These are some of the outcomes of technology, positive and negative.
B
Yeah. And you bring up that excellent point about the fear of missing out, and we can see how social media influencers play into that, especially this new trend of dangerous vacation pictures that people are now taking. But I was thinking, as you were talking about the algorithm and the change to our psychology and how we approach technology: how does privacy fit into this? Oh wow.
C
This is more dangerous, yeah. Privacy is a big concern. My data is taken: what I was eating today, where I was, my location, even which book I was reading. Whatever we post on social media is stored by the application. And when we play with ChatGPT and similar AI tools, they know our email, they know our address, they know our gender, because we are taking their help, and according to that information the AI tools shape the recommendations and the writing they give us. So privacy is a big concern. I still remember that in France, in the mid-90s, they put cameras on the streets, on the traffic lights, and whenever you violated a traffic signal it would take a picture and send it to your home. Then the problem came: okay, the ticket was fine, because I committed a violation and crossed the signal. But the picture showed somebody sitting next to you in the car, maybe a new boyfriend or a new girlfriend. The picture came home, and citizens went to the local courts: our privacy is being disturbed if my companion shows up in the camera and my spouse asks, who is sitting next to you in the car? So the government then changed things in some ways. But now cameras on the traffic lights are everywhere. That was the key example: when information is taken from you, your privacy is taken as well, whether by a social media application, a website, login information, or anything else. People are conscious of it. But the key thing is that whatever I post, I often post to show off my life: I am having a vacation, I bought something. Social media is a kind of show-off; it gives us the opportunity to show off.
Okay, I got my PhD degree, I got my grades, I'm having a vacation, I bought some luxury item. Then the jealousy factor and the fear of missing out pop up in your friends and family members when they see it: oh, my friend or colleague got a promotion, a new company, a new place, a new house, a new car. So this is the situation: your privacy is going out. It is a big question, and there should be some regulation here. When we accept the terms and conditions of the applications, and when we post, we are giving away everything: who I am, what my likes and dislikes are, where I'm living, what I am reading. In a positive way, sharing is also good: a new book, some new opportunities, some information you share constructively. But when we show things off, it also creates fear of missing out in others, and some jealousy. These are very common human behaviors; whenever we see people who are better off than us, financially or in some other way, we feel it. So, coming back to the privacy concerns: privacy concerns are growing, because nothing is hidden anymore. Whether you work in an office or in the private sector, whatever your private life is, you are using your social media and telling where you are traveling, what you are eating, where you are going, what you are buying. There is no privacy anymore, and it becomes dangerous when you put everything in public, everything on social media. From the publicity and marketing-company point of view, they need your information to market their products. Marketing is no longer city to city or country to country; companies want a global market. It is not just quality that matters.
What matters is your customer and where they are. From the marketing company's point of view, they want to secure the customer first and then launch the product. Before, we used to think: we will open the restaurant first and then attract the customers. That has changed very much with the algorithms. First you want to secure your customers — and your readers as well, from the book's point of view. Even publishers ask, who is your reader? We happily say everybody is my reader, and they say, no, you need to be specific: is your reader in tech, or in science fiction, or a technology enthusiast? So the big companies want to secure the customer first, and globally. For that they need your data: where you are, what happens in your daily life, which age group you are, your gender, everything, to protect their markets. So from the privacy point of view, our privacy is no more. Everything is public. Our data is public; everything about us is public now. There is no more privacy on this planet Earth.
B
Well, let's bring in the role of AI then. What role will AI play in the digital media that we're using, and how is it going to change our digital media habits as well?
C
Yeah. In the early years, the first decade of this new century, the focus was more on things like taking pictures and posting them on social media. Now, with AI, the recommendations are becoming more and more pervasive, so we are giving more time to AI. Students are taking help from AI to write their assignments, and it is booming; it's excellent — seemingly no more need for a teacher. Being a teacher, I am afraid for my job as well, among all the other people who are afraid of losing their jobs: assistants, teachers, writers, graphic designers, website developers and designers, and the list goes on. AI is taking their jobs. But I think it will also create some new jobs, for those who can use AI better for better products and services, better marketing, better publicity, from the positive point of view. And, you know, there is so much going on: trainings and workshops on how to learn AI. The trend is also changing: people love to work as entrepreneurs rather than in a 9-to-5 job. People want to do hybrid kinds of work, one day going to the office and the next day working from home. That pattern is also coming, and you can give more time to your friends and family, and have more time for yourself as well. So AI is helping us, dramatically changing so many things, and human beings are really afraid of it: if it takes my job, what will I do? Then I need to learn new skills. So the better response to the AI age is that we need to learn AI as well, so we can train it better and get more benefit from it, for our children as well. And the children's creativity and problem-solving, their own consciousness, their minds, should stay alive — not blindly taking the recommendations.
Even my students copy and paste it and never read it. When I ask a question, I can see clearly that this is not human-written; it's too dense, too stuffed. Teachers can recognize this very easily, and schools and colleges are also applying similarity checks: it should not be more than 15% or 20% in your academic writing. Universities and academic schools are working on this, but it will take some time to decide where we should draw the line, in academia or in our personal lives. From the AI-tools point of view, we need to learn about them. Just as with a horse we need to learn horse riding, and with a computer we need to learn computing, with AI we need to learn the technology to use it best — just like learning to drive a car. And now even driverless cars are coming. I was in China for a long time, and in China you can see taxi services, a kind of Uber, that are driverless, and some trucks are driverless as well. So it is taking our jobs too. But again, the technology can serve us better if we can learn it better.
B
Yeah, yeah, that's such a great point. Now where do you see us going in the near future with digital media and its impact on society?
C
Yeah, it's a great question, because things are changing dramatically. We are giving more time to screens, playing video games. Digital cultures are changing, and I have discussed this in my book. We find communities on sharing platforms; for example, in the US context, Reddit is a very good company: on Reddit you can ask questions, and on Quora you can ask questions. Things in digital media and AI are going to change rapidly. I think we now like to watch movies and shows on our laptops; fewer people like to go to the cinema, though from country to country this culture will be very different. And we are taking help from AI in fashion and design. Everything is changing rapidly, and we need to learn how to use these tools to enhance our skills, whether in graphic design, fashion, or whatever work we do with the technology. Even writers, I have seen, take help from AI in writing books, because of the information we need. But the information we take from AI for books is, so far, too dense; it's not written for a human being. One of my friends from the United States asked me whether AI will replace Hollywood writers as well. I said it definitely will change things, because it is going to handle emotions too, with better dialogue writing. So Hollywood writers need to be careful; the threat is everywhere. And Hollywood is a big giant, actually, and it's a part of our life: movies, going out on the weekends, watching with friends and family, fantasy, action movies, science fiction. That whole landscape is changing.
So people in different work settings are all conscious about how they can improve their work with the help of AI, or feel that they need to learn AI because it is taking their jobs. Even here in New Zealand, I heard from one of my friends who is a practicing lawyer. He said they had a conference meeting about how people can now write better with AI: contract letters, all the terms and conditions, AI can recommend very well. So where is the need for a lawyer or an advocate, except to stand before the judiciary when you come to court? Even lawyers, advocates, and the judiciary are concerned, because you can simply tell an AI tool: please make this contract between company A and company B. Every sector is thinking about it. But from the positive side, society is changing with the changing of the technology; there is some combination of technology and society. Society needs to learn some new skills, in some areas, in a way that does not disturb your natural life, your natural phenomena. For example, enjoying a coffee: when you taste it and remember your past — this shop, this brand, this taste — that is real life. When you enjoy the coffee, when you enjoy the food, when you enjoy reading a book, when you enjoy a movie, these are human feelings. But when you just copy and paste from a tool, yes, you have done the work in a few minutes, but it is empty of emotion. And digital media is all of this: social media, video games, movies, and so much more. So again, at the end, we need to make some balance in how we adopt digitalization, AI, and technology on a daily basis.
B
Yeah, and you do a great job in the book talking about digital media's impact on fashion, art, commerce, and online gaming, as well as its psychological impact. But since we're on a podcast, let me ask you about podcasts. How are podcasts changing public discourse?
C
Yeah, thank you, it's a good question. From the American context, yes, the podcast is still popular there, but podcasting is no longer only about listening. Now, on YouTube, people want to watch long talks. And the good thing is that while you are exercising, playing, cooking, or doing your work, you can learn; you can calmly listen to one person's point of view. Before, podcast culture was only about listening: when you were driving on a long road, you wanted to hear some expert talk. Now it has shifted to YouTube and TikTok, where you can watch hours of a podcast. That is what makes the podcast a good thing: you want to sit for a while and listen to one person's perspective. Compare that with the news, where four or five people are talking and cross-talking over each other, and we do not like it. We want to settle down with things, listening to one point of view and a conversation. Whether it relates to my technology books, or to fashion, or to whatever industry I work in, I want to listen to expert talk. Before, we heard those expert talks only while driving, in some quiet setting, or on the radio. Now they have shifted to social media as well, and we can actually see our experts talking. And you feel really great if somebody invites you, just as you have invited me to talk about my book. I want to share, very consciously and very honestly, what my experience and emotions were while working on it, what I have learned, what I want to deliver, and how I can pass my lessons on to my students, to the public, and to my readers as well. So this has changed dramatically, and it keeps growing. Even the audiobooks.
Now audiobooks, I think, have overtaken printed books in some countries, though in other settings people still love to read a hard copy; it will differ from country to country. Recently audiobooks have become more and more popular, and people love to listen to them as well. So podcast culture reflects the same desire: I want to listen in a deep way. Technology now makes it easy: while walking or driving, we can listen to books, we can read books, and we can listen to talks. Because of the convenience of the technology, podcast culture keeps growing as well.
B
Excellent, excellent. Now I want to talk about the structure of your book. At the end of each chapter, you offer takeaway messages as well as some excellent questions to guide discussion. Do you see this book as something that could be adopted in the classroom?
C
Yeah, this is my dream, that my book should reach libraries and classrooms and that people ask different questions about it. I designed it this way so that when students read a chapter, they will critically examine it. When you read the questions, you want to explore more, and when you are curious about something, you learn better; your curiosity will lead you somewhere. Whether you are in a classroom, at work, sitting in a library, or browsing a bookstore, a question always nags at you to go and find an answer. So it would be a great idea if good books were adopted in classrooms and explored in different settings where people have questions to debate. When a society gives you the option of debating questions, it means it is a healthy society: you are engaging in dialogue, whether about technology, AI, or anything else. Dialogue built on questions is a happy beginning; it shows a living society when people have questions and listen to each other. So yes, I will be happy for my book to be explored in different ways, whether in the classroom or in libraries.
B
I think the way you structured each chapter lends itself nicely to a classroom environment. So, Muhammad, I've taken up a lot of your time, and as we wrap up our conversation today, I was wondering: now that the book is out, what are you working on next? What kind of projects are you engaged in? Will you continue to explore digital media, or is your research heading in a new direction?
C
Yeah, I'm working on a second edition of this book. Why was I pushed to work on a second edition? Because the technology is changing so fast that my book will get outdated very rapidly. Some of my colleagues and fellow authors recommended that I revise it and work on a second edition. So now I'm working on it, because ChatGPT-5 has come out: I am outdated, and you are outdated as well. Everything is becoming sharply outdated. The second edition will come soon, and I will let you know. But at the end, the message I want to give is that we need to keep our lives in balance. We should use technology to learn our skills, to support our business, and to polish the way we earn our daily bread and butter. But we also need to keep balance with our friends and family, with our health, and with our sleep. These are the four areas where we need to balance with technology in order to build, and to prove ourselves, a healthy society.
B
Well, I'm excited for that second edition, especially with all the changes to generative AI. And I absolutely agree there needs to be a balance between technology and our actual real lives. Muhammad, I want to thank you so much for joining us today to talk about your book. It's been an absolute pleasure.
C
Thank you so much. Thank you for having me. And I really like the New Books Network; it is expanding, and you guys are doing a really great job. So I really appreciate that as well.
B
Oh, thank you so much. Well, I'm your host, Dr. Michael LaMagna, and thank you for listening to the New Books Network.
Podcast: New Books Network
Host: Dr. Michael LaMagna
Guest: Dr. Muhammad Atique, author of Algorithmic Saga: Understanding Media, Culture, and Transformation in the AI Age
Publication: Atique Mindscape Publishing, 2024
Date: November 1, 2025
The episode centers on the profound impact digital media, algorithms, and artificial intelligence (AI) have on our daily lives, culture, and social fabric, as explored in Dr. Muhammad Atique’s new book, Algorithmic Saga. Host Dr. Michael LaMagna and Dr. Atique discuss how algorithms shape personal choices, media consumption, social behavior, psychological health, and privacy, offering insight into both the promises and perils of the AI-driven era.
“Now the age of AI is going to change everything that we changed last 20, 25 years. So people are afraid about it and... a little astonished... curious...where we will stop or either it's a constant change in the technology and how the society will take this change...” (03:34, Dr. Atique)
“...psychologically, whatever you see, you believe it…we are giving so much time to the screens. So this is a rapid change…unconsciously or consciously our data is going to be used and we are happy to do with this.” (07:55, Dr. Atique)
“Even some of my friends...they also asking about their spouses and...with ChatGPT. So...the machine have some standard criteria and they recommended you...even our emotional things is recommended by the algorithm or the machine...” (10:45, Dr. Atique)
“The confirmation bias is there…now...many companies...recommend that yes, we do have the problem with our algorithms…and the data…it can be biased especially from choosing the skin tone, the colors...” (16:20, Dr. Atique)
“Our privacy is no more privacy. Everything is public. It's just like the public public say that…Your data is the public public. So our everything is public now.” (28:53, Dr. Atique)
“No more need of a teacher. Okay. And being a teacher, I am afraid with my job as well.” (29:40, Dr. Atique)
“Society need to learn some new skill in somehow, in some part that should not disturb your natural life, your natural phenomena. For example, enjoying a coffee... the real life.” (34:38, Dr. Atique)
“The good thing is that while you are...doing exercise, playing or cooking...you can learn, you can listen to a one person point of view calmly…So because convenience of the technology, this is becoming more and more the podcast culture as well.” (38:01, Dr. Atique)
This episode offers a lucid and wide-ranging exploration of how algorithms and AI are revolutionizing media, social behavior, and even personal identity. Dr. Atique emphasizes a critical, thoughtful embrace of technology: leveraging its benefits while guarding against its excesses, biases, and intrusions on both individuality and society. The overarching message is the necessity of balance—between digital innovation and authentic, human-centered living.