
Mediha works with her team to find the equilibrium between ethics and innovation while regulating the vastness of the internet (Photo: SooPhye)
The year 1983 marked the beginning of many great things — from the birth of local sports legend Datuk Nicol David to the founding of Malaysian automotive company Proton and, of course, the internet. The global interconnection of computer networks completely transformed our relationship with knowledge and communication, ushering in the Information Age. As quickly as the brilliant novelty of having everything just a click away revolutionised our way of life and facilitated the rapid growth of humanity, so too did this double-edged sword give rise to opportunities for cyberbullying, hate and exploitation.
With governments and lawmakers in a perpetual scramble to catch up with technology, Communications and Multimedia Content Forum CEO Mediha Mahmood is among those leading the charge in the country to help transform attitudes around how we consume and share media.
“Imagine a jungle filled with many creatures: some bright, some colourful, most of them loud. The Content Forum is the Jiminy Cricket in that wilderness,” she explains. The industry forum, registered under the Malaysian Communications and Multimedia Commission (MCMC), consists of broadcasters, advertisers, creators, civic and advocacy groups within the content ecosystem. Together, the group works to establish the best standards that are applicable to content in Malaysia at this time. “We try to complement legislation, but regulation doesn’t just come in a uniform and handcuffs. It also comes from an internal understanding or motivation to do what’s good and not harmful to others.”
Having taken the helm of the self-regulatory society in 2021, this mother of three is setting her sights on fostering long-term positive change in an environment where breakneck innovation continues to sprout more and more ethical ambiguities by the day. She is currently heading up the latest revamp of the forum’s Content Code, a collection of best practices and principles for any and all players within the content landscape.
Into the wild west
Far from the digital and multimedia world, the Kuala Lumpur-born Mediha’s childhood goals had always been poised among the clouds. “I wanted to be a pilot, but I was born a little too early, because at the time, the national carrier did not accept women as pilots. So at age 15, when you were to decide which academic stream you would be in, I had to pivot.”
Several consultations and aptitude tests later, all arrows pointed to pursuing a career in law, where she learnt to view matters through a structured lens of black and white.
“But when I joined Astro and looked at content regulation, I realised there are so many shades of grey — what’s offensive to me might be completely acceptable to you. That perspective allowed me to step into this ‘Wild West’,” she reminisces.
After 11 years and a milestone birthday at the pay-TV provider, Mediha began to question if there was more out there for her, and moved to the digital arm of AirAsia (“I did end up working at an airport, just not as a pilot!”). Eventually, she received a call from the MCMC and Content Forum to fill the leadership position.
“Content at that time had transformed into something completely different from what it was when the forum was first founded in 2002. All content back then was a one-way street. You consumed it as it was given to you via television, film and radio, but now it’s more like a multi-storey superhighway; it’s chaotic! There needed to be a rebranding of sorts, to make the Content Code more relevant,” reflects the CEO, whose first big move was to spearhead the revision of the guidelines.
Originally conceived in 2004, the Code underwent a minor rewrite in 2020, but these amendments fell short of truly encompassing the rapidly evolving scene of the time, something Mediha took into account with the 2022 revision.
“It’s like a living social contract, a playbook of principles for everyone dealing with content, whether you make, share or are part of it. There’s collective ownership of [the Code], and it sets the standards and norms for everyone,” she says.
The crown called content
“Content creator” is becoming an increasingly commonplace occupation. Influencers and key opinion leaders dominate our social feeds, and the sight of someone filming daily vlogs or food reviews with a phone and ring light is nothing out of the ordinary. Yet, for those of us without a background in media or journalism, how well versed are we in what should or should not make its way onto the web?
“The onus used to be on broadcasters and advertisers, who know what is ethical or responsible due to their decades of experience. But the person in the street who’s armed with a mobile device and can spread content pretty far and fast may not be equipped with the same kind of understanding,” says Mediha.
Within the organisation, its 66 registered members are required to comply with the agreed regulations, and can be compounded or fined in the event of a breach. For everyone else, compliance with the Code is purely voluntary, although governmental bodies like the MCMC and police crack down on legal violations.
With so many stakeholders at play, the only true way to know what needs to be included, changed or amended is to speak to everyone involved. During the 2022 revamp, the first draft was put up for “free-for-all” public consultation, where anyone could write in to suggest what they felt was missing. Mediha recalls how, in the midst of the pandemic, the most pressing concern at the time was the very sensitive matter of suicide-related content.
“Previously, we only thought about how the media reported suicides, and there are ways to do it that do not lead to contagion. But the response we received from the public was that we needed to make sure content creators also understood why these rules are in place,” she says.
For example, even if news outlets refrain from recording, passers-by might still attempt to film an incident and, worse, upload it on social media, which could have devastating consequences on those already experiencing suicidal ideation.
In the most recent feedback drive for the current rewrite, migrants and refugees have flagged how issues relating to their communities are being reported, as it can lead to harm against them if done irresponsibly. The current revamp is also motivated by the increasing prevalence of generative artificial intelligence (Gen AI) as well as changes in legislation, including amendments to the Communications and Multimedia Act 1998, Sexual Offences Against Children Act 2017 and new Online Safety Bill 2024.
“For us, the Content Code is not something we’re meant to be building ourselves. The industry might know what’s best, but we need voices from the public, especially because they are subject matter experts in things we are not as knowledgeable about. Rights for women, children and people with disabilities were a particularly big thing in our last revamp,” emphasises the CEO, adding that she aims to review the Code every two years, regardless of whether changes are made.
Shifting tides
Pessimists may be quick to point out the failures of a self-regulatory system: After all, when the internet has historically been exploited for harmful acts and the selfish gain of certain users, how can we expect anyone to truly behave without a stick to keep them in line?
“[Self-regulation] does make the job quite tough. Compelling it goes against the very idea of it being voluntary, right? There is this race between relevance and responsibility. We do our very best with the resources we have to create awareness about [the Code], do advocacy work and collaborate with partners.”
There are three main camps of reaction to the Content Code: those who know the lessons and are willing to change; those who do not know, learn it from the Content Forum and adopt it; and the pockets of people who know what the standards are but do not care. Naturally, the last presents the biggest challenge. “Right now, everyone is competing to post faster and reach further, opting for sensationalism and taking shortcuts. There’s clickbait and ragebait, and sometimes this leads to people choosing to knowingly do the wrong thing.”
Unfortunately, buy-in remains limited for most, as negativity tends to catch on and spread much quicker than positivity. “Good governance doesn’t really make headlines,” Mediha admits.
Still, there is more hope than one might think, as she believes the work done now is part of a continuous effort to sow the seeds of change for the future.
“The only way we can deal with this is through developing a mentality of collective responsibility for all of us, to call people out when they’re not doing things the right way,” she affirms. “To sustain this sort of quiet, thoughtful, consistent advocacy really takes time. We’re focusing more on the long-term, far-reaching impact of making people invested in doing the right thing without having to be compelled to do it.”
Even now, there are small but visible effects. The forum recently published guidelines for ethical reporting and sharing of suicide-related content, part of its response to the careless spreading of visuals and notes that was rampant during the pandemic.
“We have also done a lot of work on social media and on the ground specifically because this [topic] has nothing to do with censorship. It’s about compassion and making sure lives are being saved. We’ve seen the trend — broadcasters no longer show details. It’s trickled down also to, when irresponsible parties share, you see netizens commenting, like, ‘Take this down’, ‘You shouldn’t be showing this’. There’s a collective moral compass people are referring to these days,” she says.
Great change does take time, Mediha acknowledges of the industry players’ gradual adoption of new regulations, but every bit brings us as a society closer to true inclusion, diversity and safety.
“In the last revamp, we included accessibility for persons with disabilities. The standard is, at the very least, news and current affairs programmes should be accessible to everyone, though ideally it would be all content. Initially, there was pushback: For some, that’s seen as an investment, which you have to pay a little extra for. But universal accessibility is a global standard. Now you’ll notice most, if not all, mainstream broadcasters’ news content have accessibility features like sign language interpreters,” she points out.
Great responsibilities
What keeps her fighting the good fight? Well, Mediha shares with a laugh, “This sounds so corny, but it’s knowing that what we’re doing here does lead to positive change and has value.”
There is no predicting what may come tomorrow, but consolidating a core ethos of consideration and compassion is instrumental to the Content Forum’s mission statement.
“Whatever I’m doing might not have an impact you can see happening right now, but it helps to build a culture within this content ecosystem that will continue to be better. I sincerely believe we are all architects of this landscape, and whatever we post, share, engage with or put out there lays the land for us and our kids.”
Being in the loop when it comes to youth culture and the latest goings-on is another joyful part of the job, she adds. Logging on to games like Roblox (“At this ripe old age,” she jokes) or first-person shooters helps her stay apprised of “teen lingo” and what platforms younger generations are using, making it easier for her to relate to families she works with. It also allows her to unwind and bond with the kids. “I’ve got three boys, two of whom are teenagers, so I try to watch or play what they play to get a feel for it. That keeps me up to date — hopefully — and able to see what’s next, and that’s always fun.”
At home, she walks the talk too, applying the safety advice she gives other parents with her own children, including monitoring their devices and establishing clear rules, since self-regulation also extends to ensuring your kids are not exposed to the wrong content. “You can’t have content targeted for families all the time. We always tell parents to check the classifications or categorisations of films. It is their responsibility to vet,” she reminds.
While many guardians are unaware of the harms that access to gaming or instant messaging can have on youths, interest is rising and the Content Forum endeavours to equip them with information through events like the Kids & Family Online Safety Workshop.
“The easiest way for us to lose public trust in content is if you’re pretending there isn’t a problem to talk about. There are so many difficult discussions to have with regard to content,” says Mediha.
Challenging topics like how online gender-based violence disproportionately affects women or the exposure of children to pornography are made even harder to confront in our conservative society, “but they are conversations we need to have”. This mother makes it a priority to talk to her own kids about issues like grooming or harassment in an age-appropriate way, as arming children with such awareness ultimately helps keep them out of danger.
In her experience, “The best way to have these dialogues is to make sure people understand we’re not here to judge or force anyone to do anything. We’re just here to lay the cards down: These are the risks you will be taking if you give your child unfettered access to connected devices. We really believe in debate and dialogue on these things because we want people to understand what we’re doing and why.”
True value
It is no secret that impressionable minds are especially vulnerable to harmful rhetoric, particularly from internet personalities peddling prejudiced ideologies — exposure that can lead to youths exhibiting hateful and even criminal behaviours. Citing the recent case of a Johor teen who was charged with using AI to create and distribute lewd images of his schoolmates, Mediha stresses, “If children are taught to respect girls and bodily autonomy, and to avoid watching content that is misogynistic like Andrew Tate videos, then they are much less likely to do things like that. Technology will evolve, but the basic principles of human decency — honesty, integrity, fairness, accountability, dignity — remain the same.”
These are the essential properties that must be baked into the DNA of the Code, equipping those who adhere to it with the mindfulness and flexibility to tackle the problems of tomorrow. “If people know from the very beginning something is wrong, no matter what tools they have in their hands, they will not do it. How people react to technology really depends on the values they have.”
Case in point: Many of mankind’s inventions garnered backlash and suspicion that we, with hindsight, now deem unwarranted. Detractors of AI are currently being accused of standing in the way of progress. How does one find the equilibrium between ethics and innovation?
For Mediha, the response of the public and lawmakers to AI should heed the lessons of our delayed reaction to regulating the internet. “We were excited about the internet when it was new, but we did not foresee the harms that would come from being hyperconnected. Today, we have AI. Innovation is fine, but we also need to look at how the same harms that came from the internet are going to be amplified much further with AI.
“Freedom ends where harm to others begins. Now is really the time for us, while we’re still excited about AI, to figure out how to prevent problems instead of letting them persist for 20 or 30 years.”
Asked what she would like to see most in our ecosystem, Mediha’s plea is simple —“The understanding that whatever we put out there is going to outlive us”. Taking the time to be thoughtful and considerate with each engagement, big or small as it may be, is the first step to establishing positivity and longevity across media and the internet. “If we all collectively choose to do the right thing, consciously and with self-awareness, that would be great in the long run, and it’s what we’re trying to work towards. Wish us luck!”
This article first appeared on June 2, 2025 in The Edge Malaysia.