LONDON - Last year I became a cyborg. At the time it didn't seem like an auspicious occasion, more a humbling and disorientating experience. But I've become excited about being part-robot.

My journey began when my hearing started to falter, due to a combination of unlucky genetics and too many late nights in loud clubs. By the time I was 30, I was losing scraps of conversation in crowded bars, and trips to the movies were nothing but booms and rumbles. Eventually, I relented and booked an appointment with an audiologist, who recommended I be fitted with hearing aids.

With that decision, I joined the millions of people whose minds, bodies or senses are augmented or replaced by technology, from wireless pacemakers to bionic legs. We live in the age of augmentation, and soon we may all choose to be enhanced in some way. After all, many prosthetic technologies do more than just fill in for our body or mind when it falls short - they now offer the potential to make us "better than human".

When I was fitted with hearing aids, I wondered: could I hack them to give me enhanced listening abilities? I explored this question in a BBC radio documentary this week, and discovered that the answer is far from simple. It turns out I don't actually own my new ears in the way that I thought - and this raises important questions about many other augmenting technologies on the horizon, from retinal implants to bionic arms.

Unlike glasses, which simply focus the world through a lens, hearing aids take a very active role as an augment. They monitor the environment with their tiny microphones, constantly adjusting their output based on what they think is useful sound rather than noise. What I hear is their interpretation of the world around me.

This means these augments offer a very exciting possibility, because I don't have to settle for hearing that is merely as good as an ordinary person's. I could configure my own devices to extend my senses beyond normal abilities. I wouldn't be the first to hope for such a thing, either: the artist Neil Harbisson, for example, has built devices that let him hear what colour sounds like. In my case, by connecting my hearing aids to an internet-linked device such as a smartphone, I could have any conceivable information streamed directly into my ears. Unlike your eyes, which can focus on only one object at a time, your ears are purpose-built to absorb huge amounts of complex information at once. My ears could be aware of the entire world, everything from approaching bad weather to the level of internet traffic around me. With the right app, instead of being hard of hearing, I could be superhuman.

Unfortunately, supercharging my hearing aids is not just challenging, it’s positively forbidden. During one fitting, I asked the technician calibrating them how I could adjust the settings myself, in case I found them too loud or too quiet for a particular environment. “You can’t do that!” he exclaimed with some alarm. “It’s very important they are only set up by a qualified audiologist.”

He needn't have worried too much. Hearing aids are, by design, incredibly resistant to tinkering. Some have a button to switch between modes for different environments. Others - like my current pair - are entirely automated, relegating me to the role of passive listener rather than engaged user.

Because hearing aids have traditionally been designed with elderly customers in mind, the emphasis for manufacturers has been on invisibility and ease of use rather than fine control. All the same, manufacturers take a dim view of users fiddling with their own devices, and it's very difficult for anyone who isn't a certified audiologist to get hold of the specialist programming equipment. Even the peripherals, such as additional microphones or Bluetooth adaptors, tend to come locked down in proprietary formats.