Author: Taylor Lowery

Taylor Lowery is a senior editor at glofiish.com, a technology writer, and a true circuit enthusiast. She works in the tech sector, so she does more than just cover it. Taylor works for a smartphone company during the day, which gives her a firsthand look at how gadgets are designed, manufactured, promoted, and ultimately placed in people's hands.

That insider viewpoint is what makes her writing unique. Taylor makes the technical connections that other writers overlook, whether she's dissecting the silicon architecture of a new flagship chipset, analyzing what a significant Android update means for actual users, or tracking the effects of a new AI model announcement across the mobile industry.

Her editorial focus covers every layer of the current tech stack: smartphone software and hardware, artificial intelligence (from large language models and generative tools to on-device inference), and the broader innovation trends shaping the consumer technology sector. She is especially passionate about the nexus of AI and mobile computing, which she feels is still in its most exciting early stages.

Artificial intelligence in the modern era typically involves enormous machines. Somewhere in northern Sweden or Nevada, windowless data centers are humming. GPU racks are consuming power like a tiny metropolis. It’s a striking picture, but it seems strangely disconnected from the biological brain that quietly resides inside a human skull. Some neuroscientists appear to be troubled by that contrast. Benjamin Cowley, an assistant professor at Cold Spring Harbor Laboratory, and partners from Princeton University and Carnegie Mellon University have adopted a different approach. Rather than constructing ever-larger AI systems, they attempted to compress one until it started to resemble a…


Outside a data center in Northern California on a gloomy morning, the structure appears oddly normal. Beige walls. No windows. A quiet parking lot. But inside, rows of servers hum like an industrial orchestra while cooling fans force air through racks of silicon built to satisfy the world’s expanding demand for artificial intelligence. Investors are starting to grasp the narrative that this quiet infrastructure conveys. For most of the past two years, the AI story has been dominated by impressive software: chatbots that write code, create images, and compose essays. It’s the technology’s poetry. However, something…


People typically react with disbelief when they see a biometric shirt in use for the first time. With its thin athletic fabric and breathable seams, it appears to be an ordinary item that you might pick up before a morning run. However, sensors that silently record each breath, heartbeat, and step are concealed within the weave. It feels oddly personal to watch the data emerge in real time, as if the clothing has learned to pay attention to the body. For years, businesses such as Hexoskin have been testing this concept. Professional athletes and even the Canadian Space Agency use…


For years, passwords have been obsolete. The odd thing is that most people are still unaware of it. IT departments secretly battle them every day in offices all over the world. Reset requests accumulate. On monitors, sticky notes appear. Workers use the same eight-character secret for grocery delivery services, corporate systems, and banking apps. Observing this routine gives the impression that passwords have endured primarily due to habit rather than intelligence. In the background, something else has been developing. The Connectivity Standards Alliance’s early 2026 release of the Aliro standard seems like a watershed. Engineers put it simply as “Matter…


The silence inside a vertical farm is the first thing visitors notice. There is no wind. There is no dirt underfoot. Just rows of lettuce, arranged floor to ceiling like plant shelves in a library, softly glowing under pink LED lights. It’s difficult to avoid feeling a little skeptical when you’re in one of these facilities. Food is growing inside a structure that was formerly used to store auto parts. After all, cities were never meant to produce their own food. Urban life has relied on far-off farmland for most of modern history. Trucks transporting California spinach…


It felt strangely quiet when the Pixel Tablet was first plugged into a monitor and started acting like a tiny desktop computer. No big announcement. No product reveal in a theatrical setting. All it takes is a cable that slides into a USB-C port, a cursor that appears on a big screen, and the Android interface that reorganizes itself into something oddly familiar—windows, taskbar, mouse pointer. It’s difficult to ignore what that moment implies. Google appeared content to keep its worlds apart for years. Android was used on tablets. Chrome OS was used on laptops. Each had a unique hardware…


A calm academic routine abruptly veered toward something unfamiliar on a fall afternoon in October 2023. An email with a blurry picture of what appeared to be a scrap of burned paper was sent to Federica Nicolardi, a papyrologist who studies ancient texts. It was not paper at all. It was a papyrus scroll, carbonized when Mount Vesuvius buried the Roman town of Herculaneum beneath volcanic debris almost two millennia ago. Since the year 79, the scroll had lain silent. Scholars had attempted to unlock these scrolls for centuries, typically with disastrous outcomes. The papyrus had a tendency to crumble like…


Neuromorphic computing is still relatively new outside of the research facilities in California and Zurich. No grandiose product introductions. There are no throngs of people waiting outside electronics shops. Just quiet labs with engineers staring at wafer samples and oscilloscopes, attempting to make silicon behave a bit more like biology. However, researchers are beginning to suspect that something strange is going on. Artificial intelligence has become more potent—sometimes in a startling way—but it is also growing hungry. GPU-filled data centers run nonstop, consuming electricity at a rate that is, to be honest, unsustainable. As this develops, it’s difficult to ignore…


An elderly man is sitting on a recliner in a suburban living room on a calm afternoon. He is holding a smartphone a little farther from his face than most people would. After a brief moment of squinting at the screen, he taps twice with two fingers. Abruptly, the phone starts talking, calmly reading aloud the words that had just appeared on the screen. That may seem odd to a lot of younger users. For him, it makes the difference between continuing to use a smartphone and giving it up entirely. This brief exchange raises the possibility…


The language on some private Facebook groups and Telegram channels becomes strangely technical late at night. Sellers promote “verified labeling accounts,” picture archives, and occasionally even whole datasets that were gathered years ago and then forgotten. The posts may appear to be ordinary internet spam to someone who is not involved in the artificial intelligence sector. However, they allude to something more peculiar within the AI economy. A quiet marketplace for training data. Similar to how factories used to run on coal, the current AI boom is powered by information. Massive amounts of examples are needed for voice assistants, large…
