Thursday, 09 April 2026

FOSS Weekly #26.15: Rollback in apt, bad USB detection, Glass UI in KDE, Linux Kernel dropping older processor support and more

Linus Torvalds created two of the most widely used tools in modern computing: the Linux kernel and Git.

Git, of course, is a version control system primarily used by programmers.

But Theena makes a strong case that Git and plain text are the best tools a writer can use. Not just for backup, but for building a writing practice that is truly their own.

At its core, the argument is about breaking free from platform dependency, preserving your work for the long term, and treating your body of work as something worth designing around rather than just storing somewhere convenient.

Here are other highlights of this edition of FOSS Weekly:

  • sudo tips and tweaks.
  • Apt's new version has useful features.
  • Opera GX arriving as a gaming browser for Linux.
  • A Linux driver proposal to catch malicious USB devices.
  • And other Linux news, tips, and, of course, memes!

Tired of AI fluff and misinformation in your Google feed? Get real, trusted Linux content. Add It’s FOSS as your preferred source and see our reliable Linux and open-source stories highlighted in your Discover feed and search results.

Add It's FOSS as preferred source on Google (if you use it)

📰 Linux and Open Source News

Not open source software but Opera GX, the gaming-focused Chromium browser that's been on Windows and macOS for years, has finally landed on Linux. Sourav took the early access build for a spin and tested the features it's known for, like GX Control for capping RAM and CPU usage while gaming and GX Cleaner for cleaning up junk data.

The Linux kernel is finally dropping i486 support, queued for Linux 7.1. The first patch removes the relevant Kconfig build options, with a fuller cleanup covering 80 files and over 14,000 lines of legacy code still to follow.

Proton has launched two new things: Proton Workspace, a bundled suite of all their services aimed at businesses looking for a privacy-first alternative to Google Workspace or Microsoft 365, and Proton Meet, an end-to-end encrypted video conferencing tool using the open source MLS protocol.

A proposal has been submitted to the Linux kernel mailing list for a new HID driver called hid-omg-detect that passively monitors USB keyboard-like devices for suspicious behavior.

Another proposal, this one for Fedora, was recently struck down. It sought to move per-user environment variable management from shell RC files into systemd.

Remember the glass UI from the Windows 7 era? KDE is considering bringing back the older classic Oxygen and Air themes. These themes will be optional, of course.

Anthropic, the company behind Claude AI, has donated $1.5 million to the Apache Software Foundation. The donation aims to secure the open source stack AI tools depend on.

🧠 What We’re Thinking About

Firefox has been losing ground for a decade, and Mozilla is trying something new: a built-in VPN and a growing set of AI features. Roland's piece looks at whether either of those is likely to actually work.

Puter, the open source browser-based desktop OS, has added ONLYOFFICE to its app marketplace, giving it a full office suite covering documents, spreadsheets, presentations, and PDF editing.

YOUR support keeps us going, keeps us resisting the established media and big tech, keeps us independent. And it costs less than a McDonald's Happy Meal a month.

Support us via Plus membership and additionally, you:

✅ Get 5 FREE eBooks on Linux, Docker and Bash
✅ Enjoy an ad-free reading experience
✅ Flaunt badges in the comment section and forum
✅ Help create educational Linux materials for everyone

Join It's FOSS Plus

🧮 Linux Tips, Tutorials, and Learnings

Not many people know that the sudo command's behavior can be tweaked as well. Here are a few sudo tweaks.

Tennis is a Zig-written terminal tool that renders CSV files as clean, color-coded tables with solid borders and auto-detected themes.

The APT package manager's latest version, 3.2, adds a rollback feature. Sourav briefly tested it.

📚 Linux eBook bundle (don't miss)

No Starch Press needs no introduction. They have published some of the best books on Linux. And they are running an ebook bundle deal on Humble Bundle.

I highly recommend checking it out and getting the bundle.

Plus, part of your purchase supports Electronic Frontier Foundation (EFF).

👷 AI, Homelab and Hardware Corner

The Linux kernel dropped i486 support and added GD-ROM driver support for the Sega Dreamcast in the same breath.

✨ Apps and Projects Highlights

Hideout is a minimal GTK4/Adwaita desktop app for file encryption and decryption, powered by GnuPG.

📽️ Videos for You

Here are some Linux terminal tricks to save you time.

💡 Quick Handy Tip

You can copy a file in Nautilus by pressing Ctrl+C, then paste it with Ctrl+M to create a symbolic link instead of an actual copy. This is a handy way to create a symlink without ever opening a terminal!
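For comparison, the same result from the terminal is a one-liner with ln. A quick sketch, where notes.txt is just a throwaway file for the demo:

```shell
# Create a throwaway file, then make a symbolic link to it,
# mirroring what Nautilus does with Ctrl+C followed by Ctrl+M.
touch notes.txt
ln -s "$(pwd)/notes.txt" notes-link.txt
readlink notes-link.txt   # prints the absolute path the link points to
```

Using an absolute target path keeps the link valid even if you later move the link itself.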


🎋 Fun in the FOSSverse

In this members-only crossword, you will have to name systemd's ctl commands.

An appropriate meme on the OS-level age verification topic.


🗓️ Tech Trivia: On April 8, 1991, a small team at Sun Microsystems quietly relocated to work in secret on a project codenamed "Oak", a programming language that would eventually be renamed Java and go on to become one of the most widely used languages in the world, powering everything from Android apps to enterprise software.

🧑‍🤝‍🧑 From the Community: A FOSSer is wondering if anyone has ever jailbroken a Kindle for KOReader use.



from It's FOSS https://ift.tt/gKGkRWh
via IFTTT

I Tried Apt Command's New Rollback Feature — Here’s How It Went

APT, or Advanced Package Tool, is the package manager on Debian and its derivatives like Ubuntu, Linux Mint, and elementary OS. On these, if you want to install something, remove it, or update the whole system, you do it via APT.

It has been around for decades, and if you are on a Debian-based distro, then you have almost certainly used it without giving it much thought. That said, it has seen active development in the last couple of years.

We covered the APT 3.0 release this time last year, which kicked off the 3.x series with a colorful new output format, the Solver3 dependency resolver, and a switch from GnuTLS/GnuPG to OpenSSL and Sequoia for cryptographic operations.

The 3.1.x cycle that followed has now closed out with APT 3.2 as the stable release, and it brings some notable changes with it.

What do you get with APT 3.2?

The output of apt --help, showing the version number, a brief description of APT, and a list of the most used commands

The biggest additions with this release are transaction history with rollback support, some new commands, and per-repository package filtering.

APT now keeps a log of every package install, upgrade, and removal. You can view the full list with apt history-list, which shows all past operations with an ID assigned to each. To see exactly what packages were affected in a specific operation, you can use apt history-info <ID>.

From there, apt history-undo <ID> can be used to reverse a specific operation, reinstalling removed packages or removing installed ones as needed. If you undo something mistakenly and want it back, run apt history-redo <ID> to reapply it.

For cases where you want to revert everything back to the state at a particular point, apt history-rollback <ID> does that by undoing all operations that happened after the specified ID. Use this with care, as it makes a permanent change.
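Putting those commands together, a cautious session might look like the sketch below. The transaction ID (4) is just an example, the destructive commands are left commented out so nothing is reverted by accident, and the whole thing is guarded for systems without APT 3.2:

```shell
# Read-only tour of APT's new transaction history (APT 3.2+ required).
if apt history-list >/dev/null 2>&1; then
    apt history-list               # all past operations, each with an ID
    apt history-info 4             # what changed in transaction 4 (example ID)
    # sudo apt history-undo 4      # revert transaction 4
    # sudo apt history-redo 4      # reapply it
    # sudo apt history-rollback 3  # undo everything after transaction 3
else
    echo "history commands unavailable; this needs APT 3.2 or newer"
fi
```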

apt why and apt why-not are another set of new additions that let you trace the dependency chain behind a package. Run apt why <package> and APT will tell you exactly what pulled it onto your system. Run apt why-not <package> and it will tell you why it is not installed.

Similarly, Include and Exclude are two new options that let you limit which packages APT uses from a specific repository. Include restricts a repo to only the packages you specify, and Exclude removes specific packages from a repo entirely.

Solver3, which shipped as opt-in with APT 3.0, is now on by default. It also gains the ability to upgrade packages by source package, so all binaries from the same source are upgraded together.

Additionally, your system will no longer go to sleep while dpkg is running mid-install, and JSONL performance counter logging is also in, though that is mostly useful for developers.

If all of that's got you interested, then you can try APT 3.2 on a Debian Sid installation as I did below, or wait for the Ubuntu 26.04 LTS release, which is reportedly shipping it.

How do you use rollback in APT?

I almost got lost in the labyrinth of Vim, unable to exit.

After installing some new programs using APT, I tested a few commands to see how rollback and redoing transactions worked. First, I ran sudo apt history-list in the terminal and entered my password to authorize the command.

The output was a list of APT transactions that included the preparatory work I had done to switch to Debian Sid from Stable, as well as the two install commands to get Vim and Nala installed.

Next, I ran sudo apt history-info 4, the number being the ID of the transaction, and I was shown all the key details related to it, such as the start/end time, the requesting user, the command used, and the packages changed.

After that, I ran sudo apt history-undo 4 to revert the Vim installation and sudo apt history-redo 4 to restore the installation; both of these commands worked as advertised.

Finally, I tested sudo apt history-rollback 3 to get rid of Nala, and the process was just about the same as before, with me being asked to confirm changes by typing "Y".

When I tried to run apt history-redo for this one, the execution failed as expected.


💬 Do these new additions look useful to you? Can't be bothered? Let me know below!



from It's FOSS https://ift.tt/W1XaUkc
via IFTTT

Tuesday, 07 April 2026

The Linux Kernel is Finally Letting Go of i486 CPU Support

Plenty of CPU architectures have come and gone over the last few decades. The x86 family alone has seen a long line of chips rise to prominence and fade away as newer generations took over.

The i486 is one such chip, and it has held on in the Linux kernel far longer than most people expected. Launched in 1989, it was Intel's successor to the i386.

It was faster, smarter, and arrived right as personal computers were making their way from offices into living rooms. For many people, a 486-powered PC was their first computer.

By the early 1990s, the chip was everywhere. It was so dominant that AMD, Cyrix, and IBM all jumped in with their own compatible versions to grab a slice of the market. Intel kept producing the i486 well past its prime too, with embedded versions rolling off the line until 2007.

Most major platforms dropped i486 support a long time ago. Microsoft's last operating systems to officially support it were Windows 98 and Windows NT 4.0. The Linux kernel, however, has kept the lights on for i486 users well into the 2020s.

But that is now changing. 😅

What's happening?

Back in April 2025, kernel maintainer Ingo Molnár posted an RFC patch series to the Linux Kernel Mailing List, proposing to raise the minimum supported x86-32 CPU. The new floor would require chips with both a Time Stamp Counter (TSC) and CMPXCHG8B (CX8) instruction support.

Anything short of that, including the i486 and some early Pentium variants, would be out.

Prior to that, Linus Torvalds had already made his position clear on the mailing list, saying that:

I really get the feeling that it's time to leave i486 support behind. There's zero real reason for anybody to waste one second of development effort on this kind of issue.

Ingo's RFC had covered a fair amount of ground. The full cleanup would touch 80 files and remove over 14,000 lines of legacy code, including the entire math-emu software floating-point emulation library.

Now, the first of those patches removes the CONFIG_M486, CONFIG_M486SX, and CONFIG_MELAN Kconfig build options. It has been committed and is queued for Linux 7.1. Once it lands, building a Linux kernel image for i486-class hardware will no longer be possible.

Ingo noted in the commit that no mainstream x86 32-bit distribution has shipped an M486=y kernel package in some time, so the real-world impact on active users should be close to zero.
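If you are curious whether your own kernel build is affected, you can grep its config for the options being removed. A small sketch; the config path below is the common Debian/Fedora location and may differ on your distro:

```shell
# Look for the i486-era build options in the running kernel's config.
cfg="/boot/config-$(uname -r)"
if [ -r "$cfg" ]; then
    grep -E '^CONFIG_(M486|M486SX|MELAN)=' "$cfg" \
        || echo "none of the i486-era options are enabled"
else
    echo "no config at $cfg (some distros expose it as /proc/config.gz instead)"
fi
```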

Unsupported but not unusable

If you have an i486 machine tucked away somewhere, it is not suddenly useless. Older kernel releases will continue to run on the hardware just fine.

Yes, those older kernels are not getting security patches. But if you are keeping a decades-old machine around for historical or educational purposes, it will not be your daily driver.

Just keep it off the internet, pair it with an older LTS kernel, and it will do what you need it to do without much fuss.



from It's FOSS https://ift.tt/nozeflx
via IFTTT

Monday, 06 April 2026

I Found A Terminal Tool That Makes CSV Files Look Stunning

You can totally read CSV files in the terminal. After all, they are plain text files. You can use cat and then parse the output with the column command.

Usual way: Displaying csv file in tabular format with cat and column commands

That works. No doubt. But it is hard to scan and certainly not easy to follow.
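To make that concrete, here is the cat-and-column route on a tiny sample file (songs.csv is made up for the demo):

```shell
# The "usual way": align a CSV into columns with column(1) from util-linux.
printf 'title,artist,year\nKashmir,Led Zeppelin,1975\nRoundabout,Yes,1971\n' > songs.csv
column -s, -t < songs.csv   # -s sets the separator, -t builds a table
```

It lines up, but there are no borders, no colors, and wide files quickly turn into a wall of text.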

I came across a tool that made CSV files look surprisingly beautiful in the terminal.

New way: Beautiful colors, table headers and borders

Looks gorgeous, doesn't it? That is the magic of Tennis. No, not the sport, but a terminal tool I recently discovered.

Meet Tennis: CSV file viewing for terminal junkies

Okay, cheesy heading, but these kinds of tools are clearly more suited to people who spend considerable time in the terminal. Normal people would just use an office suite or a simple text editor to view a CSV file.

But a terminal dweller would prefer something that doesn't force them to leave the terminal.

Tennis does exactly that. Written in Zig, it displays CSV files gorgeously in a tabular layout, with plenty of options for customization and styling.

Screenshot shared on Tennis GitHub repo

You don't necessarily need to customize it, as it automatically picks nice colors to match the terminal. As you can see, clean, solid borders and playful colors are visible right upfront.

📋
As you can see in the GitHub repo of Tennis, Claude is mentioned as a contributor. Clearly, the developer has used AI assistance in creating this tool.

Things you can do with Tennis

Let me show you various styling options available in this tool.

Row numbering

You can enable the numbering of rows on Tennis using a simple -n flag at the end of the command:

tennis samplecsv.csv -n
Numbered Tennis CSV file

This can be useful when dealing with larger files, or files where the order becomes relevant.

Adding a title

You can add a title to the printed CSV file on the terminal, with a -t argument, followed by a string that is the title itself:

tennis samplecsv.csv -t "Personal List of Historically Significant Songs"
CSV file with added title

The title is displayed in an extra row on top. Simple enough.

Table width

You can set a maximum width for the entire table (useful if you don't want the CSV file to occupy the full width of the window). To do so, use the -w flag, followed by an integer specifying the maximum number of characters the table should occupy.

tennis samplecsv.csv -w 60
Displaying a CSV file with a maximum table width

As you can see, compared to the previous images, the table has shrunk considerably. Its width is now capped at 60 characters.

Changing the delimiter

The default character that separates values in a CSV file is (obviously) a comma. But sometimes that isn't the case with your file: it could be another character, like a semicolon or a $. It can be pretty much anything, as long as the number of columns is the same for every row. To print a CSV file that uses "+" as the delimiter instead, the command would be:

tennis samplecsv.csv -d +
Tennis for CSV file for a different delimiter

As you can see, the custom delimiter is specified in the command and handled cleanly.

Color modes

By default, as mentioned on the GitHub page, Tennis likes to be colorful. But you can change that with the --color flag. It accepts on, off, or auto (which mostly means on).

tennis samplecsv.csv --color off
Tennis print with colors off

Here's what it looks like with the colors turned off.

Digits after decimal

Sometimes CSV files contain long, high-precision floats with many digits after the decimal point. If you only want to see a few of those digits when printing, use the --digits flag:

tennis samplecsv.csv --digits 3
CSV file with number of digits after decimal limited

As you can see in the CSV file printed with cat, the rating numbers have more than three digits after the decimal point. Specifying --digits 3 makes Tennis truncate them.

Themes

Tennis usually picks the theme from the colors being used in the terminal to gauge if it is a dark or a light theme, but you can change that manually with the --theme flag. Since I have already been using the dark theme, let's see what the light theme looks like:

Tennis light theme

Doesn't look like much at all in a terminal with the dark theme, which means it is indeed working! The accepted values are dark, light and auto (which again, gauges the theme based on your terminal colors).

Vanilla mode

In vanilla mode, all numerical formatting is disabled when printing the CSV file. As you can see in the images above, the year rather annoyingly appears with a comma after the first digit, because Tennis wrongly assumes it is an ordinary number and not a year. But if I use the --vanilla flag:

tennis samplecsv.csv --vanilla
Tennis usage with numerical formatting off

The numerical formatting of the year values is turned off. This works the same way for any other numbers in your CSV file.

Quick commands (the ones you are more likely to use)

Here are the most frequently used options I found in Tennis:

tennis file.csv # basic view
tennis file.csv -n # row numbers
tennis file.csv -t "Title"
tennis file.csv -w 60
tennis file.csv --color off

I tried it on a large file

To check how Tennis handles larger files, I tried it on a CSV file with 10,000 rows. There was no stutter or long processing delay. Performance will obviously vary from system to system, but there doesn't seem to be much of a hiccup even with larger files.

That's just my experience. You are free to explore on your system.

Not everything worked as expected

🚧
Not all the features listed on the GitHub page work.

While Tennis looks impressive, not everything works as advertised yet.

Some features listed on GitHub simply didn’t work in my testing, even after trying multiple installation methods.

For example, there is a --peek flag, which is supposed to give an overview of the entire file, with its size, shape, and other stats. A --zebra flag is supposed to add an extra layer of alternating themed coloring. There are --reverse and --shuffle flags to change the order of rows, and --head and --tail flags to print only the first few or last few rows, respectively. There are still more, but again, unfortunately, they do not work.

Getting started with Tennis

Tennis can be installed in three different ways: building from source (obviously), downloading the executable and placing it in one of the directories in your PATH (the easiest), or using the brew command (which can be easier still if you have Homebrew installed).

The instructions for all are listed here. I suggest getting the tar.gz file from the release page, extracting it and then using the provided executable in the extracted folder.
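If you go the tarball route, the steps are roughly these. The archive name below is hypothetical; use whatever the actual release page offers:

```shell
# Sketch: unpack a release tarball and drop the binary onto your PATH.
# "tennis-x86_64.tar.gz" is a placeholder name, not the real artifact.
archive="tennis-x86_64.tar.gz"
if [ -f "$archive" ]; then
    tar -xzf "$archive"
    install -Dm755 tennis "$HOME/.local/bin/tennis"
    echo "installed to $HOME/.local/bin/tennis"
else
    echo "download the archive from the release page first"
fi
```

Just make sure ~/.local/bin is on your PATH.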

There is no Flatpak or Snap or other packages available for now.

Final thoughts

While the features listed in the help page work really well, not all the features listed on the website do. That discrepancy is a little disappointing, but something we hope gets fixed in the future.

So altogether, it is a good tool for printing your CSV files in an engaging way, to make them more pleasing to look at.

While terminal lovers will find such tools attractive, they can also be helpful when you are reviewing data exported from a script or dealing with CSV files on servers.

If you try Tennis, don't forget to share the experience in the comment section.



from It's FOSS https://ift.tt/7uOWECv
via IFTTT

A New Linux Kernel Driver Wants to Catch Malicious USB Devices in the Act

A patch has been submitted to the Linux kernel mailing list proposing a new HID driver that would passively monitor USB keyboard-like devices and flag the ones that look like they're up to no good.

The driver is called hid-omg-detect, and it was proposed by Zubeyr Almaho.

The way it works is fairly clever. Rather than blocking anything outright, the module sits quietly in the background and scores incoming HID devices based on three signals: keystroke timing entropy, plug-and-type latency, and USB descriptor fingerprinting.

The idea here is that a real human typing on a real keyboard behaves very differently from a device that was purpose-built to inject keystrokes the moment it's plugged in.

If a device's score crosses a configured threshold, the module fires off a kernel warning and points toward USBGuard as a userspace tool to actually do the blocking. Zubeyr adds that the driver itself does not interfere with, delay, or modify any HID input events.
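Since the driver only warns and leaves enforcement to userspace, USBGuard is the piece that would actually block a device. A quick look at its basic workflow, guarded in case it isn't installed (the policy commands need root, so they are shown as comments):

```shell
# USBGuard handles the actual allow/block decisions in userspace.
if command -v usbguard >/dev/null 2>&1; then
    # As root, draft an allow-list from the devices currently attached:
    #   usbguard generate-policy > /etc/usbguard/rules.conf
    # Then inspect devices and their allow/block state:
    usbguard list-devices 2>/dev/null || echo "usbguard daemon not reachable"
else
    echo "usbguard is not installed"
fi
```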

This is already the second revision of the patch. The first pass got feedback on things like global state management and logging inside spinlock-held regions, all of which have been addressed in v2.

Is there a real threat?

The short answer is yes. The proposal explicitly calls out two threats, BadUSB and O.MG; both are worth knowing about.

BadUSB is the broader class of attack that was first disclosed back in 2014 by security researchers. It works by reprogramming the firmware on a USB device to impersonate a keyboard.

The operating system sees it as a perfectly normal input device, trusts it completely, and lets it do whatever its payload tells it to, be it open terminals, download malware, or exfiltrate data.

The O.MG Cable takes the same idea and hides it inside something that looks exactly like a regular USB cable. There's a tiny implant built into the connector that can inject keystrokes, log them, spoof USB identifiers to dodge detection, and be controlled remotely over WiFi.

Neither of these is making the headlines as often as they once did, but that doesn't mean the threat has gone away. Such tools have only gotten more refined and accessible, and malicious actors in 2026 are not getting any less creative or aggressive.

However, there's a big 'but' here (not that one, you pervert). This is only a proposal, and while it looks good on the surface, the kernel maintainers have the final say on whether it makes it into Linux.

Via: Phoronix



from It's FOSS https://ift.tt/Qs0bJjn
via IFTTT

Saturday, 04 April 2026

Can Free VPN and AI Save Firefox From Decline?

It's no secret that Firefox has been steadily losing ground over the past decade or so. Despite efforts to revitalize this once beloved titan of the internet, the market share just hasn't returned, and Mozilla's recent choices haven't been helping the cause. That being said, Mozilla hasn't given up, and after many false starts, it seems like current leadership is ready to give it a go at regaining ground.

The recently introduced built-in Firefox VPN feature is an example of this, as are the (admittedly controversial) AI-powered enhancements shipped in recent releases. But are these enough to give Firefox a real chance to claw its way back to the top, or at least make it relevant enough to survive?

Let's talk about it, and see where things might be headed for our favourite red panda.

Is Firefox really dying?

A screenshot of the latest browser stats from Statcounter
Firefox hasn't been faring well on Statcounter in recent years

Since we’re asking whether Firefox can be resurrected, it shouldn’t come as a shock that, by the numbers, Firefox is not in a particularly good place. Since the launch of Google Chrome, Firefox has gradually, and then more rapidly, fallen from its former position to the point where it now accounts for just 2.29% of global browser market share, according to Statcounter. That’s down from 7.97% in 2016 (which is still quite minimal), a drop of roughly 5.7 percentage points in the last decade alone.

Of course, a low market share does not mean an open-source project is literally “dying”. But Firefox is not just a project. It is also a product, and as a product, it has an incentive not just to exist or survive, but to thrive. Right now, the long-term trend suggests it is doing neither especially well.

What happened to Firefox's popularity anyway?

A screenshot of the about dialog from Firefox 149
Firefox is still getting regular releases despite the falling market share

It’s easy to snigger and say “Chrome happened, heh!” but that wouldn’t do the whole story justice. It’s unfair to say that the resignation of former Mozilla CEO Brendan Eich in 2014 and the subsequent creation of Brave is responsible for Firefox’s decline, even if that episode is sometimes cited as one more nail in the Firefox coffin.

Instead, the reality is a bit more complicated, and it’s worth paying attention to before we answer the questions posed by our overall premise.

For starters, Firefox has reinvented itself a bit too often in a relatively short timeframe, and unfortunately, these reinventions have at times blindsided loyal users. From Australis to Quantum/Photon, and later Proton, Mozilla has seemed to be in a relentless search for a new Firefox aesthetic. On the surface, no pun intended, this may not seem like a big deal, because after all, “a UI is just another coat of paint”, right?

๐Ÿ’ก
Did you know? Firefox is gearing up for yet another interface change. You can learn all about it in our coverage on Firefox Nova.

The problem with change is friction

A photo of a person stuck under a bunch of boxes
Too many changes in a short time can leave users feeling overwhelmed Pexels / cottonbro studio

Every change is another experience for users to get used to, and adjusting to change brings friction. The more change, the more friction; the more friction, the greater the frustration. Eventually, users get tired and move on.

By contrast, Chrome and most of Firefox’s major competitors have remained comparatively stable in their core look and feel over time, which reduces the friction users feel when moving from one version to the next. Furthermore, Firefox lost its legacy extension system and full browser theming in 2017, and before that, the standout Panorama tab groups feature in 2016. You can see the Firefox 57 transition point in Mozilla’s own release notes.

Simply put, Firefox suffers from a war of attrition of its own making. So the question becomes: can its new features heal the scars the old wounds left behind?

Why the new VPN matters, if they get it right

Mozilla VPN

Of all the moves Mozilla has been making in Firefox recently, this one perhaps has the greatest potential to be the sleeper hit Firefox has needed for a long time. After all, Mozilla has long positioned itself as a champion of privacy and security, and Firefox still retains a stronger reputation for privacy than many of its mainstream rivals.

Unlike AI features, which many users may ignore, distrust, or actively avoid, built-in privacy tools solve a problem people already understand.

That said, Mozilla needs to be careful not to make some of the same obvious mistakes that have hurt other browsers in the past. Just as importantly, it needs to resist the temptation to keep this feature restricted to only a select few in the long run.

Don’t give us a glorified proxy

A screenshot from the Opera VPN page
Opera VPN has come under fire in the past for not being a true VPN service

Opera tried this, and to my knowledge, it is still essentially that, despite carrying the name of a VPN. If Mozilla is serious about this effort, then it needs to make sure that what it is calling a VPN actually delivers on what the term implies.

If this is going to matter, it cannot feel like a half-step, a marketing hook, or a dressed-up proxy with a more fashionable label. It needs to be useful, absolutely trustworthy (a very hard sell), and accessible enough that ordinary users can feel the benefit without having to decode the fine print first.

It needs to be for everyone, or it shouldn’t exist at all

A silhouette of 5 person posing in front of a sunset sky while standing on what appears to be a hill
Pexels / Olha Ruskykh

That stance may sound a little hardline, but it is the stance Firefox needs if Mozilla truly intends to make this feature matter on the global stage. A privacy feature cannot meaningfully strengthen Firefox’s position if large parts of the world are excluded from using it.

The world is not limited to the US, UK, Europe, and Canada. It never was. If Mozilla is going to introduce a feature like this, it needs to be available worldwide, or it risks sending the message that a large subset of highly connected users, many of whom also contribute to the open-source technologies that make these features possible, do not matter enough to be included. Mozilla, of all companies, needs to prove that this is not its position.

AI: Not for everyone, but maybe enough for some

A screenshot of the AI Controls in Firefox preferences
The AI settings in Firefox Preferences show Mozilla is leaning heavily towards local solutions

It's important to understand the approach Mozilla is taking here, since this is an area where things often get framed through sensationalism rather than reality. Yes, Mozilla is adding AI features to Firefox, and at a fairly brisk pace. However, these features are still optional, though Mozilla choosing to make them opt-out rather than opt-in might leave a bad taste in some users' mouths. Mozilla’s current AI controls are part of that wider balancing act.

That being said, some users not only won't mind these features, but may sincerely expect them to be present in any modern browser, and be disappointed without them. After all, there's a very real market for the likes of Microsoft's Copilot and Google's Gemini: casual users who aren't too deeply concerned how something works so much as whether they can use it or not.

Striking the balance

A screenshot of the marketing for Firefox, showing the line "Control without complexity" and a number of images and associated points
Mozilla is trying to market Firefox with a more balanced approach, but will it work?

The key here isn't so much about whether Mozilla/Firefox should abandon AI altogether. It's clearly a direction Mozilla is dead set on exploring, even as privacy concerns continue to dominate the conversation. The real trick is to find a way for these features to exist while also doing something genuinely useful.

Poor article summaries and gimmicky integrations are just not going to win many people over, certainly not in the long run. But on-device tools that provide translations, help users conduct better research, navigate their browsing history more intelligently, or just generally get real work done faster without sending their data off into the void? Now that's a story most people can confidently get behind.

That's where Mozilla may have a real opening. Sure, AI isn't likely to be the thing that single-handedly "saves" Firefox, even if done "right". Yet, if it's handled carefully, it could help Firefox feel current, capable, and competitive to the kinds of users who now expect these conveniences to exist.

Counterpoint: What about the competition? Is everyone doing it?

A screenshot of Vivaldi showing the "keep browsing human" announcement post
Vivaldi is known for its bells and whistles. AI isn't one of them

No, and if we're looking at benchmarks of success, this really matters. For example, Vivaldi, the "spiritual successor" to the pre-Chromium-clone Opera, has firmly chosen not to integrate generative AI features into the browser. They've been quite explicit about this stance with their "keep browsing human" messaging.

In a world where it seems every major browser vendor is diving in head-first, this is a bold decision that helps Vivaldi stand apart from a market increasingly saturated by the same talking points and "checklist features" that feel like mere buzzword copycatting. This is also one of the reasons why Firefox forks like Waterfox and others have continued to hold solid, faithful communities.

Truthfully, Firefox has often been chosen because it's not like the crowd: it's not Chrome, it's not a clone (it still uses its own Gecko engine), and it's the one major browser that has historically dared to remain not only independent but substantively different. So while some users won't mind a little assistance here and there, the Firefox faithful may be more likely to be the ones turned off by the "AI everywhere" trend that's taken over the internet. For those users, restraint can be a selling point in itself.

What this means for Firefox

A screenshot from Firefox.com showing "Fast to switch. Easy to settle in."
Mozilla is clearly trying to keep the Firefox brand relevant and alive. Will these new efforts be enough?

What Mozilla is pursuing here is still quite the gamble. They're walking a fine line between the privacy-focused legacy of Firefox and the "assisted future" the world is headed towards. It may look like the right way forward to some, but might very well sound a death knell for others.

Mozilla may believe in striking a balance by keeping these features flexible, optional, and in some cases locally driven. The problem is that balance is hard to achieve, and even harder to effectively communicate.

So Firefox's real challenge isn't just adding new features. It's in convincing people that it still knows where to draw the line. If Mozilla gets that balance right, Firefox may come across as modern without feeling overstuffed. If they get it wrong, it risks alienating users who just wanted a browser with boundaries.

The secret benefit of drawing attention

A photo of a loudspeaker with an orange base, white hand, and white flange with a silver rim, sitting on a lightly coloured stool
"AI", "privacy", and "VPN" sure are great ways to stir up conversation, if that is the aim. (Photo: Pexels / Mikhail Nilov)

It would be remiss of me to close out without addressing the one thing that this new strategy by Mozilla may be most succeeding at: getting us to talk about Firefox again. Sure, not all the talk around Mozilla's recent decisions has been positive, and if we're being fair, they have given us some reasons for pause. However, if there's one thing attention does well, it's getting people to see what all the fuss is about, even if they're otherwise not sold or even all that interested.

Maybe that's what Mozilla is angling for with Firefox after all - and if they can manage to stick the landing, all this increased attention and coverage might just be the key to getting new (and old) users to try this new flavour of Firefox ice cream and find that they like it.

Is it all enough?

A screenshot from firefox.com showing more of the new branding for Firefox
Will the new features keep up with the ambitious branding and fresh energy?

Frankly, it's a bit too early to tell, though trends can often be shifted by the most unexpected winds of change. No one expected Chromebooks to become a success, until they were. At one time, no one saw smartphones coming; now they're everywhere. What drove those trends? Tiny, seemingly innocuous factors and simple, seemingly unimportant features. The same can happen with Firefox and its ambition to recapture its position in the hearts and minds of users around the world. Could the new VPN and other additions, along with cleverly handled AI integration, be the secret sauce that pushes things over the line?

Only time will tell, but maybe, there's a chance this time.



from It's FOSS https://ift.tt/i81Zoge
via IFTTT

Git Isn’t Just for Developers. It Might Be the Best Writing Tool Ever

In 2019, I watched a fellow writer almost lose her life’s work.

We were working in an advertising agency. Like most writers who end up in advertising, we were both secretly working on our novels. One afternoon, after lunch, I noticed her pacing around the office, rifling through her bag, checking every desk. Her irritation quickly turned into panic.

Her pen drive was missing.

Hours later, on the verge of tears, she told us why this particular pen drive mattered: it held the only copy of her manuscript.

My first reaction was disbelief. Only copy?

No emailed draft to herself, no Google Drive or Dropbox, no backup anywhere? The answer was simple: she hadn’t thought about it. Relative tech illiteracy had put an entire novel at the mercy of a misplaced USB stick.

My reaction was part heartbreak, part annoyance, and part dread. That night I sat down to audit my own practice—how I recorded, recalled, and stored my work.

At the time, the source of truth for my fiction was a single folder on Dropbox, with dozens of subdirectories by project. All the manuscripts were .doc or .docx. I took regular backups of that folder, zipped them, and emailed them to myself with dates and times in the subject line. If something went wrong, I could theoretically roll back to a recent version.

On paper, that sounded reasonable. In my body, it felt wrong. I couldn’t articulate why, but I knew “not losing everything” was not the same as “leaving behind a studio that someone else could actually use.”

A few weeks later, on a whim, I decided to relearn programming after almost twenty years. Maybe, I thought, programming in 2019 would be kinder than it had been in 2001.

The first lesson on The Odin Project was on Git.

I went through it expecting boilerplate developer lore and came out with something else: a way to resolve the unease I had been carrying about my writing. Git didn’t just promise safety from catastrophic loss; it offered a way to keep a living, navigable history of my writing. It suggested that my studio didn’t have to be a pile of files.

It could be a time machine instead.

I remember feeling irritated that night: why was Git not being taught to writers?

The Timelessness of Plain Text

Sociologist Kieran Healy wrote a guide for “plain people” on using plain text to produce serious work. He and I are hardly the first non-programmers to come to this realization, and hopefully we won't be the last: plain text is the least glamorous, most important infrastructure upon which I build my work. I use the word infrastructure intentionally: plain text forms the substrate that underlies, connects, and outlives higher-level applications. For people like you and me, whether we are writers or not, choosing to work with plain text is a political choice about memory and power, not a mere nerdy preference about file types.

It has been over six years since I moved all my writing to plain text and Git. Before that, my life’s work sat in one folder, spread over a handful of .doc and .docx files. Now, plain text is the lifeblood of everything I write—a choice to live closer to the infrastructure layer where I retain power over time, interoperability, and preservation. The alternative is renting them from whoever owns the fancy app.

An extract of the writer's git commit history © Theena Kumaragurunathan

Why does this matter?

In my last two columns, I spoke about how Emacs interfaces with my work and how I am using it to write my next novel; put simply, why I choose to work in Emacs in the age of AI tools. None of my Emacs-fu would be possible without plain text and Git sitting underneath.

Most of us are told that platforms will take care of our work. “Save to cloud” is the default. Drafts live in Google Docs, outlines in Notion, images in someone else’s “Photos,” notes in an app that syncs through servers we don’t control. It feels safe because it is convenient. It feels like progress: softer interfaces, smarter features, less friction.

The cost is deliberately obfuscated.

You pay it when the app changes its business model and the export button slips behind a subscription.

You pay it when comments you believed were part of the record are actually trapped inside an interface that will be sunsetted in ten years.

You pay it when a future collaborator has to sign up for a dead service—if that’s even possible—just to open a reference document.

You pay it when your own older drafts become psychologically “far away,” not because you are ashamed of them, but because the path to them runs through expired logins and abandoned software.

A repository of written work hosted entirely on proprietary, cloud‑bound software is a studio that dies when the companies behind it do—or when they decide that their future no longer includes you.

If you want your studio to outlive you, you cannot outsource its memory to platforms that see your work as a data source, a training set, or a metric. You need materials and tools that privilege longevity over lock‑in.

The Studio as a Text Forest

Showing my writing studio built on Git

Plain text works because it is not sexy. It is not “disruptive.” Good. That is precisely why it is so important.

A text file is one of the most durable digital objects we have. It has remained readable, without elaborate translation, across decades of hardware, operating systems, and software ecosystems. It is trivial to convert into other formats: PDF, EPUB, HTML, printed book, subtitles. It compresses well. It plays well with search. It fails gracefully.
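As a sketch of that convertibility: a single Markdown chapter can become a web page or an e-book with one pandoc command each. The file names here are invented for illustration, and the script skips quietly if pandoc isn't installed.

```shell
#!/bin/sh
# Sketch: converting one plain-text (Markdown) chapter into other formats.
# Skip gracefully when pandoc is not available.
command -v pandoc >/dev/null 2>&1 || { echo "pandoc not installed"; exit 0; }
cd "$(mktemp -d)"

# A hypothetical chapter file.
printf '# Chapter One\n\nIt was a dark and stormy night.\n' > chapter-01.md

pandoc -s chapter-01.md -o chapter-01.html   # standalone web page
pandoc chapter-01.md -o chapter-01.epub      # e-book
# pandoc chapter-01.md -o chapter-01.pdf     # PDF output needs a LaTeX engine
```

The same source file feeds every output format; nothing about the draft itself has to change.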

When I began moving my practice into plain text, I was not thinking about posterity. I was thinking about control. I wanted to pick up my work on any machine and carry on. I wanted to stop worrying that an update to a writing app would quietly rearrange my files. I wanted my drafts to be mine, not licensed to me through someone else’s interface.

The result is a studio structured less like a warehouse of finished products and more like a forest of living documents.

Each project—work‑in‑progress novels, screenplays, this very series of essays, research trails—lives in its own directory inside a single mono‑repo for all my writing. Inside each directory are text files that do one thing each: a chapter, a scene, a note, a log of cuts and revisions. The structure is legible at a glance. You don’t need me to draw a diagram or sell you a course. Anyone who knows how to open a folder can navigate it.
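A hypothetical layout might look like this (the names are illustrative, not my actual structure):

```text
writing/                     one mono-repo for all the work
├── novel-wip/
│   ├── chapter-01.md
│   ├── chapter-02.md
│   └── cuts-and-revisions.md
├── screenplay/
│   └── draft-01.md
└── essays/
    └── git-for-writers.md
```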

This is not nostalgia for a simpler computing era. It is about lowering the barrier for future humans—future me, future collaborators, future scholars, future strangers—to enter the work without first having to resurrect my software stack.

Plain text gives us a chance to build archives with the same openness as a box of annotated manuscripts, without the paper slowly turning to dust.

But text alone is not enough. A studio that outlives the writer needs a memory of how the work changed.

Version Control as Time Machine and Conversation

Linus Torvalds probably never intended Git for use by writers. And perhaps that is why I view it as almost possessing magical powers. You see, with Git I can talk to my future self, and my future self can talk to my past self.

In software, version control lets teams collaborate on code without stepping on each other’s toes. In a solo writing practice, it becomes something else: a time machine, a ledger of decisions, a slow, ongoing conversation between different iterations of the writer.

Every time I hit a significant point in a project—adding a chapter, making a painful cut, restructuring a section—I make a commit. I write a short message explaining what I did and why. Over months and years, these messages accumulate into a meta-narrative: not the story itself, but a veritable documentary of how my stories came to be.
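That rhythm can be sketched in a throwaway repository; any shell with Git installed will do, and the file names and commit messages below are invented examples.

```shell
#!/bin/sh
# A writer's commit rhythm, sketched in a temporary repository.
set -e
cd "$(mktemp -d)"
git init -q writing && cd writing
git config user.name "Writer" && git config user.email "writer@example.com"

# First full draft of a chapter: commit it with a human-readable message.
printf 'It was a dark and stormy night.\n' > chapter-01.md
git add chapter-01.md
git commit -qm "Add chapter 1: the storm sequence, first full draft"

# A painful rewrite: the message records what changed and why.
printf 'The rain held its breath.\n' > chapter-01.md
git add chapter-01.md
git commit -qm "Rewrite chapter 1 opening; cut the cliche, keep the weather"

git log --oneline   # the ledger of decisions, newest first
```

Each `git log` entry is one of those messages; scrolling through them is how the "documentary" of the draft gets read back later.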

When I open the log of a book or a long essay, I can scroll through those messages and see the ghost of my own thinking. I see the point where I abandoned a subplot, the week I rewrote an ending three times, the day I split a single swelling document into a modular structure that finally made sense. It is humbling and reassuring in equal measure: it shows me that good writing isn't a result of strokes of inspiration but of sitting down consistently to wrangle my writing brain.

At some point, selected manuscripts from this mono‑repo will be made publicly available under a Creative Commons license.

When that happens, I will not just be publishing a final text. I will be publishing its making. A reader in another part of the world, years from now, will be able to trace how a scene evolved. A young writer will see that the book they admire was once a mess. A collaborator will be able to fork the repo, experiment with adaptations, translations, or critical editions, and perhaps send those changes back.

Version control turns my writing studio into something that can be forked, studied, and extended, not just consumed.

This stands in stark contrast to the way most digital platforms treat creative work today: as a stream of “content” to be scraped, remixed anonymously into generic output, and resurfaced as something merely “like” you. When your drafts live inside a proprietary system, you are not only dependent on that system to access them; you are also feeding an apparatus whose incentives diverge sharply from your own.

A Git repository of plain‑text work, mirrored in places you control, is not magically immune to scraping. Mine has been private from the moment I created it, and it will remain so until I am ready to open parts of it on an instance whose values align with my own. Even then, determined actors can copy anything that is accessible. The point is not perfect protection. The point is to design for humans first: to make the work legible and usable to future people on terms that you have thought about, instead of leaving everything at the mercy of opaque platforms.

Designing for the Long Afterlife

What does it mean, practically, to design a studio that outlives you?

It does not mean embalming your work in an imaginary final state. The texts we now call “classical” did not survive because someone froze them. They survived because people kept copying, translating, annotating, arguing with them. They survived because they were malleable, not because they were pristine.

If I want my work to have any chance at a similar afterlife—not in scale, but in spirit—I need to make it easy for future people to touch it.

For me, that means:

  • The core materials of my work live in plain text, organized in a directory structure that makes sense without me.
  • The history of that work is kept in Git, with commit messages written for humans, not machines.
  • The repositories I want to be accessible are published under licenses that explicitly permit study, remixing, and adaptation.
  • The studio is mirrored in more than one place, including at least one I self‑host, so its existence is not tied to a single company’s fortunes.
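The mirroring step is ordinary Git: one remote per host, and a push to each keeps the full history in more than one place. In this sketch, two local bare repositories stand in for the real hosts (say, a self-hosted server and a second provider); the remote names are my own invention.

```shell
#!/bin/sh
# Mirroring one writing repo to more than one place.
# Two local bare repos stand in for remote hosts in this sketch.
set -e
work=$(mktemp -d)

git init -q --bare "$work/mirror-a.git"   # stand-in for a self-hosted server
git init -q --bare "$work/mirror-b.git"   # stand-in for a second provider

git init -q "$work/writing" && cd "$work/writing"
git config user.name "Writer" && git config user.email "writer@example.com"
printf 'Chapter one.\n' > chapter-01.md
git add . && git commit -qm "Add chapter 1"

# One remote per mirror; pushing to both keeps the history in two places.
git remote add selfhosted "$work/mirror-a.git"
git remote add backup     "$work/mirror-b.git"
git push -q selfhosted HEAD
git push -q backup HEAD
```

If one host disappears, the other mirror still holds every commit, so the studio's existence is never tied to a single company's fortunes.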

Notice what this does not require. It does not forbid me from using GUI tools, publishing platforms, or even proprietary software where necessary. I am not pretending to live in a bunker with only a terminal and a text editor. I am saying that the source of truth for my work is kept somewhere that does not depend on the goodwill of companies for whom my creative life is just another asset.

This is not an overnight migration. It took me years to get from a single Dropbox folder of .docx files to my current setup. The important part was the direction of travel. Every project I started in plain text, every journal I kept as a folder of files instead of a locked‑down app, every book I moved into a Git repo rather than an opaque project bundle, was a step toward a studio that a future human could actually enter.

A Quiet Resistance to Big Tech's Power

We are entering an era where large AI systems are trained on whatever they can scrape. The default fate of most creative work is to be swallowed, blurred, and regurgitated as undifferentiated “content.” It becomes harder to tell where a particular voice begins and the training data ends. As more of the public web fills with machine‑generated sludge, it becomes harder for human readers to find specific, intentional work without passing through the filters of a few large intermediaries.

A self‑hosted, plain‑text, version‑controlled studio will not stop any of this by itself. But it is a form of quiet resistance. And at this point in our collective history, where the same infrastructures that mediate our creative lives are entangled with surveillance, automated propaganda, and the machinery of war, even small acts of refusal matter.

Moving a novel into plain text will not topple a platform. Hosting your own Git server will not end a conflict. But these choices shape who ultimately has their hands on the levers of our personal and collective memories.



from It's FOSS https://ift.tt/8I1jalH
via IFTTT