AI Will Not Eat You

Whether you have sought it out or not, I imagine you have seen plenty of "AI bloggers selling courses online." The most common narrative is, "This is the era of AI. If you don't learn now, you will be left behind," as if the world will end tomorrow if you don't buy their course today. Does this feel at all familiar to you, the reader?

Let's try this: "Don't let your child lose at the starting line." "First grade is the most important year." "If they fall behind in second grade, they'll never catch up." (Please mentally fill in the rest for third through sixth grade.) "The transition to middle school is a crucial moment in life." (Please continue to fill this in all the way to the college entrance exams.) Once in college, you'll be asked, "How can you sleep at your age?" And after starting work, someone else will say, "If you don't learn how to use AI, you're finished!"

Wow. Cool.

The Next Step in Education

With the emergence of the DeepSeek R1 model, many of my previous assertions about large language models have been overturned. For instance, the development paradigm I used in "Alice Run" has already shifted.

Previously, when I presented a development request to a large language model, I had to trim my code, carefully extracting and laying out the core of the problem for the model to process. Now, I only need to paste several pages of code related to the business logic all at once, and the model can reference the relevant implementations on its own to complete the necessary development work. When using a relatively complex language like Rust, the model makes almost no mistakes. The few errors that do occur require only one or two simple corrections to produce highly usable results.

While marveling at the tremendous impact open-source models are having on this generation of technology, as a writer in the field of education I feel this is a good opportunity to discuss how this technological leap will influence education, and what changes we need to make in response.

What Should the Zune Player Look Like in 2024?

This is a topic I have discussed at length with NovaDNG. We have debated and fantasized about it many times. Just a few months ago, we decided to turn those wild ideas into reality: Rune, a project aimed at recreating the essence of Zune with a modern technology stack, was kicked off.

In fact, I've attempted to develop a player more than once, but my limited technical skills and overly picky technical taste led to two or three abandoned projects. However, with the emergence of powerful models like GPT and Claude, building a "space rocket" is no longer an unattainable dream. If you've read the development report of Alice Run!, you should have an idea of how astonishingly well these large language models can solve development problems.

Riding this wave, I embarked on a nearly frenetic development journey over the past few months.

Alice Run Project Report: An Exploration of Digital Health and Motion-Based Media

During the May Day holiday, I started tinkering with some quirky stuff again. This time, I built a motion-controlled visual novel system powered by Joy-Con controllers and a PC.

This project actually began two years ago. I've played many "motion games" on the Switch, and they're all fun. However, none achieved my ideal of "controllable aerobic exercise," so I thought, why not make one myself? In fact, I had a very successful "weight loss experience" once in my life.

During a winter break in middle school, a chubby kid who couldn't do any sports successfully flattened his protruding belly and became a genuine "skinny monkey." The method was incredibly simple: run in place while watching TV, one hour a day, for a whole month. When I got dressed at the start of the new semester, I discovered that all the "fat" on my body had disappeared.

It all started with a bet between my classmate and me at home - who would collapse first from running in place for an hour. The process wasn't tiring at all, just pure sweating. This was probably the first time in my life that exercise triggered endorphin release and gave me a sense of "happiness." Since there was no uncomfortable "gasping for breath" feeling, I stuck with it.
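A system like this ultimately has to turn raw Joy-Con accelerometer readings into a single question: "is the player still running?" As a minimal sketch of one naive approach, here is a peak counter over acceleration-magnitude samples, where each spike above a threshold is treated as a footfall. The function name, sample values, and the 1.5 g threshold are all illustrative assumptions, not the actual Alice Run implementation.

```rust
/// Count acceleration peaks above `threshold` (in g) in a stream of
/// magnitude samples; each rising edge is treated as one footfall.
/// This is a deliberately naive sketch: a real implementation would
/// low-pass filter the signal and debounce the detector.
fn count_steps(samples: &[f32], threshold: f32) -> usize {
    let mut steps = 0;
    let mut above = false; // currently inside a peak?
    for &s in samples {
        if s > threshold && !above {
            steps += 1; // rising edge: a new footfall
            above = true;
        } else if s < threshold {
            above = false; // fell back below: ready for the next peak
        }
    }
    steps
}

fn main() {
    // Synthetic data: a quiet baseline around 1.0 g with four spikes.
    let samples = [1.0, 1.1, 1.8, 1.2, 1.0, 1.9, 1.1, 1.0, 2.0, 1.0, 1.7, 1.0];
    println!("{}", count_steps(&samples, 1.5)); // prints 4
}
```

Dividing the step count over a sliding window by the window length then gives a cadence estimate, which is enough to decide whether the on-screen story keeps advancing.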

Unifying the User Experience of Ionic Apps on Android through `GeckoView`

Using a web technology stack to render GUIs has become a widely recognized solution with the most balanced cost-effectiveness. Whether on desktop, on mobile, or even in the operating system you're using right now, WebViews are everywhere. This "trend" started blowing a decade ago (counting on my fingers...) when I was still a college student: Intel created something called the XDK and even tailored its own embedded WebView engine called Crosswalk (although it is no longer maintained).

After all these years, developing hybrid applications with a web technology stack on mobile is still remarkably troublesome, for two main reasons. On the one hand, Chromium itself is highly coupled with the Android system (some Android APIs exist specifically for Chromium), its internal engineering practices are chaotic, and it is extremely difficult to trim down. On the other hand, Android itself is highly fragmented, and the assorted "self-developed" "OSes" cause trouble in their own ways: the notorious MIUI ships a built-in browser that appears compliant but reports a higher version than it actually implements, and some community ROMs strip out certain browser APIs for no apparent reason.