The place where random ideas get written down and lost in time.
2020-02-02 - Godot Game Engine
Category DEV
Tutorials / intro / 2d game step by step:
https://www.gamefromscratch.com/page/Godot-3-Tutorial-Series-Index.aspx
https://godotengine.org/download/windows
The download is not an installer, it’s the full thing just zipped in-place.
Which means that if I’m not going to use C#, I can just use the Standard build and trivially switch later.
“Godot… standard vs Mono C#” -- the difference is that the latter supports everything(?) including C#, whereas the former is limited to GDScript (a Python look-alike). I’d venture the C# one is designed to ease people coming from Unity.
The C# version requires the Mono SDK: https://www.mono-project.com/download/stable/
Or the Visual Studio Build Tools: https://visualstudio.microsoft.com/downloads/?q=build+tools (which seem packaged separately from the full VS.Net?)
Thus going with Standard for now.
Doing what?
Either 2d or 3d (fixed perspective, toy-like rendering), something with trains and tracks. First do a prototype demo to see if I can even render what I like. Game-wise, I'd like a tower defense/attack.
2020-01-24 - Home Projects
Category DEV
Pending home projects, in that order, that I’d really like to see completed this year:
- React Otter:
- For the demo, finish learning how to do the data-server backend part.
- For the MVP, implement a cron task sending tweets & creating images.
- Conductor 2:
- Finish the Groovy design. It has potential, yet right now it is in an unfinished design state.
- Rework the implementation with a focus on tests.
- ESP32-CAM:
- Work on video track detection, with the smaller goal of doing a crossing gate controller.
- The desire is for that work to be later used to control Fairfield or other automation.
Specs:
- Target: Froyo Android 2.2
- Keep screen on when plugged in.
- Release screen lock when on battery.
- Remember timer / restore after restart. Compensate for time off (if accurate).
- Change timer color:
- white default.
- orange if > 24h
- red if > 48h
- red blink if > 72h
- Email updates notifications.
And yes, the target really is Froyo Android 2.2. I like to reuse old phones as dedicated one-off appliances.
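The color thresholds above are simple enough to sketch as a small helper. This is a minimal, hypothetical sketch of the spec; the function name and the (color, blink) tuple shape are mine, not from any real app code:

```python
def timer_style(elapsed_hours):
    """Map elapsed timer hours to a (color, blink) pair per the spec above.

    Thresholds from the notes: white by default, orange past 24h,
    red past 48h, blinking red past 72h. Names are illustrative.
    """
    if elapsed_hours > 72:
        return ("red", True)
    if elapsed_hours > 48:
        return ("red", False)
    if elapsed_hours > 24:
        return ("orange", False)
    return ("white", False)
```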
One discussion on the Arduino mailing list centered on the common use of Arduinos for crossing-gate mechanisms. The projects are considered “trivial” as they involve commanding servos (to move the gate arms) and blinking LEDs for the crossing lights. There are modules for sound too. Both are ubiquitous with Arduinos and well supported for that.
Typically, inputs come from photoresistors or IR LEDs for detection, or less commonly from existing block-detection modules.
This makes me think an ESP32-CAM would be almost suitable for that. What form would it take?
- First, I should ask whether something like OpenCV is even needed here. I’m not familiar enough with it to know whether it would help. With a fixed camera, simple naive image detection would be enough to see whether there’s any motion change in a masked area. A simple average+diff algorithm would probably be good enough.
- Second, implementation wise, C or MicroPython? I’m surprised we don’t see more of the latter. It’s certainly slow yet good enough to just blink a LED or move a servo. It would not be suitable for image analysis, though.
- The ESP32-CAM MicroPython fork made it clear how this is done: take a specific commit of the ESP32 repo (not all of them are suitable), add the camera module in C as a Python module, then use it from Python as a black box.
- At that point that does mean that developing on the module is best done in a separate C sketch. And from a distribution model, an Arduino sketch is still the easiest way to deliver this to an end user who needs to tinker with it. What does using ESP IDF or even MicroPython add here?
- What MicroPython does add is that talking to JMRI via a JSON client/server becomes trivial; that’s not exactly hard from C either, just a bit clunkier. One of the original demo suggestions was to split both.
- One does wonder why Swift or GoLang is not more popular on these. They would be good fits. Swift is probably too Apple centric. There’s a TinyGo.
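The average+diff idea from the first bullet can be sketched in plain Python. Everything here (class name, parameters, thresholds) is illustrative; on a real ESP32-CAM the frames would come from the camera buffer rather than flat lists of grayscale values:

```python
# Keep a running background average per pixel over a masked region; flag
# motion when the mean absolute difference against it crosses a threshold.

class MotionDetector:
    def __init__(self, mask, alpha=0.1, threshold=20):
        self.mask = mask          # indices of pixels to watch (e.g. over the track)
        self.alpha = alpha        # background update rate (0..1)
        self.threshold = threshold
        self.background = None    # running average, lazily initialized

    def update(self, frame):
        """Feed one grayscale frame; return True if motion is detected."""
        if self.background is None:
            self.background = [float(frame[i]) for i in self.mask]
            return False
        diffs = []
        for j, i in enumerate(self.mask):
            diffs.append(abs(frame[i] - self.background[j]))
            # Blend the new frame into the background slowly, so gradual
            # lighting changes don't trigger detection.
            self.background[j] += self.alpha * (frame[i] - self.background[j])
        mean_diff = sum(diffs) / len(diffs)
        return mean_diff > self.threshold
```

Usage would be one `update()` call per captured frame, with the mask covering only the track area to be detected.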
There is one obvious downside to using an ESP32-CAM for a grade crossing module, and that’s the location of the components. The camera is attached to the ESP32 with a short ribbon (either the 1 cm or the 10-ish cm one), which cannot be much longer. The mounting location requires a good view of the track to be detected -- for example directly on the ceiling, or maybe 1-2 meters above; track level would be possible but probably hard. Perspective/distance means that scale/precision won’t be the same in different parts of the camera view. A direct view of the track is needed, which precludes having tunnels or bridges in the way, as well as operators potentially blocking the view.
The connections for the LEDs and servos could be longer, using twisted wire pairs it would not be unreasonable to have 2-4 meters or more.
Another option is to use more than one module, for example an ESP32-CAM can be mounted on the ceiling, communicating to another Arduino next to the crossing with a simple RS485/RS232 link. In that case, a much longer distance can be achieved.
Just completed this yesterday, and it works as expected:
https://bitbucket.org/ralfoide/misc-exp-public/src/master/android/bg-demo/
(In retrospect, I should have called the demo “fg-demo”).
First off, I’d say duplicating the job-demo project was easy and the way to go: dup the folder, remove app/build as well as anything excluded by gitignore, then do a simple grep/sed on “job.demo”. The only 2 places where it was used were in the app name.
The demo uses a “startForegroundService”, a regular Service, and a “startForeground” from the onStartCommand of the service, with a notification.
The service is started when the activity is started/resumed. The service is only stopped when the stop button in the activity is explicitly used. The service itself runs a thread that updates a counter. The activity binds to the service to give a callback, used to update the counter on screen.
As this is both a “bound” service (for the activity) and a “started” service, it keeps running until it is both stopped and unbound from the UI.
There are a few cases where the system may kill & restart the service immediately.
Which projects make sense using this, and which do not?
In the past, I’ve been using background services for applications that:
- Do some kind of system control and need to run in the background all the time.
- Those that have expensive/complicated startup, for example with network connections to hold the update state.
In these cases, it’s a bit wasteful to stop everything as soon as the user switches between apps.
In a post-Android-O world, one should balance between this and using a JobService, or neither.
How would I use these for current, former, or future projects:
- DccThrottle: Foreground service is a logical choice. User is expected to switch between apps, not lose control, and be able to change back immediately. Service only needs to be started once at least one engine is in command; stopped by asking for confirmation when trying to exit the app (or the double-back pattern).
- RTAC: Background service is a logical choice, but not necessary. The app is supposed to be used in foreground. Instead it could just use an app singleton thread.
- T3: JobService + AlarmManager.
- Bearing: This was using a background service before. The new version just uses an activity-centric update. State is lost/reconstructed when outside of the activity.
Follow up work:
- Change on-going notification to “App Blah is Running”.
- Change notification setting so that it does not beep at me (lower priority).
- Add 1 or 2 notification actions: “quit now” vs “open”.
- Open would do what the current notification click does, bring the activity.
- Quit would both stop the service and kill the app (or more exactly, stop the activity if it’s open).
2019-11-28 - Quote Otter choices
Category DEV
One of the goals of Quote Otter is to try some “web stuff”.
Choices to be made:
- Use nodejs instead of PHP / Rails / GoLang. How does it fit with Apache?
- Can I have scaffolding in nodejs?
- Frontend: Angular vs React.
- React:
- React router.
- State management: Redux or MobX.
- Flutter web? https://flutter.dev/web
There are two goals, and they are not conflicting: a/ learn something; failure is an option, and b/ if I get it working, it should be (re)usable. So there’s some value in choosing a “common” framework and not something fully esoteric. Looking at you, Flutter.
To be clear, I have two choices to make: backend and frontend.
Frontend:
- Let’s try React.
- As guides explain, React is not a full framework and needs additional libraries to do stuff. There are a myriad of choices, so let’s get some I see floating around.
- Routing: React router.
- State management: Redux or MobX.
- Fetching data (from backend): jQuery.
Backend:
- We already run Apache.
- Let’s try nodejs for a change.
- The way to “integrate” nodejs with Apache is to run both: nodejs listens on port 8080, and Apache is set up to access it using a proxy/reverse-proxy config (example here).
- Some people “love nodejs” (too much?)
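A hedged sketch of what that Apache side could look like, assuming mod_proxy and mod_proxy_http are enabled and nodejs listens on port 8080; the `/app/` path is a placeholder:

```apache
# Forward /app/ requests to the local nodejs server, and rewrite
# redirect headers coming back so they point at Apache, not port 8080.
ProxyPass        /app/ http://127.0.0.1:8080/
ProxyPassReverse /app/ http://127.0.0.1:8080/
```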
Local Development:
- NPM / nodejs is apparently the way to go. Doesn’t matter where it gets deployed after.
The choice of nodejs means I can be server-agnostic and don’t even need to dev on a server, can be localhost on any machine.
More links:
- React: https://daveceddia.com/test-drive-react/
- Node.js: https://www.tutorialsteacher.com/nodejs/nodejs-tutorials
- SQL vs NoSQL: https://www.thegeekstuff.com/2014/01/sql-vs-nosql-db
One thing that the React examples do not cover is the equivalent of the easy table-to-REST scaffolding made popular by Rails. That’s kind of what I want here. For “trivial” tables, I want a REST setup with a typical list/add/delete/edit flow (CRUD: create, read, update, delete) + scaffolding. There’s a lot of value in that for rapid prototyping. Whereas Rails is a backend+frontend combo, I’m guessing here we’ll have to deal with a “nodejs database” backend and a matching “react frontend”. What I want is something that sort of works together. ⇒ From the tutorial above, nodejs uses “drivers” which are installed. I’m going to guess they each have their own API.
Speaking of “SQL vs NoSQL”, in my simplified view:
- In an SQL database, each record has defined columns, some of which are indexed. Complex SQL-based queries can be written that act on the column data. The queries inherently “understand” the meaning of each column.
- In a NoSQL database, each record is merely a key/value. There are no queries, except maybe operating on a range of keys. Values are akin to opaque blobs.
- SQL is useful in an application that needs to filter the data when querying it. For example only show entries from a given camera, or sorted by username. I am going to guess that some NoSQL databases probably offer secondary index columns to do just that, where data is often duplicated from the main record into that column just for that purpose (e.g. Bigtable works like that, IIRC).
- NoSQL is ideal when the data already exists in the form of a JSON blob. Just compute some kind of unique id and that’s the key. Then retrieve it all at once. When the value payload grows too large, split it in 2 (e.g. list view vs detail view) and store both parts in the same row.
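To make the contrast concrete, a toy sketch in Python: the SQL side filters via a WHERE clause on a typed column, while the NoSQL side is modeled as a plain dict holding opaque JSON blobs fetched by key. The schema and field names are made up for illustration:

```python
import json
import sqlite3

# --- SQL-style: typed columns the query engine understands,
# so "only show entries from a given camera" is a WHERE clause.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, camera TEXT, note TEXT)")
db.executemany("INSERT INTO events (camera, note) VALUES (?, ?)",
               [("cam1", "train passing"), ("cam2", "idle"), ("cam1", "gate down")])
cam1_notes = [row[0] for row in
              db.execute("SELECT note FROM events WHERE camera = ? ORDER BY id",
                         ("cam1",))]

# --- NoSQL-style: key -> opaque blob; no filtering, just get by key
# and retrieve the whole value at once.
kv = {}
kv["event:42"] = json.dumps({"camera": "cam1", "note": "train passing"})
event = json.loads(kv["event:42"])
```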
2019-11-27 - Android “Job Demo”
Category DEV
Completed Android app “Job Demo”:
https://bitbucket.org/ralfoide/misc-exp-public/src/master/android/job-demo/
This demonstrates enough of what I want:
- Using JobService to match a required time using a minimum time delay.
- The “old way” of using the AlarmManager has not been used.
- Handling Boot-Received intent with a receiver.
Incidentally, this also has:
- Dagger properly setup with components and one prod-vs-test module.
- Robolectric for the app, activity, with dagger.
- RecyclerView from androidx not using the pesky ListAdapter.
- Using androidx Room DAO for persistence database.
Where to go from there?
As far as a JobService demo is concerned, the current state is good enough. There are a few things that could be done better such as having more robolectric/dagger in place (for example a provider module to replace LocalTime.now() calls during tests) and the event log is a weird mix of both an in-memory list and a Room persistence database (on purpose, to showcase both options).
This is actually a good base to do a T3 rewrite. Is this really what I want?
Per the previous entry, another “small” demo app is to do one running an actual permanent background service, for the typical app case such as dcc controller and others that need to keep a permanent controller running at all times.
Upon scheduling events over many hours during the night (e.g. one per hour), it was clear from the log that they did not fire at the expected times when the phone was in Battery Saver mode. Instead they all ran at once after I turned Battery Saver off. To be clear, that’s not particularly surprising, but it would require a bit of user education to explain.
So the next step is to try the Alarm Manager to see how it behaves with the Battery Saver mode.
A few of my older apps rely on a common principle of having a permanent background service. This broke with the API changes in Android O & P (API 26+).
Trying to fix T3 & RTAC in situ has been a bit of a PITA, so it’s time to do an isolated “canonical example” app to figure that out. But first I need to understand why I have a background service and how to adjust it for these APIs. Also to clearly list which API levels are the issue.
Existing cases:
- RTAC:
- Needs to run on Hi10 tablets, Android 5.1 (API 22, Lollipop MR1).
- Permanent background service.
- Designed to keep the app alive at all times, even if not in foreground. Keep all the network connections active. Wakelock to make sure the tablet never sleeps.
- Notification shown when main activity is in background.
- DccThrottle:
- Same as RTAC. Permanent, active even when in background.
- Bearing v1:
- Similar to RTAC. Permanent, active even when in background.
- T2:
- T2 is API 9+, T3 is API 26+.
- Job-oriented background service.
- Started by “check” menu, alarm, or receiver, only for the duration of the operation.
The needs for T3 are very different and ideally we would not use a service for that if we can avoid it. The only reason a service was used here is because that was the recommended design at the time. I understand OS requirements can change, but it’s somewhat ironic an app breaks when it was following the recommended design, with undertones of “you’re doing it wrong”.
The T3 case was handled in 2019-07-08.
It uses:
- JobService to schedule a task.
- Still relies on an UpdateReceiver starting an actual Service.
- Boot_Received triggers the UpdateReceiver.
Now the question is whether a service is even needed at all. Does the new API offer something else?
https://developer.android.com/about/versions/oreo/android-8.0-changes.html#back-all
In theory that doc seems fine: start a service using startForegroundService, then call startForeground with a notification. In practice that seems to fail, and the service still gets killed 5 minutes later when testing on an Android O device or emulator.
There’s a lot of that on SO:
https://stackoverflow.com/questions/49637967/minimal-android-foreground-service-killed-on-high-end-phone
Thus my suggestion is to create two simple samples/demo apps:
- T2 case: a “short job”, triggered by a button, an alarm (specific time), or a boot receiver.
- Permanent case (e.g. RTAC, DccThrottle, etc) of a permanent “foreground” service.
For the latter, integrate some external lib such as Doki that shows details from dontkillmyapp.com for a specific device.
https://bitbucket.org/ralfoide/misc-exp-public/src/master/android/ is a good place to store these.
2019-10-22 - Dev State of the Union
Category DEV
What is the dev work that I want to be doing right now?
I want some projects that I can demonstrate visually -- most of my stuff is often very cryptic, abstract, with little to see. By that I mean have a nice write up, maybe a video, and have a little collection of them. Experimental projects would be best for that.
- ESP32-CAM in train context.
- Could be as a simple web server demo with an android viewer. For example placed in scenery, or from a top view.
- Power from USB brick.
- Point is to show the simplicity and versatility of this.
- ESP32-CAM with OpenCV.
- That requires a bit more research.
- Use integrated web server to see output.
- Point is to demonstrate we can do embedded detection on the ESP32.
- ESP32-CAM as “occupancy detector” without OpenCV.
- This is the simple “things move in the image” kind of demo.
- Use integrated web server to see output.
- ESP32 with OLED sending JMRI JSON.
- The OLED is nice because it adds a visual component. It makes it less abstract.
- Point out that the OLED is a nice status/debugging tool.
- This is the kind of thing to do in MicroPython as it becomes trivial.
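To back the “trivial in MicroPython” claim, a sketch of building a JMRI JSON-server sensor message. The `{"type": ..., "data": ...}` shape, the newline framing, and the sensor state values (2 = ACTIVE, 4 = INACTIVE) are from my recollection of the JMRI JSON protocol and should be verified against the JMRI docs; the sensor name is a placeholder:

```python
import json

def make_sensor_message(sensor_name, active):
    """Return one newline-terminated JSON message setting a sensor state."""
    # JMRI JSON sensor states: 2 = ACTIVE, 4 = INACTIVE (from memory; verify).
    msg = {"type": "sensor",
           "data": {"name": sensor_name, "state": 2 if active else 4}}
    return json.dumps(msg) + "\n"

def send_detection(sock, sensor_name, active):
    """Push a detection result over an already-connected JMRI JSON socket."""
    sock.send(make_sensor_message(sensor_name, active).encode())
```

The same `json` module exists in MicroPython, so the message-building part runs unchanged on the ESP32; only the socket setup differs.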
One tutorial for later would be “hacking the NCE AIU”, for example for the EB1. This requires more thought to explain what it’s useful for. It has less appeal (lesser-known hardware).
2019-08-31 - ESP32 Camera Modules
Category DEV
4 x ESP32 boards compared, all with an OV2640 camera module:
https://www.youtube.com/watch?v=5IhhyJjjCxo
Possible applications? I’ve been looking at the cheapest way to add video to train layouts with at least 2 goals, if not more:
- Video-based train automation.
- In-car video streaming.
- Single/multi view on remote monitor. E.g. monitor hidden staging yard(s) or tunnels, or display activity on a remote monitor -- any place where there’s motion.
For each of these, my concern is footprint, price, and power. Each time I operate under the expectation I can offset any processing to a local laptop or tablet or similar.
For example, for streaming the expectation is that the camera can either send a constant stream and the computer does motion detection, or the camera can do basic detection and only transmit when there are changes. A mix is possible (e.g. non-sophisticated change detection on the camera, then a better filtering at the computer level).
In this case, the value of the ESP32 is fairly reduced. It only needs to act as a crude frame server. The benefits are a small footprint, and real 1080p or better.
The contenders are mini pinhole wifi cameras like the one I tried earlier. This would be an almost ideal choice if it were really a 1080p and not a marketing gimmick using a fairly old 720p @ 30 Hz.