
29 February Sat

Today I was implementing the hot reload, using the strategy outlined yesterday. It worked out pretty well without any problems. Instead of updating all methods of a class, we can iterate through all getOwnPropertyNames of the new prototype to compare the older methods with the newer ones, and replace only those that differ. The same works for exported functions.
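
A minimal sketch of that comparison (hotSwapClass is a hypothetical helper name, not the actual implementation):

const hotSwapClass = (OldClass, NewClass) => {
  Object.getOwnPropertyNames(NewClass.prototype).forEach((name) => {
    if (name == 'constructor') return
    const oldDesc = Object.getOwnPropertyDescriptor(OldClass.prototype, name)
    const newDesc = Object.getOwnPropertyDescriptor(NewClass.prototype, name)
    // replace only methods whose source code differs from the old version
    if (!oldDesc || `${oldDesc.value}` != `${newDesc.value}`) {
      Object.defineProperty(OldClass.prototype, name, newDesc)
    }
  })
}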

Later on I was thinking about a slogan for a campaign against the degeneration of Open Source. I've finally come to understand the problem: Open Source has become unbearably leftist. The Internet is a haven for the left wing, where all they do is talk about their community, yet with a complete lack of quality in the work itself. Then, like a truly totalitarian regime, they get together to silence everyone who undermines their newly established position of power. It's messed up; I don't understand how humans can be so stupid. There's inclusivity, which is one thing, and then there's the left, who are illiberal themselves, as they only "help" others so that they can appear good. Nothing prevents conservatives from being inclusive; on the contrary, they have enough decency to respect other people because of their innate inner strength, and not because some code of conduct said so. We are also not afraid to say things as they are, yet it is the left who impair freedom of speech when it's not comfortable for them.

This whole theme goes along the lines of Nietzsche's concept of the transvaluation of values:

Yet the method was only appropriate to a nation of priests, to a nation of the most jealously nursed priestly revengefulness. It was the Jews who, in opposition to the aristocratic equation (good = aristocratic = beautiful = happy = loved by the gods), dared with a terrifying logic to suggest the contrary equation, and indeed to maintain with the teeth of the most profound hatred (the hatred of weakness) this contrary equation, namely, "the wretched are alone the good; the poor, the weak, the lowly, are alone the good; the suffering, the needy, the sick, the loathsome, are the only ones who are pious, the only ones who are blessed, for them alone is salvation—but you, on the other hand, you aristocrats, you men of power, you are to all eternity the evil, the horrible, the covetous, the insatiate, the godless; eternally also shall you be the unblessed, the cursed, the damned!"

Leftism is the most annoying thing in the world and it should be stopped. Yet I guess it's what people like, and you can't change it. But online, this is the direction they've taken Open Source in, because there's no one preventing it. There are no elections and no campaigns. They've proclaimed themselves leaders and think that's how things should be, but it isn't. If they didn't harm Software Engineering and JavaScript, they could do whatever they wanted, but they are directly involved there and call themselves software engineers "evolving javascript". Who gave them the right?

put open source right
The message is to restore Open Source's values. Transvaluation is Nietzsche's concept, however the word is too complex, and he used it more in a negative sense, as the crime that priests committed against humanity. Re-evaluation is an alternative, but it doesn't carry enough meaning as to what we're re-evaluating into. Another one is "patch up Open Source", which has a nice pun, but patch up applies either to relationships or to repairing something not very well. Put Open Source Right is 100% perfect, as we're fixing its values and fighting the leftism that has come to be its driving force.
I also like the logo because it's got an arrow in a zig-zag style, which is very art-deco. The inspiration came from a picture on Unsplash found by searching for "right". It's good to search for inspiration, I guess. It also links to the Greek myths, which I'm reading now, where Zeus threw lightning bolts as his weapon, and Zeus is a 100% warrior type, strong and powerful. No wonder Nietzsche loved the Greeks so much, for their ability to be in sync with their feelings and instincts, instead of the self-hatred which the left imposes on humans today. It's proper brainwashing what they are doing, but we're going to be the first to start talking about this in Open Source.
concrete man statue

I was so excited about my first political slogan 😄 Art Deco is not just a development company, it's a PR company. So few people understand PR these days, especially in Open Source. They think that giving software away for free puts them above their users. Art Deco™: Premium Open Source®.

section break

28 February Fri

In the morning, I had to get an X-ray. I slept little after going to bed late at night and getting up by 9:30. It's kind of good to sleep less than usual, but this requires catching up on sleep later during the day. Also, if it's not for something important, it's pretty much impossible to wake up if I don't really have to. Sleep is absolute bliss and perfect happiness. Am I escaping reality? Only because of problems, so when I solve them I hope it'll be alright and I'll welcome each new day happily.

On the way back, I was thinking about hot-module-reload, which I'd been experimenting with before. There are 2 sides: updating classes and updating functions. I focused on the second one: how can a function be dynamically updated in JS? The source code of the function needs to change, so how can we do it? When the thought "It's impossible" comes to mind, it's a good indicator that you're doing it right. You're challenging yourself, and even if it's impossible, you can find a workaround.

When I came back, I tried searching for dynamically changing a function, but the only way is to wrap it in another function, which I don't want to do, since when debugging with hot reload this would result in an extra step when stepping in, and that's not a perfect experience. I also want it to be implemented as natively as possible. But if changing the function is not a solution, we need to be changing the imports/exports themselves.

After a bit of research, I got to understand that exports are bindings: this means that when we export something, it's a reference to the exported object that is imported, and not the object itself. The best example is the following simple primitive export:

// module.js
export let i = 4
export const up = () => { i++ }

// index.js
import { i, up } from './module.js'
console.log(i) // 4
up()
console.log(i) // 5

Here, although we imported i, we actually imported a binding to it, and not the primitive value. When we update the value via the up method, the imported reference also changes. This wouldn't be the case in Node.JS, for example, when using CommonJS. I guess the use case is pretty rare, but it lays the basis for how the hot reload can be implemented: we can update the binding in the actual file, and every other file that imported and used it will now have references to the new object. However, it's not possible to update export const bindings, because they are constants. So we'd also need to rename those consts into lets when transpiling code. It changes the nature of the code, but enables hot reload, and it doesn't matter that much whether it's a const or a let, since we're using it for development purposes only.
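
As a sketch, assuming a hypothetical transpilation step and a made-up __hotUpdate hook, the module could end up looking like this:

// module.js after transpilation: `export const` is rewritten to `export let`
// so that the binding can be reassigned later
export let greet = () => 'hello'

// hypothetical hook the reloader calls with the freshly re-evaluated module
export const __hotUpdate = (newModule) => {
  greet = newModule.greet // every importer now sees the new function
}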

section break

27 February Thu

Exactly one month since my birthday, when I sat at a pub drawing a diagram for the photo-uploader.

A really intense day, although I don't even know where most of the time went. Good thing WakaTime is here for days like that. I started the day off by beginning an article on Medium, writing an intro to the new post, Node.JS Image Upload Microservice; it's going to be big and good. Having Idio makes building microservices a bliss, and it's a good business model: helping startups set up their infrastructure. Now we have Akashic.page for email subscriptions and web-pushes, and image upload. There's plenty more to explore, and consulting to provide around those. The most amazing thing is that they are made with bespoke Art Deco software. London's calling.

After the article, I got back to working on the app, but there wasn't much to improve there, hence anxiety again, so I needed to upgrade the MNP template for Idio, so that I can talk about using it in the article as the easiest way to get started. This meant I had to take bits out of akashic.page and start putting them into the template: types, tests and the frontend were updated. I then took out the jsonErrors middleware and made it common in the Idio server itself, to be reused. Found a bug where status code 500 was treated as a client error. The chance of finding the error was actually about 1%, as the condition was <= 500, so I could've hit any other code above 500 and it would have passed :P
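
Paraphrased (not the exact Idio source, and notifyClientError stands in for whatever handles it), the check was along these lines:

// before: 500 slips into the client-error branch
if (status >= 400 && status <= 500) notifyClientError(status)
// after: client errors are strictly below 500
if (status >= 400 && status < 500) notifyClientError(status)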

anton

Mentally I'm just feeling better because I stopped expecting anything from people; it's clear they don't even know the meaning of class. I thought that people kind of possess culture; turns out only a couple of people develop it. It sounds so fucking vain, but how else can I speak about it if it's true. One thing is being stuck up about it, another when I can literally feel a lot of passion about it, about nobility and culture. I don't know where it's from, but it's the second time I get this euphoric feeling, and it comes after I'm less confused about the state of the world and understand the condition we're in. There's no problem, I just kept forgetting and measured everything to my own standard. Proper culture is everything: art, museums, music, books. Not to say that Family Guy is not fun. But I don't know, I just feel better knowing that I'm 5 years ahead of everyone with my work, and I don't have to prove it; we'll see in a few years who's a weirdo then. I love down-to-earth people who could appreciate it and learn from me, instead of this fascist community who think they have "culture". Days like this...

After Idio, I went back to the photo uploader. I already wrote some code a few years ago to rotate images based on their EXIF with CSS, called vie-orientation-image, which basically creates a container for an image and applies a transform to it via CSS; because the bounding box doesn't get rotated, the image overflows, and the container hides the overflow. But I think this is now possible to do on a canvas, because the upload already draws previews using a canvas to unload memory (a 250px wide photo is much smaller than 2000px, which would lag in the browser).
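
Something like this should do it on a canvas; a sketch for EXIF orientation 6 only (90° clockwise), with the other orientations needing their own transforms:

const drawRotatedPreview = (img, width, height) => {
  const canvas = document.createElement('canvas')
  const ctx = canvas.getContext('2d')
  canvas.width = height // dimensions are swapped for a 90° rotation
  canvas.height = width
  ctx.translate(canvas.width / 2, canvas.height / 2)
  ctx.rotate(Math.PI / 2)
  ctx.drawImage(img, -width / 2, -height / 2, width, height)
  return canvas // the preview can then be scaled down from this canvas
}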

previews in the uploader

The front-end app is using Bootstrap, so some classes are set as btn btn-sm, for example, and additionally I followed the convention of capital letters in class names from the CSS rules, e.g., ImageInfo. Therefore, I wanted to implement my old idea to integrate property-name-to-class conversion in the transpiler. The main package that does JSX transpilation is @idio/jsx, which is then used by:

  • FrontEnd, for live serving of JSX files from the server: DEVELOPMENT.
  • Depack, when transpiling JSX prior to a build: PRODUCTION.
  • ÀLaMode, during server-side rendering, and when building components for publishing: DISTRIBUTION.

JSX is massive, and is used in every area of web development. I linked frontend and updated class names in the photo-uploader:

import { $Added, $Uploading, $Error, $HasInput, $Uploaded } from './styles/photo.css'

render() {
  return (<Copy error={error} hasInput={hasInput} processing={processing} src={src} uploaded={uploaded}>
    <div Image position-relative w-100 h-100>
      {!src &&
        <span PreviewLoadingSpan position-absolute text-center>
          {LOCALE.previewLoading}...
        </span>}
      <img src={src} mw-100 mh-100 />
      <span ImageInfo style="top:0;left:0;">
        {name}
        {date && <br/>}
        {date}
      </span>
      <span ImageInfo CloseSpan onClick={onRemove} overflow-hidden text-center>✕</span>
      {!result && !error && progress === null &&
        <BottomLeft style="background:transparent;" pl-0>
          <a btn btn-light btn-sm onClick={this.uploadHandle}>
            {LOCALE.upload}
          </a>
        </BottomLeft>
      }
    </div>
  </Copy>)
}

Notice Image, btn, btn-light, etc. When I could use Bootstrap classes I did; otherwise the rules come from the stylesheets.

.Image img {
  border-radius: 3px;
  transition: .5s;
}
.Image:hover img {
  box-shadow: 0 0 19px 1px white;
}

There's also an import of the class names, as they will be renamed by Closure Stylesheets for the production build. Front-end will simply export the names as they are (by parsing the stylesheet and extracting all classes with a \.[\w\d-_]+ regex). Only lines that don't start with whitespace are checked for class names, so that values like .5rem don't end up as class names. It's a hack, since indentation is not required in CSS, but it is now hehe. Who doesn't indent rules. That could fail for media queries though, so a more robust strategy is needed. Basically, parsing CSS is not a regex-space problem, since there can be multiple opening {, as in media queries:

@media (min-width: 500px) {
  body {
    color: green;
  }
}

So we can't just look for what's between {}, as it will match from the first { to the next one, and not to the balancing one. But I've already solved that in the JSX parser (it's a wonder it works). Fixed a few issues in the JSX parser as well, making comments and blank properties possible. In other words, JSX received 3 new options: prop2class for converting CapitalLetter properties into class names, classNames to also enable putting props (such as Bootstrap utilities) into the class name, and renameMap so that those names can be remapped after Closure Stylesheets has done its renaming job.
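
Conceptually, the conversion does something like the following (a simplification for illustration, not the actual @idio/jsx code):

// fold boolean props into the class attribute: CapitalLetter props (prop2class)
// and listed utility props (classNames); everything else stays a normal prop
const foldPropsIntoClass = (props, classNames = []) => {
  const classes = []
  const rest = {}
  Object.entries(props).forEach(([key, value]) => {
    const isCapital = /^[A-Z]/.test(key)      // e.g. ImageInfo
    const isListed = classNames.includes(key) // e.g. btn, btn-sm
    if (value === true && (isCapital || isListed)) classes.push(key)
    else rest[key] = value
  })
  if (classes.length) rest.className = classes.join(' ')
  return rest
}

// <a btn btn-sm onClick={fn}/> roughly becomes
// h('a', foldPropsIntoClass({ btn: true, 'btn-sm': true, onClick: fn }, ['btn', 'btn-sm']))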

closure-stylesheets is now a package that contains the API to run Closure Stylesheets. It was extracted from Splendid. Making a new package, documenting and compiling it is a bit of a chore, but I've mastered it; not much more could be automated, it's just work...

After the front-end is updated, what is left is to be able to build with ÀLaMode, renaming the location of imported styles to the one provided by Closure. I also need to run TrapCSS on the Bootstrap stylesheet so that only the relevant selectors are extracted. I've done it with Akashic by copying and pasting bits of code into an HTML file and running the library against it and Bootstrap. But I think I can create a script for the browser, where I'll be able to specify a selector from which to get an element, feed its innerHTML to trapcss against Bootstrap, and repeat the process a few times for all possible states (e.g., hovering over a button). It can have a minimal UI and help gather all selectors across interactions. Otherwise, I think it's a really difficult problem to do programmatically, since classes can be computed dynamically.
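
A sketch of the browser script, assuming trapcss keeps dropcss's ({ html, css }) => ({ css }) API and is bundled for the browser:

import trapcss from 'trapcss'

// `selector` points at the widget in its current state (e.g. while a button
// is hovered), `bootstrapCss` is the full stylesheet text to trap against
const trapSelectors = (selector, bootstrapCss) => {
  const el = document.querySelector(selector)
  const { css } = trapcss({
    html: el.innerHTML,
    css: bootstrapCss,
  })
  return css // only the rules that matched the captured markup
}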

In other words, after I've managed to finish this work on JSX classes, CSS renaming and trapping selectors, I'll have a build operation for widgets, which fits into the microservices strategy of web apps and constitutes the client-side (browser JS) development process. On the one hand, there's Idio for the back-end; on the other, Closure Stylesheets and Compiler for the front-end, linked via types that are maintained in XML and can be documented in seconds. Put together, it'll become Irio, the best professional full-stack framework for Node.JS. And based on that, we'll be able to consult marketing businesses on setting up their simple, modern infrastructure and assist in development.

microservices for your business
TODO
  • come up with a strategy for automated edge testing that could populate a lot of values, i.e. specify a range to test the function against. there's a term for that.
  • provide JSX namespace for HTMLAttributes that contains bootstrap utilities.
  • write a guest post for dotsucks: open source sucks.

section break

26 February Wed

Today was a chilled day. It started with Twitter; now I know why people hang out there, it's quite fun when people crack jokes, and I had a go myself... If you're not joking, you bring nothing to this world.

"The higher we soar the smaller we appear to those who cannot fly." Seems like Nietzsche is the year 1 plan in Seagull school.

A person gave me 50 claps on Medium for a post on compiling Koa. What a legend. The rest? I show them how to compile Node.JS packages, and people don't even want to publish it in a newsletter... People's arrogance is pulsating through the screen. Light years ahead of everyone, and the significance of this doesn't even click for them. But those claps made my day. I know my audience and they've made the right choice :)

Why would anyone be such a dick as not to give people 50 claps on #medium every time?

From Twitter, moving on to work. It's painful to get back to programming on a project after a break, as there's a bit of anxiety about how to start, but I'm glad I was procrastinating on GitHub checking open issues on exif2css, because that got me back to the photo uploader. Started working on images.artdeco.app, which is a microservice that can receive images and resize them, without having to install sharp on every other app that needs such functionality. I thought I could update the Docker image to Node 12, but it wasn't starting since JSX wasn't being parsed properly. I thought it was because of @depack/render, which didn't get updated, but the transpilation happens in ÀLaMode and not in rendering, therefore I'm still not sure what happened. Upgraded the render package though to provide api.xml to expose the method via the API, which came in useful later.

Then Depack also had to be updated, and static-analysis because of that Node 12 bug with makePromise that broke pretty much all code in so many packages, but I'm getting back on track. That's the downside of compilation: it's the same as pinning a package version, so you have to go back to each package and recompile it to statically link the new code, which wouldn't have been the case if it was a dependency with a non-fixed version.

Started using GitHub Actions, which are really awesome: 7GB of memory, Windows / Linux / Mac containers? Shut up and don't take any of my money for 2,000 free minutes per month.

ci badge from actions

Idio also then required fixing, as the frontend middleware depended on resolve-package, which was broken because of makepromise. So I upgraded the dependencies and built it too. The Actions could be good for building with Closure Compiler since they're so powerful, but I feel like I need control of my build step so that I can manually run yarn lib/compile and yarn test-compile to be able to fix issues. Or do I, and it's just a bad habit that wastes a lot of time and could be avoided by working on a separate branch and running a build step on master after a merge? Nah. Also integrated the newly refreshed render for SSR into Idio itself and wrote some documentation for that. Documentary needed an update to be able to use multiple parts from an example, e.g.,

import dep from 'dep'
/* start example */
import Code from 'code'
/* end example */

(async () => {
  /* start example */
  console.log(Code.sayHi())
  /* end example */
})()

That also required an update to stdlib, which fixed the forks issue on Windows caused by the root path that I talked about in the previous entry. Didn't document this feature, as I've stopped documenting Documentary properly because of this annoying bug with Wikis, where forks do something silly and the whole documentation is generated twice, and I can't figure it out.

For dinner I had kale carrot soup with some chicken. It was OK. The soup was properly blended, which is the first time I've done it like that.

The rest of the day was spent working on the image uploader, in particular extracting only the needed bits from the azure-storage package that are required to PUT blobs on storage. In the end, I had 2 files, an understanding of Azure's REST API and its signing mechanism, and cut 390 lines from yarn.lock by removing their library, which is 25% of the whole dependency debt. Super happy about it. The process consisted of debugging what their client is doing and taking bits of their code. I've kind of adapted it to my own style: rqt will add the Content-Type header using capital letters, but signing worked with content-type and likewise content-length; however it's not important on their servers, they just want the signature to be correct. The goal is to then write an article about the image uploader to get that content rolling on Medium, which is quite fun and could be used to talk about Idio.

Removing 390 lines from yarn.lock by implementing custom rest api client for azure blob storage
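
For reference, the shared-key signing boils down to something like this; a simplified sketch of the scheme rather than the exact code I ended up with (the real string-to-sign has more header slots and all x-ms-* headers canonicalised and sorted):

import { createHmac } from 'crypto'

const sign = (account, base64Key, verb, path, headers) => {
  const stringToSign = [
    verb,
    headers['content-length'] || '',
    headers['content-type'] || '', // lowercase here even if the request sends Content-Type
    `x-ms-date:${headers['x-ms-date']}`,
    `x-ms-version:${headers['x-ms-version']}`,
    `/${account}${path}`, // canonicalised resource
  ].join('\n')
  const signature = createHmac('sha256', Buffer.from(base64Key, 'base64'))
    .update(stringToSign, 'utf8')
    .digest('base64')
  return `SharedKey ${account}:${signature}`
}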

TODO
  • ÀLaNode stopped working because I built it on Windows and the shebang ends with \r, which breaks the interpreter... The \r appeared because of automatic crlf in the git config, which basically means I need to switch it off, or run a regex in the pre-publish script on Windows to remove the \r.
  • Find out what's going on with Documentary's forks which prevent proper compilation of its wiki.
  • Upload images to GitHub from the image uploader, and hook up the repo to GitHub Actions, which will generate previews!
  • Extract the image rotation js code from vue gallery that actually changes the dimensions of the container when a photo is rotated.

section break

22 February Sat

Today was pretty intense. It started with wanting to fix the failing AppVeyor build for Zoroaster. This required looking into masks, which were the reason the builds were failing. From masks, I investigated forks, as I initially thought that preprocessing didn't get applied to inputs, but that wasn't the case. Instead, I added the normaliseOutputs option to forks so that outputs are consistent across platforms (replacing \n with \r\n on Win).
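
Roughly, the normalisation is a one-liner (the TODO at the end of this entry notes it still breaks on consecutive newlines like \r\n\n):

import { EOL } from 'os'

// replace \n that isn't preceded by \r with the platform EOL
const normalise = (output) => output.replace(/([^\r])\n/g, `$1${EOL}`)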

appveyor screenshot

While working on packages on Windows, I cloned my Documentary widgets to the user directory, and the README wasn't compiling as JSX was failing. That's how I found out about the non-working JSX transpilation on Windows and had to fix it. Pretty crucial, so it's really good to keep a secondary Windows system for development just to make sure that the packages are truly cross-platform. JSX was fixed and merged into ÀLaMode.

A lot of tests in ÀLaMode and masks had to be updated to be cross-platform too. This typically involves normalising forks' outputs and snapshot outputs as well. AppVeyor by default will clone the repo using the input crlf setting, but my config is auto, therefore the tests were failing for Zoroaster, as mask results are extracted using EOL, which isn't robust. For now, the auto strategy is good enough, but when \n is encountered on Win instead of \r\n, the mask will fail.

appveyor badge

Discovered a bug with FindPackageJSON also, which traverses paths up to the root to find the location of a package. This was getting stuck in an infinite loop, as the root was assumed to be /, which is wrong on Win (e.g., C:\ is expected instead). This made Documentary as well as Depack freeze without any error message, which isn't great. There's a very large number of Windows users, therefore testing software on Win is absolutely essential. Just a few weeks ago, I was still using Node 8 and most of my packages were not working on Node 12 :facepalm: It was due to Node's API being changed so that some functions were no longer accepting as many args as documented, whereas my package makepromise was checking if too many arguments were passed, and "Function doesn't accept that many arguments" was thrown A LOT. That probably alienated some people from using the packages, and I learned a lesson: to set up CI for Windows plus a test matrix for Node 8 and 12. Even if I stay on Node 8 on Mac, there are people who run Node 12 on Win.
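
The fix is basically to stop when going up no longer changes the path, which works for both / and C:\; a sketch, not the exact FindPackageJSON code:

import { existsSync } from 'fs'
import { dirname, join } from 'path'

const findPackageJson = (dir) => {
  while (true) {
    const path = join(dir, 'package.json')
    if (existsSync(path)) return path
    const parent = dirname(dir)
    if (parent == dir) return null // reached the filesystem root on any platform
    dir = parent
  }
}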

It's been known that console.log prints a \n and not EOL after the logged string. But another interesting observation is that template strings also always contain \n regardless of the platform:

const t = `hello
world`

Counter-intuitive: even if the file is saved on a Windows machine, the new line here is still \n and not \r\n. +1 to the interview questions list ;)

A lot of time today was spent on testing the readline interface, as forks can accept inputs, and the way I was testing it before was just by writing questions to process.stdout and then pushing answers to process.stdin. That was totally fine on Mac, but on Windows it's buggy. You need to create separate readable and writable streams for each new question, otherwise Node will freeze if you try to do this through the main process's streams. It can work one time, but then stop working for all subsequent questions. I've updated reloquent so that a question now extends ReadlineOptions, which means that for testing I can pass new streams, e.g.,

import { Readable } from 'stream'
import Catchment from 'catchment'
import { askSingle } from 'reloquent'

// collect whatever the question writes into a promise
const output = new Catchment()
const q = askSingle({
  text: 'My girl, Where did you sleep last night',
  output,
  input: new Readable({
    read() {
      this.push('in the pines \n') // => answer
      this.push(null)
    },
  }),
})

output.promise // => My girl, Where did you sleep last night

This also unloads logging from the main console when writing tests. A disadvantage is that while my API for reloquent supports a simple string as a question, I'll now be passing an object instead, so I'm not covering the string feature any more. This can result in errors. So it's good to have other tests covering simple strings, but it's impossible with this approach... The solution is to create a fork and write to its stdin from a separate process. This will ensure 100% test coverage. Luckily, with Zoroaster's mask testing it's really easy to do. Also, Documentary's forks can be used to combine this feature with documentation generation.
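
A sketch of that, where ./ask.js is a hypothetical script that calls askSingle with a plain string:

import { fork } from 'child_process'

// pipe the child's stdio so the question can be answered from the test
const child = fork('./ask.js', [], { silent: true })
child.stdout.on('data', () => {
  child.stdin.write('in the pines\n') // answer whatever question was printed
  child.stdin.end()
})
child.on('exit', (code) => console.log('exited with code', code))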

vs code screenshot

TODO

  • normaliseOutputs to fully replace all newlines, as currently the regex is [^\r]\n -> EOL but this doesn't work for 2 consecutive new lines like \r\n\n.
  • Update temp-context to create cross-platform snapshots, as the paths are written using the system separator, which is annoying and requires updating paths manually.
  • Handle \n without \r on Windows for mask results extraction.
  • Fix Documentary and Depack for FPJ on Windows.

section break