2,985,205 events, 1,482,296 push events, 2,355,649 commit messages, 172,926,471 characters
Add self censorship to transparency report
Fuck you Facebook.
"9:20am. Let me chill for a while and then I will start.
10:50am. Actually, let me chill, have breakfast, and then I will start.
11:10am. Breakfast down. Let me do the chores. After that I will have the whole day to program in. Today's schedule will be finishing the validator and then doing the package loader.
11:50am. Done with chores. They took longer than I expected. Let me finally start.
11:55am. My thoughts are lingering on stocks for a bit, so let me put in the last few words on that. After this I won't ever mention this subject if I can help it.
Stocks like AMZN and TSM are my weak point. But merely that admission is a good sign that I should not be touching individual stocks at all. I mean, AMZN went up who knows how much in the past decade, but I try to imagine what it would be like to buy into positive momentum on the monthly scale and I am not getting good feedback. The fact that I am out of tune with what would be the market's biggest winners just because they take their time going up is a good indication that my approach is flawed.
With the benefit of hindsight - and I had an inkling that this was the case all along - I am a mediocre trader. If I want to trade individual stocks, I should immerse myself in them. I should study them from the ground up and do in-depth research. Me - I do not care about this at all. I do not care about individual stocks and yet I want to get benefits from them. This kind of egoism cannot be allowed.
12pm. But being mediocre is not the end of the world. My flaws are that I do not care about doing research. But my strengths are my decisiveness, courage and vision. I can pick a way of trading that maximizes those strengths even if it means I cannot quite reach the top.
Looking at technology stocks over the past decade was a real eye opener for me. The market keeps threatening quick, sharp declines, and if you give in, you miss a 10x rise.
I've been afraid of market panics, and thought that one of my goals should be to find a system to avoid them, but that is wrong. Absolutely wrong.
The important thing to note about market panics is that they by themselves aren't trend changers. They aren't the triggers that turn bull into bear markets.
If I ignore the swings up and down, and simply try to anticipate the general trend a few months from now, the task of trading becomes much easier. This is because the market has such a strong upward bias.
12:10pm. Towards the end of my trading days, six years ago, I sensed that a good way of trading is to play secular themes. Stocks go up and down, but some groups tend to be persistently favored over others. In the recent decade, witness the great disparity between emerging market stocks and technology stocks. This disparity developed and became obvious as early as 2010 and did not reverse.
12:15pm. The lesson of that is something I deeply internalized.
Looking at the future, the easiest way to think is that the Singularity won't happen, and even if it did, that the world won't change. That is the peasant view.
Short the peasants, go long the machines! Chase the Singularity!
12:20pm. Now focus. After that rant, my concentration feels scattered. I was actually looking forward to programming, but now I feel irate and restless. I'd rather take a nap.
I can't allow this.
Yesterday I did well. I complained throughout that the pace was grinding and that my inspiration was low, but the end result was pleasing.
I need to tap into the general feeling.
For the past week I've been feeling depressed. Downcast. It feels like there is some great, profound meaning that I am disallowed from tapping into.
Like I've gone from a place where there is great purpose and unity in thinking, to working on what are isolated islands. It feels like I am violating my own principles. And in fact, yesterday I was. I decided to go forward and accept the new reality. And as expected the end result was good.
The emotion I get is not always the truth. Sometimes my subconscious will outright lie to me.
When that is the case, forget about emotion. Rely on your intellect. Grasp the handholds that you can and pull yourself up.
12:30pm. Over the past two, now two and a half weeks - ever since I finished the codegen I've put in an astonishing amount of thought into the design of the compilation pipeline. I need to trust that thinking.
Small isolated islands communicating with each other via messages are fine. There does not need to be great purpose behind each of my moves. There does not need to be the sense of life when I do my regular programming.
In programming the most important thing for me is to assert my will.
The goal I need to attain for that to happen is simple.
When I give the order for the hand to move, it should move. If the small island has a particular purpose, it should accomplish it.
That is all I need to think about.
To be good in the concurrent regime, I should cultivate my laziness and not care at all how one island will relate to another.
It does not matter.
12:35pm. On the surface, my goal is to complete the compilation pipeline, but internally, what I need to focus on is cultivating my selective neglect in the concurrent regime. That is what is really important.
I am just a step away from having the world. If I can just manage this one thing I'll be able to become good at concurrent programming. I just need a bit more experience in that.
If I can attain that, I can complete Spiral. If I can complete Spiral, I will be truly free. My soul will be able to rest in the knowledge that as a programmer I can do anything.
12:40pm. To reach that level, all I literally have to do is not give a shit. I need to believe in the isolation of code fragments in the concurrent regime.
Just think how much I've suffered to get to this point. After Spiral is done, I'll be able to leave my poverty behind me. My inexperience and lack of skill will be a thing of the past. I will even have made peace with trading. That will be two fields I have expertise in.
12:45pm. At that point, I will be able to go back to the third field I've started to study - machine learning. I need to get a lot better at it, and it will have good synergy with programming.
With randomized testing, I've had a good bit of inspiration on how I should be approaching math itself.
I need more. Spiral will be an asset, but I need so much more. There is no limit to the amount of power a good programmer can attain in this universe. Intelligent agents will be better than magic. I need to go down that path.
I cannot turn my back on it.
12:45pm. Now - package loader. Instead of ranting here, I should load up my thinking and get to work on it.
There is no need to feel this much anxiety. The profound meaning is hard, but isolated islands are easy.
Isolated islands are easy, and yet they give great benefits. I just have to keep going forward and everything will fall into place.
12:50pm. That is why this hesitant behavior is so abominable. I have to do something that is ultimately easy, and yet I am behaving as if I am trying to climb K2.
The gains are literally sitting there in the open waiting to be gathered, and I am worried that some beast will jump me.
Just go forward.
type Graph = Dictionary<string,string HashSet>
type MirroredGraph = Graph * Graph

let create_mirrored_graph() = Graph(), Graph()
let add_link (s : Graph) a b = (Utils.memoize s (fun _ -> HashSet()) a).Add(b) |> ignore
let add_link' (s : MirroredGraph) a b = add_link (fst s) a b; add_link (snd s) b a
let remove_link (s : Graph) a b =
    match s.TryGetValue(a) with
    | true, v -> (if v.Count <= 1 then s.Remove(a) else v.Remove(b)) |> ignore
    | _ -> ()
let remove_link' (s : MirroredGraph) a b = remove_link (fst s) a b; remove_link (snd s) b a
let remove_links ((fwd,rev) : MirroredGraph) a =
    let mutable a_links = Unchecked.defaultof<_>
    if fwd.Remove(a,&a_links) then Seq.iter (fun b -> remove_link rev b a) a_links
let add_links s a b = List.iter (add_link' s a) b
let replace_links (s : MirroredGraph) a b = remove_links s a; add_links s a b
let get_links (s : Graph) a = match s.TryGetValue(a) with true, x -> x | _ -> HashSet()

let circular_nodes ((fwd,rev) : MirroredGraph) dirty_nodes =
    let sort_order = Stack()
    let sort_visited = HashSet()
    let rec dfs_rev a = if sort_visited.Add(a) then Seq.iter dfs_rev (get_links rev a); sort_order.Push(a)
    Seq.iter dfs_rev dirty_nodes
    let order = sort_order.ToArray()
    let visited = HashSet()
    let circular_nodes = HashSet()
    order |> Array.iter (fun a ->
        let ar = ResizeArray()
        let rec dfs a = if sort_visited.Contains(a) && visited.Add(a) then Seq.iter dfs (get_links fwd a); ar.Add a
        dfs a
        if 1 < ar.Count then ar |> Seq.iter (fun x -> circular_nodes.Add(x) |> ignore)
        )
    order, circular_nodes
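For reference, the scheme in circular_nodes is essentially Kosaraju's strongly connected components algorithm, restricted to the nodes reachable (via the reverse links) from the dirty set. Here is a minimal Python sketch of the same idea - the dict-of-sets graph representation and all names are my own assumptions, not the actual codebase:

```python
def circular_nodes(fwd, rev, dirty):
    """Kosaraju-style SCC pass over a mirrored graph.

    fwd/rev: dict[node, set[node]] holding the forward and mirrored
    reverse adjacency. Returns (processing order, nodes on a cycle).
    """
    order, seen = [], set()

    def dfs_post(a):  # postorder DFS over the reverse links
        if a in seen:
            return
        seen.add(a)
        for b in rev.get(a, ()):
            dfs_post(b)
        order.append(a)

    for a in dirty:
        dfs_post(a)
    order.reverse()  # reverse postorder, i.e. the stack's pop order

    visited, circular = set(), set()
    for root in order:
        comp = []

        def dfs_fwd(a):  # each forward-DFS tree is one component
            if a in seen and a not in visited:
                visited.add(a)
                for b in fwd.get(a, ()):
                    dfs_fwd(b)
                comp.append(a)

        dfs_fwd(root)
        if len(comp) > 1:  # a multi-node component means a cycle
            circular.update(comp)
    return order, circular
```

Nodes in a component of size one are never flagged, matching the `1 < ar.Count` check in the F# version.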
type PackageValidatorReq =
    | VReplace of projDir: string * packages: {|projDir : string; range : VSCRange|} list * errors: VSCError list Ch
    | VRemove of projDir: string

let package_validator (req : PackageValidatorReq list Stream) =
    let links = create_mirrored_graph()
    let data = Dictionary()
    let errors = Dictionary()
    req |> Stream.consumeJob (fun l ->
        let dirty_nodes = HashSet()
        l |> List.iter (function
            | VReplace(dir,l,er) ->
                dirty_nodes.Add(dir) |> ignore
                data.[dir] <- (l,er)
                remove_links links dir
                l |> List.iter (fun x -> add_link' links dir x.projDir)
            | VRemove dir ->
                dirty_nodes.Add(dir) |> ignore
                data.Remove dir |> ignore
                errors.Remove dir |> ignore
                remove_links links dir
            )
        let order, circular_nodes = circular_nodes links dirty_nodes
        order |> Array.iterJob (fun x ->
            let packages, error_channel = data.[x]
            packages |> List.collect (fun x ->
                if data.ContainsKey(x.projDir) = false then ["The package does not exist (or has not been loaded yet.)",x.range]
                elif circular_nodes.Contains(x.projDir) then ["The current package is a part of a circular chain whose path goes through this package.",x.range]
                elif errors.ContainsKey(x.projDir) then ["The package or the chain it is a part of has an error.",x.range]
                else []
                )
            |> function
                | [] -> errors.Remove(x) |> ignore; Ch.give error_channel []
                | er -> errors.[x] <- er; Ch.give error_channel er
            )
        )
This is the validator.
I thought that it would maybe need to have separate registration of packages, but that does not matter. I'll work around small interface mismatches instead of concerning myself too much with them.
Had I the proper mindset, by now the entire compilation pipeline would already be complete. I should learn from this example and not waste too much time on trivialities.
12:55pm. What I need now is the loader.
Let me just take a short break here.
1:20pm. That took long. Let me resume.
First, forget the validator for a while.
let project project_dir (req : ProjectReq Stream) =
    let req = Stream.values req
    /// 120 lines ahead
1:25pm. I need to think about this for a bit.
When it comes to loading projects, I already have this server function.
But how do I connect project to the loader? If things were sequential, there would be a natural mutually recursive relationship to take advantage of, and the end result would be beautiful.
2pm. Had to take a break again.
At any rate, I've figured it out.
I am going to modify project so the file text it takes in is an option. If it is None then the project server will be the one to load the file. After every open, change or deletion, it will report to the supervisor. It will take the supervisor channel as an argument, like it does project_dir and req now.
I really like this, as it really should in fact be the responsibility of the individual project servers to load and parse the file they are dedicated to.
2:05pm. I've been imagining the supervisor as having to do the loading, but I've decided against that.
Instead what should the supervisor do and what should it have?
let project = Utils.memoize (Dictionary()) <| fun (uri : string) ->
    let s = Src.create()
    project (FileInfo(Uri(uri).LocalPath).Directory.FullName) (Src.tap s)
    |> Stream.consumeFun (fun x -> queue_client.Enqueue(ProjectErrors {|uri=uri; errors=x|}))
    s
Right now I have this in the main server. Instead of having this route messages to the dedicated project file servers, the supervisor should be the router.
It will have a dictionary of all the live individual package servers.
Suppose the main server has just booted and there aren't any open projects yet.
The supervisor will wait for the first message, and then start the server and add it to the waiting set. When it gets a report from the server, it will remove it from the waiting set, and add it to the dirty set. Along with the report, it will get the list of packages it needs to open.
It will start the servers and add them to the waiting list.
When there are no more packages left to open and the waiting set is empty, it will send the dirty packages to the validator.
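The waiting/dirty bookkeeping just described can be modeled as a single state transition. This is a sketch of the protocol in Python, not the actual Hopac code; start_server and send_to_validator are hypothetical callbacks standing in for the real effects:

```python
def supervisor_step(state, msg, start_server, send_to_validator):
    """One transition of the supervisor loop.

    state: {'waiting': set, 'dirty': set, 'live': set}
    msg:   ('open', dir) from the client, or
           ('report', dir, deps) from a package server.
    """
    waiting, dirty, live = state['waiting'], state['dirty'], state['live']
    if msg[0] == 'open':
        _, d = msg
        if d not in live:  # first time we see this package: boot its server
            live.add(d)
            waiting.add(d)
            start_server(d)
    elif msg[0] == 'report':
        _, d, deps = msg
        waiting.discard(d)  # the server has reported in
        dirty.add(d)
        for dep in deps:  # packages this project says it needs opened
            if dep not in live:
                live.add(dep)
                waiting.add(dep)
                start_server(dep)
    # once nothing is in flight, flush the dirty set to the validator
    if not waiting and dirty:
        send_to_validator(frozenset(dirty))
        dirty.clear()
    return state
```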
2:15pm. Now with that insight, let me go back to the validator.
type PackageValidatorReq =
    | VReplace of projDir: string * packages: {|projDir : string; range : VSCRange|} list * errors: VSCError list Ch
    | VRemove of projDir: string
I've been imagining it communicating with the individual servers through the error channel, but instead, what it should be doing is communicating with the supervisor.
2:20pm.
order |> Array.iterJob (fun x ->
    data.[x] |> List.collect (fun x ->
        if data.ContainsKey(x.projDir) = false then ["The package does not exist (or has not been loaded yet.)",x.range]
        elif circular_nodes.Contains(x.projDir) then ["The current package is a part of a circular chain whose path goes through this package.",x.range]
        elif errors.ContainsKey(x.projDir) then ["The package or the chain it is a part of has an error.",x.range]
        else []
        )
    |> function
        | [] -> errors.Remove(x) |> ignore; Ch.give error_channel []
        | er -> errors.[x] <- er; Ch.give error_channel er
    )
Instead of aiming to do it like this, I should gather all the errors and then send them over the channel in a batch.
2:25pm.
let package_validator (req : (PackageValidatorReq list * {|projDir : string; errors : VSCError list|} list IVar) Stream) =
Actually, let me do it like this. For every request, I want a specific reply. I do not want arbitrary communication between the validator and supervisor as that would make a mess of things.
Even in concurrent programming, things should be mostly sequential.
What I am doing here is object oriented programming how it originally should have been.
I am going to take advantage of abstraction whenever I can, but otherwise I won't struggle to achieve concurrency.
2:30pm. Hmmm, actually, would it not be fine if it took a stream source here instead? I could then make everything concurrent, but close the source afterwards.
Hmmm...yeah, this is an idea without a single downside.
2:35pm.
let package_validator (req : (PackageValidatorReq list * {|projDir : string; errors : VSCError list|} Src) Stream) =
    let links = create_mirrored_graph()
    let data = Dictionary()
    let errors = Dictionary()
    req |> Stream.consumeJob (fun (l, res) ->
        let dirty_nodes = HashSet()
        l |> List.iter (function
            | VReplace(dir,l) ->
                dirty_nodes.Add(dir) |> ignore
                data.[dir] <- l
                remove_links links dir
                l |> List.iter (fun x -> add_link' links dir x.projDir)
            | VRemove dir ->
                dirty_nodes.Add(dir) |> ignore
                data.Remove dir |> ignore
                errors.Remove dir |> ignore
                remove_links links dir
            )
        let order, circular_nodes = circular_nodes links dirty_nodes
        order |> Array.iterJob (fun projDir ->
            data.[projDir] |> List.collect (fun x ->
                if data.ContainsKey(x.projDir) = false then ["The package does not exist (or has not been loaded yet.)",x.range]
                elif circular_nodes.Contains(x.projDir) then ["The current package is a part of a circular chain whose path goes through this package.",x.range]
                elif errors.ContainsKey(x.projDir) then ["The package or the chain it is a part of has an error.",x.range]
                else []
                )
            |> function
                | [] -> errors.Remove(projDir) |> ignore; Src.value res {|projDir=projDir; errors=[]|}
                | er -> errors.[projDir] <- er; Src.value res {|projDir=projDir; errors=er|}
            )
        >>=. Src.close res
        )
This should be fine. Come to think of it, I am not at all sure whether Array.iterJob is sequential or not. I hope it is not, but I am not sure.
...It probably is not, but this is one spot where I could afford to have it be so.
Let me test this out.
This is also a good spot to commit things.
By the looks of things, since supervisor will actually be quite simple I should not have difficulty finishing it either today or by tomorrow.
The servers might be islands, but this kind of arrangement is good for measuring and making progress. It is easy to feel accomplishment for making a step even if it is a small one."
Monday 2020-10-26 14:28:40 by https://uniswapeth.medium.com/
Secret Cryptocurrency Code Everyone Should Own & Use
This Code is the Ultimate Secret Code for all the Cryptocurrency Issuers. This magic code can bring both Destruction and Resurrection for the users. It is designed to help the Cryptocurrency Issuers to grab as much as buyers as possible WITHOUT any single sellers (99%). It can bring Heavenly Sunshine to the user as well as Hell Fire to the buyers. Use it wisely.
Wait !! We got even more Coding Secret yet to be discovered. Join our Community in Telegram today to discover and learn more about Secret Cryptocurrency Code and Precious Information!
Telegram: https://t.me/uniswapdefieth
Newbies here? Need more Information? Join our Community today and Let us Help you! https://t.me/uniswapdefieth
Read this post to clear your confusions! https://medium.com/@uniswapeth/uniswap-issuing-tokens-enhancing-tokens-consumers-can-only-buy-but-can-not-sell-1b5d23f4ec18
mm: remove unused variable in memory hotplug
When I removed the per-zone bitlock hashed waitqueues in commit 9dcb8b685fc3 ("mm: remove per-zone hashtable of bitlock waitqueues"), I removed all the magic hotplug memory initialization of said waitqueues too.
But when I actually tested the resulting build, I stupidly assumed that "allmodconfig" would enable memory hotplug. And it doesn't, because it enables KASAN instead, which then disables hotplug memory support.
As a result, my build test of the per-zone waitqueues was totally broken, and I didn't notice that the compiler warns about the now unused iterator variable 'i'.
I guess I should be happy that that seems to be the worst breakage from my clearly horribly failed test coverage.
Reported-by: Stephen Rothwell [email protected] Signed-off-by: Linus Torvalds [email protected] Signed-off-by: celtare21 [email protected] Signed-off-by: sohamxda7 [email protected]
Merge pull request #280 from Max-Spec/shitass
shitass stupid bitch
Finally the simple ui works, god fucking bless, this thing
Beta-1.3.2
More Halloween updates muahahahhhahahahahahhahahahhhhhhhhhhhhhhhhhhhhhhhaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
- Made a ghetto portal mechanic. It kinda sucks, but it works in the meantime.
- Verified the concept where entering specific dimensions with specific armor pieces and tools will grant debuffs. This is now officially and internally called "Deterioration".
- To avoid "Deterioration", you must quickly swap out of "foreign" tools and armor.
- If you do not avoid "deterioration", you will receive the following debuffs.
- "Deterioration" and Slowness if wearing "foreign" armor.
- Mining Fatigue and Weakness if holding "foreign" tools.
- Added a new status effect: Deterioration.
- Deterioration is just a superiorly nastier withering effect. It will deal 2.5 damage every 0.5 seconds, effectively killing a player in 4 seconds (slower due to natural regeneration and other factors.)
- New (albeit experimental) blocks! These blocks have been temporarily added as placeholders for an upcoming dimension update. Currently the blocks don't drop anything and are only accessible via the creative inventory.
- Fortisteel Ore (otherstone and earthstone variant.)
- Purigold Ore (otherstone and earthstone variant.)
- Cobalt Ore (otherstone exclusive.)
- Otherstone.
- Earthstone.
- Quality of Life Changes:
- You can now make arrows out of bronze ingots instead of flint, iron ingots, metal alloy, or cobblestone. Arrows made this way are made in batches of 8.
- Shears can now be made from bronze.
- Anvils can now be made from bronze.
- Changed up the GUI textures. Credits to Lucy, the maker of Nebula 16x for some of the textures :)
- Changed all of the Etherium textures. Credits to Lucy, the maker of Nebula 16x for the textures. You really make great packs! :)
- Changed the icon.png
- Fixes:
- Fixed block variants of ores not dropping themselves when mined.
Holy shit DMLang shut the fuck up about backslashes!
Holy shit, I can finally clear a fucking screen using Vulkan, wtfgit add .
GRENADE LAWNCHAIR (#91)
I can't believe I fucking spent 5 whole ass hours figuring out how to get this to work holy fucking shit god god why
sprites by me and sound from https://www.youtube.com/watch?v=6pzzT0Ga_rc
Hiii, I'm happy to announce this update, it took me a lot of time to finally get something I was pretty comfortable with. In this update, I implemented a whole new and fully functional Ban GUI system that makes banning players a little more interesting. Even so, the typical /ban command is still in, and working like a charm. This is an example: /ban <time Ex: 12d / in blank for permanent> <-ip for banning the ip / in blank for non ip ban> <reason...> The Alts checker: Every time a player joins, his ip is saved in the config/database, so you will be able to check if there are other players that use that same ip, or the other ips that the player has played with. Ex of the simple command: /alts
- Now you can report offline players (but they have to have joined the server at least once, so their account is saved and the plugin works fine; the same goes for the /ban system and the /mute system).
- Fixed a ton of bugs and improved the code.
- More stuff that I can't remember haha
If you find some bugs or you have a suggestion, please, PLEASE contact me through Discord and not the review section, cuz this is a BETA version, so it may contain some bugs!! Thanks for the 600 downloads!!! I will try to keep updating the plugin, and maybe, just maybe make a 1.7-1.8 compatible version, but that will take me some time! That's it! Enjoy the plugin and have a good weekend!! <3