The like and get likes thread II


(Ian Morbius) #5301

The Return of Tail o’ the Pup!


(Ian Morbius) #5302

In the year 1984.

David Bowie - 1984 (Written 1973.)


(Yiole Gionglao) #5303

Beddy time has come for me… so I say: nighties lovelies!

Also: Ryugu and Hayabusa 2 are back from the other side of the Sun!

Now that communications are good again, Hayabusa 2 and its pets will keep maeking da szienz.


(Nana Skalski) #5304

Tandy is dandy. :ok_hand:

The author of that channel even makes games on old computers! :+1:

I like this channel. So much lore about computers, and how to repair and restore them. If my C64 ever breaks, I'll probably repair it while watching this channel.

We will live as in science fiction.


(LordOdysseus) #5305

Timezone lovelies. :heart:


(LordOdysseus) #5306

https://www.qwertee.com/product/metal-fest

When I was 5 years old, I got a lump of coal from Santa…
The next year I decided to make him pay for it and poisoned his cookies. Somehow, the bastard found out and killed my dad.


(Nana Skalski) #5307

That damned bastard Santa! :confounded:


(lilsteel) #5308

… is an ID10T problem.

What is an ID10T problem?
The ID of an ID10T problem is 10T, as in 10 ton hammer.

You don’t need to see my identification



They were on their way to the (Mos Eisley Chalmun’s) Cantina for a drink (or something).

http://info.cern.ch


(Nana Skalski) #5309


(Yiole Gionglao) #5310

Nice t-shirt, some of those are really heavy.


(Yiole Gionglao) #5311

Quality Construction ™

Looks better than EVE’s chat, though: he didn’t punch a hole through the wall and fall outside! :rofl:


(Nana Skalski) #5312


(lilsteel) #5313

Do you do object orientation, or can you do it yourself? (Programming.)

(Yiole Gionglao) #5314

Off to bed! Nighties lovelies!

Also: a classic :rofl:


(Nana Skalski) #5315

Levitating dog. :joy:

Good one. Hadn't seen that one. Nini. :sleepy:


(RA1N D33R) #5316

OMAE WA MOU SHINDEIRU


(lilsteel) #5317

Also, computers and automated driving programs are not moral agents, and they should not be designed to make moral decisions in accidents.
Most likely the program will not have time to decide which action will result in a loss of life, so that loss of life would end up being decided under the control of the self-driving program.

Self-help mental health books are also not meant to be interpreted as a method for a self-driving car to decide who is the best target to kill in an accident.

No, in fact, it should not try to kill anyone; death should only ever happen by accident, outside the system's control, and the machine should not be programmed to allow itself to do so.
In other words, the only time it could occur is when the system has lost control of safety badly enough to cause death by accident.

These programs are not intended for suicide bombers, nor are they intended to detect who suicide bombers are.

However, if one is detected, and when it is, the system should or could be able to address the source of the problem.


(LordOdysseus) #5318

Timezone lovelies. :heart:

Today is the last day of 2018.


(Yiole Gionglao) #5319

Don’t worry, I think they have plans to start a new year right tomorrow. :thinking:


(Nana Skalski) #5320


Complete whiteout.

Timezone everyone :kissing_heart: