Quotes! Quotes! Quotes! Part 5

More trippy quotes generated by my new Inspirational Quotes 774M model

Photo by Rodion Kutsaev on Unsplash

Curator’s note:

Here are some more trippy quotes my latest “Inspirational Quotes” 774M GPT-2 model generated recently. Enjoy!


It helps you stopped deteriorating yourself.


Although dead waves Hammer ****s nowadays, Think how you’d shine Offense Science!


Do not regret the memories of your past nor the future mattered.


Believe people are more tolerant but loving people have zero tolerance.


If all mankind had been wisdom, no man would reach without sacrifice.


There aren’t time for nice we do not tolerate cruel consequences.


Whatever I can do or dream I can, begin it.


Such comfort zone


Depanda is better than neutrality.


Some tokens things are purely financial

[END TRANSMISSION]


Editor’s Notes

Written by AI, using my random-temp script and my quotes-774M model, which is a fine-tune of GPT-2 774M. Selected from among the recently generated quotes by me, a human.
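
For the curious: the settings blocks below look like keyword arguments to gpt_2_simple's generate function, so a minimal sketch of one random-temperature run might look like this (the temperature range is an assumption; the run name and sampling parameters are taken from the settings listed further down):

import random
import gpt_2_simple as gpt2

# Load the fine-tuned 774M checkpoint by its run name.
sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="model-quotes-774M-run1")

# Draw a random temperature for this call (the 0.9-1.5 range is an assumption),
# then generate a single candidate quote.
temperature = random.uniform(0.9, 1.5)
quotes = gpt2.generate(
    sess,
    run_name="model-quotes-774M-run1",
    return_as_list=True,
    length=20,
    top_k=80,
    top_p=0.9,
    truncate="<|endoftext|>",
    nsamples=1,
    batch_size=1,
    prefix="<|startoftext|>",
    temperature=temperature,
)
print(quotes[0])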

Edits

Censored one word. Deleted a subphrase from another quote that the model had copied from the dataset, sneaking in under the plagiarism detector's word limit.

Plagiarism Checked

Quotes were pre-screened before selection with Plagiarism-Basic, checking against the dataset for duplicate strings, mostly 3 words long and in some cases up to 8.
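
The gist of that pre-screen is a word n-gram overlap check against the dataset. A minimal sketch of the idea (not the actual Plagiarism-Basic code; the dataset file name and function name are assumptions):

def word_ngrams(text, n=3):
    """Return the set of lowercase n-word phrases in a piece of text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

# Build the dataset's 3-gram index once (the file name is an assumption).
with open("quotes_dataset.txt") as f:
    dataset_ngrams = word_ngrams(f.read(), n=3)

# Keep only candidate quotes that share no 3-word phrase with the dataset;
# rerunning with a larger n gives the looser 8-word check mentioned above.
candidates = ["Whatever I can do or dream I can, begin it."]
kept = [q for q in candidates
        if word_ngrams(q, n=3).isdisjoint(dataset_ngrams)]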

GPT-2 Settings

Quote 1

{
  "return_as_list": true,
  "length": 20,
  "top_k": 80,
  "top_p": 0.9,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "batch_size": 1,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.5000000000000007
}

Quote 2

{
  "return_as_list": true,
  "length": 20,
  "top_k": 80,
  "top_p": 0.9,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "batch_size": 1,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.5000000000000007
}

Quote 3

{
  "return_as_list": true,
  "length": 50,
  "top_k": 80,
  "truncate": "<|endoftext|>",
  "nsamples": 300,
  "batch_size": 50,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 0.9700000000000004
}

Quote 4

{
  "return_as_list": true,
  "length": 50,
  "top_k": 80,
  "truncate": "<|endoftext|>",
  "nsamples": 300,
  "batch_size": 50,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.3400000000000007
}

Quote 5

{
  "return_as_list": true,
  "length": 30,
  "top_k": 40,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.4000000000000006
}

Quote 6

{
  "return_as_list": true,
  "length": 50,
  "top_k": 80,
  "truncate": "<|endoftext|>",
  "nsamples": 300,
  "batch_size": 50,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.2300000000000006
}

Quote 7

{
  "return_as_list": true,
  "length": 50,
  "top_k": 80,
  "truncate": "<|endoftext|>",
  "nsamples": 300,
  "batch_size": 50,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 0.9900000000000004
}

Quote 8

{
  "return_as_list": true,
  "length": 20,
  "top_k": 80,
  "top_p": 0.9,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "batch_size": 1,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.2200000000000004
}

Quote 9

{
  "return_as_list": true,
  "length": 20,
  "top_k": 80,
  "top_p": 0.9,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "batch_size": 1,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.4300000000000006
}

Quote 10

{
  "return_as_list": true,
  "length": 20,
  "top_k": 80,
  "top_p": 0.9,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "batch_size": 1,
  "run_name": "model-quotes-774M-run1",
  "prefix": "<|startoftext|>",
  "temperature": 1.3600000000000005
}