Stealth Box

It’s not caught on the tracker.

Photo by Sam Moqadam on Unsplash

PROMPT: All fingers and thumbs

All fingers and thumbs tied up behind his head as he brought the plane down on the concrete. Bischofsky’s studio wheels spun as the thrust raised the plane until they were in position to lock into the vertical, and then swung them back around to the fifth level from the bottom.

“Dolph’s got a touch screen set up in the wall,” said Schuyler as the plane came to rest, rolling to a stop. “There’s a head in the back.”

“Thanks.” Bischofsky laid his hand on the box’s surface. “It’s not caught on the tracker.”

“It will.” The chair’s metal frame creaked against the concrete as Bischofsky worked on the seat. “Hooks up the setting. You got the set around here, you’ll have a direct feed into the hangar control center.”

“Okay.” He settled himself in the chair. “What’re you going to set up?”

“You said we’re to meet Bischofsky.” An odd note of pride crept into her voice. “So when he shows up—you’re going to get a full defense.”

“And me in the crew,” said Schuyler. “I’ll be on the screen with them.”

“You got it.” She lifted her hand and looked down at the floor, then back up at him. “Right. You got it.”

He nodded. “Be there.”

She turned and headed for the train. Schuyler, slowly moving toward the door, followed her.

[END TRANSMISSION]


Editor’s Notes

Written by AI using my multi-temp script and my cyberpunk fiction model, a fine-tune of GPT-2 774M. Chosen from among other recently generated stories by me, a human.

Prompt

“All fingers and thumbs”

The script chose it at random from a list of phrases.
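The pick itself is just a random draw; a minimal sketch of the idea (the entries below are placeholders, not the actual list, and the real script isn't necessarily Python):

import random

# Placeholder entries -- the real phrase list is much longer.
phrases = [
    "All fingers and thumbs",
    "Against the grain",
    "Back to square one",
]
prompt = random.choice(phrases)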

Edits

No edits :)

Title

The title was derived by me, a human, from the generated text.

Plagiarism Checked

Plagiarism was checked using random phrase searches in my editor against the dataset. The cyberpunk model's dataset currently makes my computer run out of RAM, even in my new Rust plagiarism workflow, which needs to be refactored to process the dataset in chunks since it's too large to load at once. I'm working on that now.
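The chunked version amounts to streaming the dataset in fixed-size pieces with a small overlap so a phrase can't hide across a chunk boundary. The actual workflow is Rust, but here's the idea as a short Python sketch (file name and chunk size are placeholders):

def phrase_in_dataset(phrase, path="dataset.txt", chunk_chars=1 << 20):
    """Stream the dataset in chunks instead of loading it all into RAM."""
    overlap = len(phrase) - 1  # carry this much text across chunk edges
    tail = ""
    with open(path, encoding="utf-8") as f:
        while True:
            chunk = f.read(chunk_chars)
            if not chunk:
                return False  # reached EOF without a match
            if phrase in tail + chunk:
                return True
            tail = chunk[-overlap:] if overlap else ""

Each random phrase from the story gets run through something like this; peak memory stays at roughly one chunk plus the overlap, no matter how large the dataset grows.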

GPT-2 Settings

{
  "return_as_list": true,
  "length": 500,
  "top_k": 500,
  "top_p": 0.9,
  "truncate": "<|endoftext|>",
  "nsamples": 1,
  "batch_size": 1,
  "run_name": "model-cyberpunk-run1",
  "prefix": "All fingers and thumbs",
  "temperature": 1.0
}
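
These settings are the keyword arguments to the generation call; with gpt-2-simple (which these parameter names match, though the exact script isn't shown here), a run would look roughly like:

import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="model-cyberpunk-run1")  # the fine-tuned checkpoint

texts = gpt2.generate(
    sess,
    run_name="model-cyberpunk-run1",
    return_as_list=True,        # hand back strings instead of printing
    length=500,                 # tokens to generate
    top_k=500,                  # sample from the 500 most likely tokens
    top_p=0.9,                  # nucleus sampling cutoff
    truncate="<|endoftext|>",   # cut each sample at the end-of-text token
    nsamples=1,
    batch_size=1,
    prefix="All fingers and thumbs",
    temperature=1.0,
)
print(texts[0])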