Summary

Sorry for the lazy blog post today. I just asked GPT-3 for some subtopics of ancient Roman law, as I was looking for a cool word to use. I wanted to know what these words mean, so I used GPT-3 for that too.

Subtopics of Ancient Roman Law

These were generated by GPT-3.

aedilitas, advocatus, auctoritas, augur, auspex, caupona, cena, clientela,
contio, domus, ius, ludos, ministra, mos, ora, otium, praetor, quaestio,
res mancipi, sacerdos, status, suovetaurilia, tabella, tribunus plebis,
via, vir

GPT-3 Language detection and translation

Detected language: Latin

English translation:

aedile, advocate, authority, augur, inn, auctioneer, dinner, clientele,
assembly, law, games, officer, custom, feast, judge, quest, mancipium,
priest, status, sacrifice, suovetaurilia, table, plebeian tribune, road,
man

These are not lined up perfectly, but the text generator did a great job.

Latin            English
aedilitas        aedile
advocatus        advocate
auctoritas       authority
augur            augur
auspex           inn
caupona          auctioneer
cena             dinner
clientela        clientele
contio           assembly
domus            law
ius              games
ludos            officer
ministra
mos              custom
ora              feast
otium            judge
praetor          quest
quaestio         mancipium
res mancipi      priest
sacerdos
status           status
suovetaurilia    sacrifice
tabella          suovetaurilia
tribunus plebis  table
via              plebeian tribune
vir              road
                 man
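The drift in the table is easy to see if you zip the two raw lists together. A quick sketch in Python (the list variables here are my own, not from the original prompt output):

```python
from itertools import zip_longest

latin = ["aedilitas", "advocatus", "auctoritas", "augur", "auspex",
         "caupona", "cena", "clientela", "contio", "domus", "ius",
         "ludos", "ministra", "mos", "ora", "otium", "praetor",
         "quaestio", "res mancipi", "sacerdos", "status",
         "suovetaurilia", "tabella", "tribunus plebis", "via", "vir"]

english = ["aedile", "advocate", "authority", "augur", "inn",
           "auctioneer", "dinner", "clientele", "assembly", "law",
           "games", "officer", "custom", "feast", "judge", "quest",
           "mancipium", "priest", "status", "sacrifice",
           "suovetaurilia", "table", "plebeian tribune", "road", "man"]

# zip_longest pads the shorter list, so the off-by-one drift at the
# end shows up as an empty cell instead of being silently dropped.
pairs = list(zip_longest(latin, english, fillvalue=""))
for la, en in pairs:
    print(f"{la:<16} {en}")
```

The pairing starts out correct (`aedilitas` / `aedile`) and goes wrong around `auspex` / `inn`, where the translation list slips a position.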

Prompts

Translate anything to English

title: "Translate to English"
prompt-version: 1
doc: "This prompt translates text in any world language to English"
prompt: |+
    The following passages are translated into English.
    ###
    La clase Monad esta disenada para
    describir tipos/estructuras de datos en
    los que sea necesario manejar un contexto
    de forma transparente al usuario.
    Original language: Spanish.
    English translation:
    The Monad class is designed to describe
    data types / structures in which it is
    necessary to handle a context in a
    transparent way to the user.
    ###
    Puis-je avoir une baguette s'il vous plaît?
    Original language: French
    English translation:
    Could I have a baguette please?
    ###
    Der Begriff deutsch als Bezeichnung für das Volk der Deutschen, die deutsche Sprache bzw. Deutschland verfügt in den verschiedenen Sprachen der Welt über untereinander ähnliche wie auch äußerst unterschiedliche Wörter.
    Original language: German
    English translation:
    The term German as a term for the people of the Germans, the German language or Germany has similar and extremely different words in the various languages of the world.
    ###
    <1>
    <:pp>Original language:    
engine: "davinci"
temperature: 0.5
max-tokens: 200
top-p: 1.0
frequency-penalty: 0.5
# If I make presence-penalty 0 then it will get very terse
presence-penalty: 0.0
best-of: 1
stop-sequences:
- "###"
inject-start-text: yes
inject-restart-text: yes
chomp-start: on
chomp-end: off
show-probabilities: off
vars:
- "text"
examples:
- "Pero, que es el contexto?"
external: ""
conversation-mode: no
filter: no
# Keep stitching together until reaching this limit
# This allows a full response for answers which may need n*max-tokens to reach the stop-sequence.
stitch-max: 0
needs-work: no
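The `<1>` in the prompt body is a placeholder for the variable listed under `vars`. A minimal sketch of how such a template might be filled before being sent off for completion (the function name and simplified template here are my own, not the actual implementation, and I have left out the `<:pp>` marker):

```python
def fill_prompt(template: str, *args: str) -> str:
    """Substitute <1>, <2>, ... placeholders with positional arguments."""
    for i, arg in enumerate(args, start=1):
        template = template.replace(f"<{i}>", arg)
    return template

# A cut-down version of the translation prompt above.
template = (
    "The following passages are translated into English.\n"
    "###\n"
    "<1>\n"
    "Original language:"
)

prompt = fill_prompt(template, "Pero, que es el contexto?")
print(prompt)
```

The completed prompt ends at "Original language:", so the model's continuation supplies the detected language and then the translation, with "###" as the stop sequence.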

English to Japanese

A prompt for translating to or from a specific language may be more reliable in particular cases.

title: "English to Japanese"
prompt-version: 2
prompt: |+
    Translate to Japanese.

    English: Give me a home among the gumtrees.
    Japanese: ガムツリーの中に家をください。
    ###
    English: Could I have a baguette please?
    Japanese: バゲットを頂けますか?
    ###
    English: See you later!
    Japanese: じゃあまたね!
    ###
    English: <1>
    Japanese:    
engine: "davinci"
temperature: 0.5
max-tokens: 100
top-p: 1.0
frequency-penalty: 0.0
# If I make presence-penalty 0 then it will get very terse
presence-penalty: 0.0
best-of: 1
stop-sequences:
- "###"
inject-start-text: "\n"
inject-restart-text: "\n"
show-probabilities: off
vars:
- "english prose"
examples:
- "I love you."
external: ""
filter: no
# Keep stitching together until reaching this limit
# This allows a full response for answers which may need n*max-tokens to reach the stop-sequence.
stitch-max: 0
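The `stitch-max` option deserves a quick illustration: the idea is to keep requesting more tokens and appending the chunks until the stop sequence turns up, since a single call capped at `max-tokens` may cut the answer short. A rough sketch with a stubbed-out completion function (everything here is hypothetical, not the real implementation):

```python
def stitch_completion(complete, prompt, stop="###", stitch_max=5):
    """Repeatedly call `complete` and append its output until the
    stop sequence appears or `stitch_max` rounds are used up."""
    out = ""
    for _ in range(stitch_max):
        chunk = complete(prompt + out)
        out += chunk
        if stop in out:
            return out.split(stop, 1)[0]  # trim at the stop sequence
    return out

# Stub standing in for a max-tokens-limited API call: each call
# returns the next short chunk of the answer.
chunks = iter(["じゃあ", "またね!", "\n###\nEnglish:"])
result = stitch_completion(lambda p: next(chunks),
                           "English: See you later!\nJapanese:")
print(result)
```

Here the stop sequence only arrives in the third chunk, so a single capped call would have returned half a sentence; stitching recovers the full "じゃあまたね!" before trimming at "###".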