I asked OpenSSL to generate a dummy Ed25519 private key for me and got this output:
https://lapo.it/asn1js/#MC4CAQAwBQYDK2VwBCIEIJCO9eKZEUOmL9CGfecuKqvYU_hLTAFXwl0Ipd8xNXbP
It decodes to the following:
SEQUENCE (3 elem)
  INTEGER 0
  SEQUENCE (1 elem)
    OBJECT IDENTIFIER 1.3.101.112 curveEd25519 (EdDSA 25519 signature algorithm)
  OCTET STRING (34 byte) 0420908EF5E2991143A62FD0867DE72E2AABD853F84B4C0157C25D08A5DF313576CF
    OCTET STRING (32 byte) 908EF5E2991143A62FD0867DE72E2AABD853F84B4C0157C25D08A5DF313576CF
How does a DER parser know when an OCTET STRING contains DER within it? The inner OCTET STRING's header is 04 20, declaring 32 bytes of content, while the "parent" starts with 04 22: an OCTET STRING whose 34-byte payload consists of the "child" header 04 20 plus those 32 bytes. There doesn't seem to be any flag saying "there is a DER-encoded subobject inside here".
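To check my understanding I hand-parsed the key. This is a minimal sketch, not a general DER parser: it assumes short-form lengths (under 128 bytes), which is all this key uses. The second parse at the end happens only because I told it to, not because anything in the bytes asked for it:

```python
import base64

# Same key as above; the lapo.it URL carries URL-safe base64.
der = base64.urlsafe_b64decode(
    "MC4CAQAwBQYDK2VwBCIEIJCO9eKZEUOmL9CGfecuKqvYU_hLTAFXwl0Ipd8xNXbP"
)

def parse_tlv(data, offset=0):
    # Minimal DER tag-length-value reader; short-form lengths only.
    tag = data[offset]
    length = data[offset + 1]
    end = offset + 2 + length
    return tag, data[offset + 2 : end], end

# Outer SEQUENCE (tag 0x30), then its three children.
tag, body, _ = parse_tlv(der)
assert tag == 0x30

children, off = [], 0
while off < len(body):
    t, v, off = parse_tlv(body, off)
    children.append((t, v))

(_, version), (_, alg), (priv_tag, priv) = children
print(hex(priv_tag), priv.hex())  # the 34-byte "parent" payload, 0420...

# Nothing in the encoding marks `priv` as DER. Parsing it again is a
# choice the schema dictates (PrivateKeyInfo wrapping an inner
# CurvePrivateKey OCTET STRING for Ed25519), not the bytes.
inner_tag, inner, _ = parse_tlv(priv)
print(hex(inner_tag), inner.hex())  # the 32-byte "child"
```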
OpenSSL dumps it like this:
myria@MYRIA:~$ openssl asn1parse -i -in dummy-private.pem
0:d=0 hl=2 l= 46 cons: SEQUENCE
2:d=1 hl=2 l= 1 prim: INTEGER :00
5:d=1 hl=2 l= 5 cons: SEQUENCE
7:d=2 hl=2 l= 3 prim: OBJECT :ED25519
12:d=1 hl=2 l= 34 prim: OCTET STRING [HEX DUMP]:0420908EF5E2991143A62FD0867DE72E2AABD853F84B4C0157C25D08A5DF313576CF
indicating that OpenSSL sees only the "parent" OCTET STRING and treats its 34 bytes as an opaque hex dump.
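I did find that OpenSSL can be told to take a second pass explicitly: asn1parse has a -strparse <offset> flag that re-parses the content of the string at that file offset as DER. A sketch, recreating the same key as a PEM file (PEM bodies use standard base64, so the URL-safe _ becomes /); offset 12 is where the parent OCTET STRING sits in the dump above:

```shell
cat > dummy-private.pem <<'EOF'
-----BEGIN PRIVATE KEY-----
MC4CAQAwBQYDK2VwBCIEIJCO9eKZEUOmL9CGfecuKqvYU/hLTAFXwl0Ipd8xNXbP
-----END PRIVATE KEY-----
EOF
# Re-parse the OCTET STRING at offset 12 as DER; this should surface
# the inner 32-byte OCTET STRING that the plain dump leaves opaque.
openssl asn1parse -i -in dummy-private.pem -strparse 12
```

So the second pass is always something the caller opts into; the parser never infers it.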