Topic: Large records in file (Read 1315 times)
michaelgreer (Diamond Member, Posts: 124)
Large records in file
October 25, 2021, 10:50:18 AM
I am trying to create a file with a record size >32K. I am using this:
keyed "myfile",[1:1:16:"+"],0,-50000,opt="X"
This throws an error 41. What am I missing?
Mike King (Diamond Member, Posts: 3773)
Re: Large records in file
Reply #1 on: October 25, 2021, 12:57:26 PM
Set the record size to -30000 and use the OPT="X" (as you have specified).
The -30000 indicates that internally the system will split the records into 30000-byte chunks that are linked together, due to the OPT="X".
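Spelled out as a statement, a minimal sketch of that suggestion (same file name and key layout as the original post; the only change is the record size going from -50000 to -30000):

! negative record size plus OPT="X": the file accepts large logical
! records, stored internally as linked 30000-byte chunks
keyed "myfile",[1:1:16:"+"],0,-30000,opt="X"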
michaelgreer (Diamond Member, Posts: 124)
Re: Large records in file
Reply #2 on: October 26, 2021, 04:48:28 PM
Thanks Mike. Not sure I would ever have figured *that* out.
Now, I just did keyed "myfile",[1:1:16:"+"],0,-30000,opt="X". I dimmed a$(35000). When I write to a channel with myfile opened - write(lfo)*,a$ - I get an error 1.
Last Edit: October 26, 2021, 04:52:36 PM by michaelgreer
Mike King (Diamond Member, Posts: 3773)
Re: Large records in file
Reply #3 on: October 26, 2021, 05:09:18 PM
What version of PxPlus are you using? I just tried this here and it worked on all supported releases (PxPlus 2018 v15 -> PxPlus 2021 v18).
PxPlus-2018 Web (Ver:15.10/MS-WINDOWS) Serno:1510-001-xxxxxxx
(c) Copyright 2005-2018 PVX Plus Technologies Ltd. (All rights reserved)
Website:
http://www.pvxplus.com
->keyed "myfile",[1:1:16:"+"],0,-30000,opt="X"
->dim a$(35000)
->open (1) "myfile"
->write (1) *,a$
->
michaelgreer (Diamond Member, Posts: 124)
Re: Large records in file
Reply #4 on: October 26, 2021, 05:32:17 PM
Well now, that may be the issue. I am on 10.20. Not supported there?
Mike King (Diamond Member, Posts: 3773)
Re: Large records in file
Reply #5 on: October 26, 2021, 05:36:58 PM
The combination of Extended record size and auto-increment (the "+" in the key definition) is not available on that version.
michaelgreer (Diamond Member, Posts: 124)
Re: Large records in file
Reply #6 on: October 27, 2021, 03:00:52 PM
If I nix the auto-increment, can I get the extended record size?
Mike King (Diamond Member, Posts: 3773)
Re: Large records in file
Reply #7 on: October 27, 2021, 05:21:58 PM
That should allow it to work, although we would STRONGLY recommend you use a newer version of PxPlus, as version 10.20 is over 9 years old and there have been many changes, fixes and enhancements since that time.
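For what it's worth, a minimal sketch of the non-auto-increment variant for an older release, assuming the application supplies its own 16-byte key value (the file name, channel number and key value below are only illustrative):

! extended records without the "+" auto-increment segment, for
! releases such as 10.20 where the two features cannot be combined
keyed "myfile",[1:1:16],0,-30000,opt="X"
open (1)"myfile"
dim a$(35000)
k$="RECORD0000000001" ! application-assigned 16-byte key (illustrative)
write (1)k$,a$        ! key is taken from the first 16 bytes of field 1
close (1)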
michaelgreer (Diamond Member, Posts: 124)
Re: Large records in file
Reply #8 on: November 01, 2021, 03:31:23 PM
I have successfully created this file. Now, when inserting large records via ODBC (v 5.10.0002), a 3rd party is running into this error:
“<eb1>Expected lexical element not found: <identifier>
State: 37000; Native: 1015; Origin: [PxPlus][ODBC Driver]</eb1>”.
Is this a limitation of the ODBC driver in play here?
Mike King (Diamond Member, Posts: 3773)
Re: Large records in file
Reply #9 on: November 01, 2021, 09:08:26 PM
Why are you using version 5.10 of the ODBC driver?
It is from 2012 and has not been supported for a number of years.
The current ODBC driver is version 7.
michaelgreer (Diamond Member, Posts: 124)
Re: Large records in file
Reply #10 on: November 02, 2021, 10:33:25 AM
No particular reason other than it has been in place all these years.