It would seem that csv files imported into a Base database are read-only, and that is by design. Is there a work-around to remove the read-only? Or another way to get csv data into a new database?
Are you actually IMPORTING your CSV file, or just CONNECTING TO it (via a Calc sheet derived from it)?
Anyway, I find that the most reliable way to do it is to import your CSV data into a Calc sheet first and then PASTE it into a pre-created Base table whose columns match your Calc sheet columns.
See these two posts for more info:
One can copy from Calc and paste into Base without setting up the Base table ahead of time. The Base paste wizard offers choices about how the pasted Calc data is handled.
I have a table defined as follows:
CREATE TEXT TABLE "podcast_sqlite_2" ("recid" INT PRIMARY KEY, "missing" BOOLEAN, "transfer" BOOLEAN, "removed" BOOLEAN, "date" DATE, "author" VARCHAR(32), "podcast" VARCHAR(36), "title" VARCHAR(120), "directory" VARCHAR(96), "filename" VARCHAR(90));
It is populated by a csv file via the following:
SET TABLE "podcast_sqlite_2" SOURCE "podcast_info_2.csv;fs=," (the ;fs=, is probably unnecessary, since comma is the default field separator).
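For reference, the source string of an HSQLDB text table can carry further semicolon-separated options besides fs. The sketch below reuses the table and file names from this thread; the ignore_first and encoding options are taken from the HSQLDB text-table documentation, so check them against your HSQLDB version before relying on them:

```sql
-- Link the text table to the csv file, with a few optional settings:
--   fs=,                field separator (comma is the default, so this one is optional)
--   ignore_first=true   skip a header row at the top of the csv file
--   encoding=UTF-8      character encoding of the file
SET TABLE "podcast_sqlite_2"
    SOURCE "podcast_info_2.csv;fs=,;ignore_first=true;encoding=UTF-8";
```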
The resulting table can be modified because it has a primary key (whose values must be unique).
Separately, my experience is that an integer field in a table loaded from a csv file can be made into a primary key: first issue SET TABLE SOURCE OFF, then edit the table structure to add the key. After that, one reconnects to the csv file and the table is modifiable, because it now has a primary key. The integer field must contain no duplicate values.
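As a sketch of that procedure in SQL (run from Tools > SQL in Base; the table, file, and column names follow the earlier example, and the ALTER TABLE form is the HSQLDB equivalent of editing the structure in the GUI):

```sql
-- 1. Detach the text table from its csv file so the structure can be edited.
SET TABLE "podcast_sqlite_2" SOURCE OFF;

-- 2. Promote the existing integer column to a primary key.
--    This fails if the column contains duplicate or NULL values.
ALTER TABLE "podcast_sqlite_2" ADD PRIMARY KEY ("recid");

-- 3. Reconnect the csv file; the table should now be modifiable.
SET TABLE "podcast_sqlite_2" SOURCE "podcast_info_2.csv;fs=,";
```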