Assume we have a system that includes a relational database (say, MySQL or Oracle) populated by a scheduled job that consumes data from an external API. Also assume there are no guarantees about the size of the data (e.g. the size of a particular value in a JSON payload) coming from the API.
In this kind of scenario, what is the best practice for determining a proper column size for an attribute populated with data from the API? Is it appropriate to dynamically (programmatically) adjust the column size based on incoming data, or should we assign a relatively large size up front to accommodate larger incoming values?
Thank you.
[A]ssume there are no guarantees for the size of data (e.g. size of a particular value of a json) coming from the API.
In this kind of a scenario, what is the best practice to determine a proper column size for a particular attribute populated using data coming from the API?
In Oracle, you can use a CLOB data type.
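A minimal sketch of such a staging table (table and column names are illustrative, not from the question):

```sql
-- Staging table for API payloads whose size is unbounded.
-- A CLOB column avoids the 4000-byte VARCHAR2 limit in the SQL scope.
CREATE TABLE api_staging (
    id         NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    fetched_at TIMESTAMP DEFAULT SYSTIMESTAMP,
    payload    CLOB                      -- raw JSON from the external API
);

-- On Oracle 12c and later you can also enforce that the CLOB holds valid JSON:
ALTER TABLE api_staging
    ADD CONSTRAINT payload_is_json CHECK (payload IS JSON);
```

The `IS JSON` check constraint is optional, but it lets you use Oracle's JSON query functions on the column later without worrying about malformed payloads.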
Say, [...] Oracle, for instance
In Oracle, a VARCHAR2 column is limited to 4000 bytes in the SQL scope (32767 bytes if the MAX_STRING_SIZE initialization parameter is set to EXTENDED). If you have no guarantees about the size of the data, then use a CLOB. In Oracle 19c a CLOB has a maximum size of (4 GB - 1) * the DB_BLOCK_SIZE initialization parameter (which gives a total size of 8 TB to 128 TB, depending on the block size). If you are exceeding that limit, then you should seriously consider reworking how you are consuming the data.
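As for the "dynamically adjust the column size" idea from the question: widening a VARCHAR2 is a cheap, metadata-only DDL operation, but it can never take you past the SQL-scope cap, which is why a CLOB is the safer choice when the size is truly unbounded. A sketch, assuming the illustrative table name `api_staging`:

```sql
-- Widening a VARCHAR2 column in place is allowed and cheap...
ALTER TABLE api_staging MODIFY (payload VARCHAR2(4000));

-- ...but 4000 bytes is the ceiling in the SQL scope (unless
-- MAX_STRING_SIZE = EXTENDED raises it to 32767), so a larger API
-- value will still fail the INSERT with ORA-12899:
-- "value too large for column".
```

Programmatic widening therefore only postpones the failure; it does not remove the bound.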