2,539 questions
0 votes · 0 answers · 45 views
How to BULK INSERT hex strings into a VARBINARY column in Azure SQL (from CSV) without staging?
I am loading data from Parquet into Azure SQL Database using this pipeline:
Parquet → PyArrow → CSV (Azure Blob) → BULK INSERT
One column in the Parquet file is binary (hashed passwords).
PyArrow CSV ...
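One common workaround for this kind of pipeline is to hex-encode the binary column before writing the CSV, since SQL Server's binary-style string conversions (CONVERT with style 2) expect bare hex digits with no 0x prefix. A minimal sketch — the helper name and the assumption that the hashes arrive as raw bytes are mine, not the asker's:

```python
import binascii

def encode_binary_for_csv(value: bytes) -> str:
    """Hex-encode raw bytes for a CSV cell destined for a VARBINARY column.

    SQL Server's CONVERT(varbinary(max), s, 2) expects plain hex digits
    without a 0x prefix, so emit exactly that.
    """
    return binascii.hexlify(value).decode("ascii")

# A 2-byte hash becomes the 4-character string "01ab".
print(encode_binary_for_csv(b"\x01\xab"))
```

Note that BULK INSERT by itself will not reinterpret hex text as binary; the server side typically still needs a computed-column CONVERT, an OPENROWSET view, or a format file to complete the conversion.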
0 votes · 1 answer · 168 views
Is there a way in Laravel to bulk-insert multiple records and get Eloquent models (all inserted fields) returned? [duplicate]
I have a scenario where I need to insert multiple records at once.
I want to achieve both:
Bulk insert multiple records at once (performance).
Return Eloquent model instances for the inserted ...
0 votes · 0 answers · 125 views
SQL Server BULK INSERT Delimited Text File with over 500 columns
I'm trying to bulk load an ASCII pipe-delimited text file with 537 varchar columns into SQL Server 2019 (v15.0.4390.2), where I only need 276 of those columns. Regardless of the methods I have tried, I ...
1 vote · 0 answers · 67 views
Extremely slow DB insert using Turbodbc
I have built Turbodbc 5.1.2 from source with simdutf 7.3.0 and Python 3.11. When trying to insert 150,000 rows of 46 columns into a MySQL 8.0 InnoDB table, Turbodbc takes about 190s, compared to 15s with my ...
1 vote · 1 answer · 103 views
How can I make my form produce multiple arrays [duplicate]
Please, I am trying to get this structure from an HTML form:
bulk_data: [
{
bank_code: "044",
account_number: "1234567832",
amount: 69000,
...
1 vote · 0 answers · 50 views
GridDB Java API bulk insert: why is put() 10× slower than documented performance benchmarks?
I’m evaluating GridDB Community Edition 5.3 on macOS (running the Linux container in Docker Desktop) for a time-series IoT workload.
The official Performance Benchmarks white-paper claims ~140 k rows/...
0 votes · 0 answers · 31 views
How to handle auto-increment gaps in OceanBase MySQL mode when bulk inserting
Environments
OceanBase Community 4.2.1 (MySQL mode)
MySQL version 5.7
We're migrating from MySQL 5.7 to OceanBase 4.2.1 (MySQL compatibility mode) and observing different auto-increment behaviors ...
-4 votes · 1 answer · 67 views
INSERT multiple rows from SELECT returns error "This is not permitted" [closed]
I have two tables of the same structure and want to insert rows from tableA into tableB.
The following example works fine:
INSERT INTO tableA (uniqueColA, colB, colC)
SELECT uniqueColA, colB, colC ...
1 vote · 2 answers · 69 views
Bulk Insert fails when table dropped and created with changes
I am designing an ETL process where a .CSV file is loaded into a SQL Server table.
The table only contains a single 'load' of data at any one time. At the start of the process the table is dropped and ...
0 votes · 0 answers · 48 views
How to bulk import entities with Doctrine?
How to bulk import entities with Doctrine?
Business requirements
I am using Doctrine 3 to bulk import a list of entities from a CSV.
Each row creates ~100 entities, and I flush every 10 rows.
All ...
1 vote · 1 answer · 187 views
Bulk data processing for high volume records using platform events and Queueable class in Salesforce
We receive a high volume of Accounts & Contacts through an external service (approx. 2M per day). We need to load them into Salesforce with a fire-and-forget type of integration. Please share your thoughts ...
2 votes · 1 answer · 165 views
SQL BULK INSERT Format Issues
I'm using SQL Server and Management Studio.
I was getting data conversion errors, which I could not resolve, when performing a BULK INSERT into a table; switching to using a format file allowed data ...
2 votes · 1 answer · 200 views
How can I use the gorm library to bulk upsert records into a postgres DB and return the IDs of those that are inserts and not updates?
Although this can be done with a raw query, I would like to bulk insert a small number of records and handle upsert conflicts through the gorm Create API. I need to return the IDs of records that are ...
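For context, the raw SQL this question alludes to is usually built around Postgres's `xmax` system column: `(xmax = 0)` is true only for freshly inserted tuples, so returning it as a flag separates inserts from updates. That is an implementation detail rather than a documented API, and gorm itself would carry it via an `ON CONFLICT` clause with a `RETURNING` expression; the sketch below (in Python, table and column names hypothetical) only assembles the statement text:

```python
def upsert_returning_sql(table, cols, conflict_col):
    """Assemble an INSERT ... ON CONFLICT DO UPDATE ... RETURNING statement.

    The (xmax = 0) flag is a pragmatic, Postgres-specific heuristic for
    distinguishing inserted rows from updated ones - not guaranteed API.
    """
    col_list = ", ".join(cols)
    placeholders = ", ".join(["%s"] * len(cols))
    updates = ", ".join(f"{c} = EXCLUDED.{c}" for c in cols if c != conflict_col)
    return (
        f"INSERT INTO {table} ({col_list}) VALUES ({placeholders}) "
        f"ON CONFLICT ({conflict_col}) DO UPDATE SET {updates} "
        "RETURNING id, (xmax = 0) AS inserted"
    )

print(upsert_returning_sql("users", ["email", "name"], "email"))
```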
0 votes · 0 answers · 131 views
BULK INSERT command SQL Server skip empty rows
I am creating a process that will load a fixed width file into SQL Server. I have used the BULK INSERT and a format file and that works in most cases.
For 2 of my files, most of the rows are ...
0 votes · 0 answers · 24 views
Move after_commit notification to background job in rails while using activerecord#import [duplicate]
I have a Rails model with an after_commit callback that sends a notification to users. However, this operation takes too long, so I want to move it to a background job.
Current Code:
I create or ...
0 votes · 1 answer · 334 views
SQLAlchemy: "This result object does not return rows", but console does
I'm fairly new to Python with SQLAlchemy and just want to insert a bunch of data and receive some values back.
My code looks something like this:
statement = text("""
with ...
-7 votes · 1 answer · 100 views
Potential data loss during `Insert into select from` Postgres query
I have a staging table in PostgreSQL and use a simple INSERT INTO ... SELECT ... query to copy data from the staging table to the final table. However, I occasionally encounter data loss issues after ...
1 vote · 1 answer · 262 views
Does EFCore.BulkExtensions disable foreign key checks when SetOutputIdentity = false?
I am using EFCore.BulkExtensions for bulk insert, and I want to confirm how the library handles foreign key constraints during these operations.
Does EFCore.BulkExtensions disable foreign key checks ...
0 votes · 0 answers · 53 views
Bulk Insert task can't open file
I have a text file that I'm trying to import through SSIS (running in Visual Studio 2022) with a Bulk Insert Task. I've used it successfully from the same package for other files, but one set of files ...
0 votes · 1 answer · 168 views
SQL Bulk Insert - how to retain empty string values
I am using the T-SQL BULK INSERT command to insert large text files into tables. The tables have many nvarchar columns. I found that if the text file has empty string values for any nvarchar column, ...
0 votes · 0 answers · 71 views
Batch Scripting in NiFi through a Custom Groovy Script
I want to fetch the flowfiles from the input stream in bulk and create one bulk insert query for ClickHouse.
I am currently creating one insert query for each MongoDB document. ...
0 votes · 1 answer · 543 views
SQL Server: BULK INSERT fails with 'You do not have permission to use the bulk load statement' on SQL Login but works with Windows Login
I'm trying to enable BULK INSERT for a SQL Server login, but I keep encountering this error:
You do not have permission to use the bulk load statement
Here's what I've done so far:
1. Windows user ...
0 votes · 1 answer · 248 views
How to bulk load ~1M of JSON files into Redis?
Here is my current code in bash (based on https://redis.io/docs/latest/develop/use/patterns/bulk-loading):
head=$(redis-cli -h $redis_server get .git-head)
if [[ ! $head ]]; then
redis-cli -h $...
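The bulk-loading page the question links recommends piping raw RESP protocol into `redis-cli --pipe` rather than invoking `redis-cli` once per key. A minimal sketch of that framing (the helper name is mine):

```python
def to_resp(*args):
    """Frame one command in RESP, the format `redis-cli --pipe` consumes."""
    parts = [f"*{len(args)}\r\n"]
    for a in args:
        s = str(a)
        # The $ length is in bytes, so encode before measuring.
        parts.append(f"${len(s.encode('utf-8'))}\r\n{s}\r\n")
    return "".join(parts)

# Writing millions of these frames to one file (or a pipe) and feeding it
# through `redis-cli --pipe` is the documented mass-insertion path.
print(to_resp("SET", "key:1", "val"))
```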
0 votes · 2 answers · 246 views
Bulk insert into a postgres table that has an array column with parameterized query
I have the following table:
┌────────────────┬─────────────────────────────┬───────────┬──────────┬──────────────────────────────────────────────────────────────────┐
│ Column │ ...
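One parameterized approach, sketched here without a live connection: build a multi-row VALUES clause and flatten the parameter list. Drivers such as psycopg2 adapt Python lists to Postgres arrays, so the array column needs no special casing. Table and column names below are hypothetical:

```python
def bulk_insert_sql(table, columns, rows):
    """Build a single multi-row parameterized INSERT plus its flat params.

    Each row contributes one "(%s, %s, ...)" group; array-typed values are
    passed as Python lists, which psycopg2 adapts to Postgres arrays.
    """
    group = "(" + ", ".join(["%s"] * len(columns)) + ")"
    placeholders = ", ".join([group] * len(rows))
    sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES {placeholders}"
    params = [value for row in rows for value in row]
    return sql, params

# Two rows, second column being an array column.
sql, params = bulk_insert_sql("t", ["a", "tags"], [[1, ["x"]], [2, ["y", "z"]]])
print(sql)
```

`cursor.execute(sql, params)` then sends the whole batch as one statement; psycopg2's `execute_values` helper is an alternative that builds the same shape internally.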
1 vote · 1 answer · 112 views
UiPath: connecting to SQL Server and using BulkInsert - getting a System.Exception error
I have an Excel file with 750,000 records. I first use read range to load it into a data table.
To avoid issues with large volumes of data, I used a For Each loop to split the data and perform bulk ...
0 votes · 0 answers · 74 views
Loading data from a file to a database where the file has a field that is the primary key; how to implement Hibernate bulk inserts without an auto-generated ID
Entity looks like this.
@Entity
@Table(name = "TRANSACTION")
@Data
@AllArgsConstructor
@NoArgsConstructor
public class Transaction {
@Id
@Column(name = "ACCOUNT")
private String ...
0 votes · 1 answer · 345 views
Unique constraint failed in upsert when calling bulk_create with update_conflicts
I'm facing a unique constraint failed error with Django. The objective of the API is either creating or updating marks of the student, based on subject variation and exam_results, in bulk. For that,...
-1 votes · 1 answer · 89 views
Insert with values from existing table
I need to move some values based on expiration date, but unfortunately, because the tables are not one-to-one in number of records, I receive an error message when my query is run. I need SQL to be able ...
0 votes · 1 answer · 553 views
How to retrieve ids of inserted or updated rows after "INSERT ... ON DUPLICATE KEY UPDATE" in MySQL?
I'm storing players of my game based on their level:
CREATE TABLE IF NOT EXISTS Player(
id INT UNSIGNED NOT NULL AUTO_INCREMENT,
name VARCHAR(255) NOT NULL UNIQUE,
level TINYINT UNSIGNED ...
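The standard workaround this question circles is MySQL's `LAST_INSERT_ID(expr)` form: assigning `id = LAST_INSERT_ID(id)` in the update branch makes `LAST_INSERT_ID()` report the touched row's id even when the insert turned into an update. A sketch that only builds the statement text against the `Player` table above (note that `VALUES(col)` in the update clause is deprecated in favor of a row alias on MySQL 8.0.20+):

```python
def player_upsert_sql(table="Player"):
    """Build an upsert after which LAST_INSERT_ID() yields the row's id
    whether the row was inserted or updated."""
    return (
        f"INSERT INTO {table} (name, level) VALUES (%s, %s) "
        "ON DUPLICATE KEY UPDATE "
        "id = LAST_INSERT_ID(id), level = VALUES(level)"
    )

print(player_upsert_sql())
```

Executed per row (or batched), `SELECT LAST_INSERT_ID()` after each statement recovers the id; for multi-row statements the trick only reports one id, which is its main limitation.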
1 vote · 0 answers · 207 views
How to Optimize Bulk Insert of Related Records in Entity Framework Core Without Using EFCore.BulkExtensions?
I am working on a project where I need to insert a large number of related records into multiple tables using Entity Framework Core. The insertion process involves at least three tables: Question, ...
0 votes · 0 answers · 56 views
Entity Framework Core's BulkInsert doesn't care about nvarchar column length
I'm trying to insert some records with DbContext.BulkInsert, and I have a column that's nvarchar(50), but as soon as I try adding a record where this column is populated with 21 characters it breaks, ...
0 votes · 1 answer · 61 views
Removing/Disabling the "SQL: INSERT INTO" statement in SQLAlchemy bulk insert
I am trying to insert more than 10K records into a PostgreSQL database, using SQLAlchemy's bulk insert functionality with an insert statement for this.
insert_stmt = (
insert(...
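If the unwanted "SQL: INSERT INTO ..." lines are SQLAlchemy's engine echo/logging output — an assumption, since the excerpt is truncated — the usual knobs are `create_engine(..., echo=False)` and the `sqlalchemy.engine` logger. A minimal sketch using only the standard library:

```python
import logging

# Raise the threshold on SQLAlchemy's engine logger so per-statement
# INFO messages (the echoed INSERTs) are suppressed. Pair this with
# create_engine(..., echo=False) if echo was enabled explicitly.
logging.getLogger("sqlalchemy.engine").setLevel(logging.WARNING)
```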
3 votes · 0 answers · 229 views
Do either Python or AWS Glue provide an alternative to .NET's SqlBulkCopy?
I am porting an old SSIS package to AWS Glue. The package runs daily. In several steps in this package, I take data from one table on one Microsoft SQL Server and copy all of it to an empty table of ...
0 votes · 1 answer · 133 views
BULK insert not dealing with accented characters [duplicate]
I apologise for raising a question that's been asked in so many forms before but I'm tearing my hair out...
I am using BULK INSERT to load what I firmly believe to be (and Notepad++ reports to be) a ...
0 votes · 1 answer · 373 views
Dapper Plus BulkMerge without identity
We've been using Dapper Plus to do bulk merges using BulkMerge against SQL Server. This works pretty easily with an identity column but we've got many tables that have a primary key with no identity ...
0 votes · 0 answers · 108 views
What is the correct way to deal with large amounts of data in TypeORM?
I am trying to use TypeORM for my app.
I have a database structure like Sack -> Fruit <- FruitType; so there are sacks, each containing some fruits, and each fruit is of a specified type.
I tried to ...
0 votes · 1 answer · 316 views
Oracle SQL Loader for CLOB data type
Currently I am using the Oracle SQL*Loader utility to load the data into the database. My CTL file is shown below. Here the column "LM_CAUTION_NOTE" data type is CLOB in the table. When I execute ...
0 votes · 0 answers · 124 views
Connecting to SQL Server and performing BULK INSERT from Linux Container
I have made a Linux Docker container performing several tasks, and finally connecting to a SQL Server 2019 (v15), and executing BULK INSERT to load a local (to the SQL Server) .csv file. The ...
2 votes · 1 answer · 360 views
DIRECTUS 1 BULK INSERT when creating X Items instead of (1 INSERT / Item to create) * X
I'm trying to find a way to have Directus executing 1 BULK INSERT in PostgreSQL when dealing with items.create scenario in a Child table instead of 1 INSERT per Item to create (which I see in the DB ...
0 votes · 2 answers · 81 views
Is there any way to optimize get_or_create() to make the program faster? [closed]
The functionality I want to achieve is to create a new class and import users. Since I don't want the same user to be imported multiple times and a user can join different classes, I have used ...
0 votes · 2 answers · 65 views
Update scenario not working properly when using CASE or IF statements in the ON DUPLICATE KEY UPDATE section in MySQL
I need to do bulk insert and update on MySQL, so I used ON DUPLICATE KEY UPDATE. I need to keep the updated-by and updated-date fields as the existing old data if there are no changes in those record'...
0 votes · 0 answers · 45 views
Why do regex style line endings no longer work with bulk insert but hex style does?
I know there are plenty of examples out there about using 0x0a for ROWTERMINATOR, but I don't understand why that is needed. When verifying my files I can see in NP++ that they have either Windows or ...
1 vote · 0 answers · 109 views
T-SQL Bulk insert not sorting first row at the top
The intention of the script below is to BULK INSERT from a file and always retrieve the first row of File.csv, because I need the column names that reside in the first row.
Here's an example of the ...
0 votes · 0 answers · 58 views
Error when inserting special characters such as Ñ and accents á, é, í, ó, ú using BULK OPENROWSET from Python on a Linux server
I am having problems inserting information into a table in my SQL Server 2008 database from my Python project deployed on a Linux server using OPENROWSET BULK, and even though the CSV file generates ...
1 vote · 1 answer · 181 views
How does a Postgres bulk insert affect Debezium?
We have a Postgres instance with multiple DBs (A, B, C, D, etc.).
We have Debezium CDC set up on DB A only. But we also need to bulk insert millions of rows into other DBs like B. It seems that Debezium CDC ...
0 votes · 0 answers · 174 views
Should I import 1 million rows at one time or import 1 row 1 million times into MySQL?
I want to ask about the most effective way to import bulk data from a CSV file.
I have a relatively large CSV file, which has around 20 million rows, and I need to import them into MySQL. I know that ...
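Neither extreme is usually right: one 20M-row statement risks exceeding `max_allowed_packet`, while 20M single-row inserts pay 20M round-trips. The conventional answer is fixed-size batches; a sketch of the batching helper (the batch size is a tunable assumption):

```python
def chunked(rows, size=1000):
    """Yield fixed-size batches of rows.

    Each batch then feeds one multi-row INSERT (or executemany call),
    keeping every statement comfortably under max_allowed_packet.
    """
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# Five rows in batches of two -> [0, 1], [2, 3], [4].
for batch in chunked(list(range(5)), 2):
    print(batch)
```

For a file this large, `LOAD DATA INFILE` is typically faster still than any INSERT-based approach.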
1 vote · 0 answers · 38 views
Bulk Insert from Dataframe
Can I do a BULK INSERT from a dataframe instead of from a CSV file?
Here is my code:
sql = " BULK INSERT " + str(tbl_name) + " FROM '" + fl_path + "' WITH (FIRSTROW = 2,...
0 votes · 0 answers · 59 views
Improve BulkInsert Performance
I'm loading data from a CSV file into a SQL Server database using Python code.
Here is my code snippet:
sql = " BULK INSERT " + str(tbl_name) + " FROM '" + fl_path + "' WITH (...
0 votes · 1 answer · 82 views
Error doing Oracle bulk insert/update - Unable to cast object of type 'System.DateTime[]' to type 'System.IConvertible'
I am trying to do bulk insert and/or update into an Oracle table, using data from a datatable.
When I attempt to execute "command.ExecuteNonQuery()" it throws "Unable to cast object of ...
0 votes · 0 answers · 571 views
Bulk insert: JdbcTemplate vs JPA
The question is meant to be really pointed.
I am not interested in knowing the general strengths and weaknesses of the two solutions.
I just need to analyze the performance of the two solutions in the ...