0 votes
0 answers
45 views

I am loading data from Parquet into Azure SQL Database using this pipeline: Parquet → PyArrow → CSV (Azure Blob) → BULK INSERT One column in the Parquet file is binary (hashed passwords). PyArrow CSV ...
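One way to get a binary column through a CSV hop is to hex-encode it on the Python side so it travels as plain ASCII; a minimal sketch with illustrative column names and data (the decode on the SQL side, e.g. CONVERT(varbinary(max), col, 2) in T-SQL, is an assumption about the target table):

```python
# Sketch (column names and data are illustrative): hex-encode the binary
# column so it survives the CSV hop as plain ASCII; on the SQL side it can
# be decoded back, e.g. with CONVERT(varbinary(max), col, 2) in T-SQL.
import csv, io

rows = [(1, b"\x9f\x86\x01"), (2, b"\x00\xff")]

buf = io.StringIO()
writer = csv.writer(buf, lineterminator="\n")
writer.writerow(["id", "pwd_hash_hex"])
for rid, blob in rows:
    writer.writerow([rid, blob.hex()])  # bytes -> safe ASCII hex text

csv_text = buf.getvalue()
```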
0 votes
1 answer
168 views

I have a scenario where I need to insert multiple records at once. I want to achieve both: bulk insert multiple records at once (performance), and return Eloquent model instances for the inserted ...
0 votes
0 answers
125 views

I'm trying to bulk load an ascii pipe-delimited text file with 537 varchar columns into SQL Server 2019 (v15.0.4390.2) where I only need 276 of those columns. Regardless of the methods I have tried, I ...
1 vote
0 answers
67 views

I have built Turbodbc 5.1.2 from source with simdutf 7.3.0, Python 3.11. When trying to insert 150,000 rows of 46 columns into a MySQL 8.0 InnoDB table, Turbodbc takes about 190s, compared to 15s with my ...
1 vote
1 answer
103 views

Please, I am trying to get this structure from an HTML form: bulk_data: [ { bank_code: "044", account_number: "1234567832", amount: 69000, ...
1 vote
0 answers
50 views

I’m evaluating GridDB Community Edition 5.3 on macOS (running the Linux container in Docker Desktop) for a time-series IoT workload. The official Performance Benchmarks white-paper claims ~140 k rows/...
0 votes
0 answers
31 views

Environments OceanBase Community 4.2.1 (MySQL mode) MySQL version 5.7 We're migrating from MySQL 5.7 to OceanBase 4.2.1 (MySQL compatibility mode) and observing different auto-increment behaviors ...
-4 votes
1 answer
67 views

I have two tables of same structure and want to insert rows from tableA into tableB. The following example works fine: INSERT INTO tableA (uniqueColA, colB, colC) SELECT uniqueColA, colB, colC ...
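A common way to copy only the missing rows between two same-structured tables is a NOT EXISTS filter on the unique column; a sketch using SQLite as a stand-in, where the column names mirror the question but the data and the duplicate-skipping predicate are assumptions:

```python
# Sketch: copy only rows whose key is missing from the target, using
# SQLite as a stand-in; column names mirror the question, data is made up.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE tableA (uniqueColA TEXT PRIMARY KEY, colB TEXT, colC TEXT);
    CREATE TABLE tableB (uniqueColA TEXT PRIMARY KEY, colB TEXT, colC TEXT);
    INSERT INTO tableA VALUES ('a', 'b1', 'c1'), ('b', 'b2', 'c2');
    INSERT INTO tableB VALUES ('a', 'old', 'old');
""")

# Insert rows from tableA that do not already exist in tableB.
con.execute("""
    INSERT INTO tableB (uniqueColA, colB, colC)
    SELECT a.uniqueColA, a.colB, a.colC
    FROM tableA AS a
    WHERE NOT EXISTS (
        SELECT 1 FROM tableB AS b WHERE b.uniqueColA = a.uniqueColA)
""")
count = con.execute("SELECT COUNT(*) FROM tableB").fetchone()[0]
```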
1 vote
2 answers
69 views

I am designing an ETL process where a .CSV file is loaded into a SQL Server table. The table only contains a single 'load' of data at any one time. At the start of the process the table is dropped and ...
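The single-load pattern described here can be sketched in a few lines; SQLite stands in for SQL Server, and the table name and CSV layout are illustrative:

```python
# Minimal sketch of the drop-and-reload pattern with SQLite standing in
# for SQL Server; the table name and CSV layout are illustrative.
import csv, io, sqlite3

csv_data = "id,name\n1,alpha\n2,beta\n"

con = sqlite3.connect(":memory:")
con.execute("DROP TABLE IF EXISTS load_table")            # start each run clean
con.execute("CREATE TABLE load_table (id INTEGER, name TEXT)")
reader = csv.reader(io.StringIO(csv_data))
next(reader)                                              # skip the header row
con.executemany("INSERT INTO load_table VALUES (?, ?)", reader)
n = con.execute("SELECT COUNT(*) FROM load_table").fetchone()[0]
```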
0 votes
0 answers
48 views

How do I bulk import entities with Doctrine? Business requirements: I am using Doctrine 3 to bulk import a list of entities from a CSV. Each row creates ~100 entities and I flush every 10 rows. All ...
1 vote
1 answer
187 views

We receive a high volume of Accounts & Contacts through an external service (approx. 2M per day). We need to load them into Salesforce with a fire-and-forget type of integration. Please share your thoughts ...
2 votes
1 answer
165 views

I'm using SQL Server and Management Studio. I was getting data conversion errors when performing a BULK INSERT into a table, that I could not resolve and switching to using a format file allowed data ...
2 votes
1 answer
200 views

Although this can be done with a raw query, I would like to bulk insert a small number of records and handle upsert conflicts through the gorm Create API. I need to return the IDs of records that are ...
0 votes
0 answers
131 views

I am creating a process that will load a fixed width file into SQL Server. I have used the BULK INSERT and a format file and that works in most cases. For 2 of my files, most of the rows are ...
0 votes
0 answers
24 views

I have a Rails model with an after_commit callback that sends a notification to users. However, this operation takes too long, so I want to move it to a background job. Current Code: I create or ...
0 votes
1 answer
334 views

I'm fairly new to Python with SQLAlchemy and just want to insert a bunch of data and receive some values back. My code looks something like this: statement = text(""" with ...
-7 votes
1 answer
100 views

I have a staging table in PostgreSQL and use a simple INSERT INTO ... SELECT ... query to copy data from the staging table to the final table. However, I occasionally encounter data loss issues after ...
1 vote
1 answer
262 views

I am using EFCore.BulkExtensions for bulk insert, and I want to confirm how the library handles foreign key constraints during these operations. Does EFCore.BulkExtensions disable foreign key checks ...
0 votes
0 answers
53 views

I have a text file that I'm trying to import through SSIS (running in Visual Studio 2022) with a Bulk Insert Task. I've used it successfully from the same package for other files, but one set of files ...
0 votes
1 answer
168 views

I am using the T-SQL Bulk Insert command to insert large text files into tables. The tables have many nvarchar columns, I found that if the text file has empty string values for any nvarchar column, ...
0 votes
0 answers
71 views

I want to fetch the flowfiles from the input stream in bulk and create one bulk insertion query for ClickHouse. I am currently creating one insertion query for each document of MongoDB. ...
0 votes
1 answer
543 views

I'm trying to enable BULK INSERT for a SQL Server login, but I keep encountering this error: You do not have permission to use the bulk load statement Here's what I've done so far: 1. Windows user ...
0 votes
1 answer
248 views

Here is my current code in bash (based on https://redis.io/docs/latest/develop/use/patterns/bulk-loading): head=$(redis-cli -h $redis_server get .git-head) if [[ ! $head ]]; then redis-cli -h $...
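The redis.io bulk-loading page the script links to recommends emitting raw RESP protocol and piping it through `redis-cli --pipe`; a small sketch of that encoder, with an illustrative key and value:

```python
# Sketch of the redis.io bulk-loading pattern: emit raw RESP protocol and
# pipe it through `redis-cli --pipe`; the key and value are illustrative.
def resp_command(*args: str) -> str:
    """Encode one Redis command in RESP wire format."""
    out = [f"*{len(args)}"]
    for a in args:
        out.append(f"${len(a.encode())}")
        out.append(a)
    return "\r\n".join(out) + "\r\n"

payload = resp_command("SET", "key:1", "value1")
# feed many such payloads to: redis-cli -h $redis_server --pipe
```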
0 votes
2 answers
246 views

I have the following table: ┌────────────────┬─────────────────────────────┬───────────┬──────────┬──────────────────────────────────────────────────────────────────┐ │ Column │ ...
1 vote
1 answer
112 views

I have an Excel file with 750,000 records. I first use read range to load it into a data table. To avoid issues with large volumes of data, I used a For Each loop to split the data and perform bulk ...
0 votes
0 answers
74 views

Entity looks like this. @Entity @Table(name = "TRANSACTION") @Data @AllArgsConstructor @NoArgsConstructor public class Transaction { @Id @Column(name = "ACCOUNT") private String ...
0 votes
1 answer
345 views

I'm facing a unique constraint failed error with Django. The objective of the API is either creating or updating marks of the student, based on subject variation, exam_results, and in bulk. For that,...
-1 votes
1 answer
89 views

I need to move some values based on expiration date, but unfortunately because the tables are not one to one in number of records I receive an error message when my query is run. I need SQL to be able ...
0 votes
1 answer
553 views

I'm storing players of my game based on their level: CREATE TABLE IF NOT EXISTS Player( id INT UNSIGNED NOT NULL AUTO_INCREMENT, name VARCHAR(255) NOT NULL UNIQUE, level TINYINT UNSIGNED ...
1 vote
0 answers
207 views

I am working on a project where I need to insert a large number of related records into multiple tables using Entity Framework Core. The insertion process involves at least three tables: Question, ...
0 votes
0 answers
56 views

I'm trying to insert some records with DbContext.BulkInsert, and I have a column that's nvarchar(50), but as soon as I try adding a record where this column is populated with 21 characters it breaks, ...
0 votes
1 answer
61 views

I am trying to insert more than 10K records into a PostgreSQL database using the SQLAlchemy bulk insert functionality with an insert statement. insert_stmt = ( insert(...
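Chunking the rows keeps each statement's parameter count bounded, which is the usual approach at this scale; a hedged sketch with SQLite standing in for PostgreSQL, where the table layout and chunk size are assumptions, not the asker's code:

```python
# Hedged sketch: chunked multi-row inserts with SQLite standing in for
# PostgreSQL; table layout and chunk size are assumptions.
import sqlite3

def chunked(seq, size):
    """Yield consecutive slices of `seq` with at most `size` elements."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

rows = [(i, f"name-{i}") for i in range(10_000)]
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
for batch in chunked(rows, 1_000):
    con.executemany("INSERT INTO items VALUES (?, ?)", batch)
total = con.execute("SELECT COUNT(*) FROM items").fetchone()[0]
```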
3 votes
0 answers
229 views

I am porting an old SSIS package to AWS Glue. The package runs daily. In several steps in this package, I take data from one table on one Microsoft SQL Server and copy all of it to an empty table of ...
0 votes
1 answer
133 views

I apologise for raising a question that's been asked in so many forms before but I'm tearing my hair out... I am using BULK INSERT to load what I firmly believe to be (and Notepad++ reports to be) a ...
0 votes
1 answer
373 views

We've been using Dapper Plus to do bulk merges using BulkMerge against SQL Server. This works pretty easily with an identity column but we've got many tables that have a primary key with no identity ...
0 votes
0 answers
108 views

I am trying to use TypeORM for my app. I have a database structure like Sack -> Fruit <- FruitType, so there are sacks, each containing some fruits, and each fruit is of a specified type. I tried to ...
0 votes
1 answer
316 views

Currently I am using the Oracle SQL*Loader utility to load data into the database. My CTL file is shown below. Here the column "LM_CAUTION_NOTE" data type is CLOB in the table. When I execute ...
0 votes
0 answers
124 views

I have made a Linux Docker container performing several tasks, and finally connecting to a SQL Server 2019 (v15), and executing BULK INSERT to load a local (to the SQL Server) .csv file. The ...
2 votes
1 answer
360 views

I'm trying to find a way to have Directus executing 1 BULK INSERT in PostgreSQL when dealing with items.create scenario in a Child table instead of 1 INSERT per Item to create (which I see in the DB ...
0 votes
2 answers
81 views

The functionality I want to achieve is to create a new class and import users. Since I don't want the same user to be imported multiple times and a user can join different classes, I have used ...
0 votes
2 answers
65 views

I need to do bulk insert and update on MySQL, so I used ON DUPLICATE KEY UPDATE. I need to keep the updated-by and updated-date fields as the existing old data if there are no changes in those record'...
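The "leave audit columns alone when nothing changed" idea can be expressed with a CASE inside the upsert's update clause; a hedged sketch using SQLite's ON CONFLICT syntax as a stand-in for MySQL's ON DUPLICATE KEY UPDATE, with illustrative table and column names:

```python
# Hedged sketch of "don't touch audit columns when nothing changed",
# using SQLite upsert syntax as a stand-in for MySQL's
# ON DUPLICATE KEY UPDATE; table and column names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT, updated_by TEXT)")
con.execute("INSERT INTO t VALUES (1, 'same', 'old_user')")

# Re-insert an identical row: the CASE keeps the old updated_by because
# the incoming val matches the stored one.
con.execute("""
    INSERT INTO t (id, val, updated_by) VALUES (1, 'same', 'new_user')
    ON CONFLICT(id) DO UPDATE SET
        val = excluded.val,
        updated_by = CASE WHEN val = excluded.val
                          THEN updated_by ELSE excluded.updated_by END
""")
who = con.execute("SELECT updated_by FROM t WHERE id = 1").fetchone()[0]
```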
0 votes
0 answers
45 views

I know there are plenty of examples out there about using 0x0a for ‘ROWTERMINATOR’ but I don’t understand why that is needed. When verifying my files I can see in NP++ that they have either windows or ...
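The usual explanation is that for character files BULK INSERT interprets a '\n' row terminator as '\r\n', so LF-only files need the explicit hex form '0x0a'; it is worth checking which terminator a file actually uses before choosing. A small sketch (function name and sample data are illustrative):

```python
# Sketch: check which terminator a file really uses before picking
# ROWTERMINATOR; for char files BULK INSERT treats '\n' as '\r\n',
# which is why LF-only files usually need the explicit hex form '0x0a'.
def detect_terminator(data: bytes) -> str:
    """Return a BULK INSERT-style description of the row terminator."""
    if b"\r\n" in data:
        return "\\r\\n (Windows)"
    if b"\n" in data:
        return "0x0a (Unix LF)"
    return "unknown"

result = detect_terminator(b"col1|col2\nval1|val2\n")
```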
1 vote
0 answers
109 views

The intention of the below script is to BULK INSERT from a file and always retrieve the first row of the File.csv because I need the column names that reside in the first row. Here's an example of the ...
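If only the header names are needed, reading a single line client-side is simpler than a BULK INSERT restricted with LASTROW = 1; a sketch with illustrative file content:

```python
# Sketch: reading one line client-side yields the column names without a
# BULK INSERT (LASTROW = 1 is the server-side alternative); the file
# content here is illustrative.
import csv, io

file_text = "Name,Age,City\nBob,34,Leeds\n"
header = next(csv.reader(io.StringIO(file_text)))  # first row only
```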
0 votes
0 answers
58 views

I am having problems inserting information into a table in my SQL Server 2008 database from my Python project deployed on a Linux server using OPENROWSET BULK, and even though the CSV file generates ...
1 vote
1 answer
181 views

We have a Postgres instance with multiple DBs (A, B, C, D, etc). We have Debezium CDC set up on DB A ONLY. But we also need to bulk insert millions of rows into other DBs like B. It seems that Debezium CDC ...
0 votes
0 answers
174 views

I want to ask about the most effective way to bulk import data from a CSV file. I have a relatively large CSV file, which has around 20 million rows, and I need to import them into MySQL. I know that ...
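At that scale, MySQL's server-assisted LOAD DATA [LOCAL] INFILE is usually much faster than client-side INSERT loops; a hedged sketch that only builds the statement as a string (the path and table name are placeholders):

```python
# Hedged sketch: for ~20M rows, MySQL's LOAD DATA [LOCAL] INFILE is
# usually far faster than client-side INSERTs. Path and table name are
# placeholders; the statement is built as a string for illustration.
table = "imports"
path = "/data/big.csv"
sql = (
    f"LOAD DATA LOCAL INFILE '{path}' "
    f"INTO TABLE {table} "
    "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
    "LINES TERMINATED BY '\\n' "
    "IGNORE 1 LINES"
)
```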
1 vote
0 answers
38 views

Can I do BULK INSERT from a dataframe instead of from a CSV file? Here is my code: sql = " BULK INSERT " + str(tbl_name) + " FROM '" + fl_path + "' WITH (FIRSTROW = 2,...
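BULK INSERT only reads files visible to the SQL Server process, so a dataframe generally has to be materialized as a file first; a sketch where in-memory rows stand in for the dataframe and `tbl_name`/`fl_path` are placeholders matching the question's code:

```python
# Sketch: BULK INSERT only reads files visible to SQL Server, so the
# DataFrame must be written out first. In-memory rows stand in for the
# DataFrame; tbl_name and fl_path are placeholders.
import csv, io

rows = [("1", "alpha"), ("2", "beta")]
buf = io.StringIO()
csv.writer(buf, lineterminator="\n").writerows([("id", "name"), *rows])
csv_text = buf.getvalue()          # write this to fl_path before running SQL

tbl_name, fl_path = "dbo.MyTable", r"C:\loads\frame.csv"
sql = (f"BULK INSERT {tbl_name} FROM '{fl_path}' "
       "WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a')")
```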
0 votes
0 answers
59 views

I'm loading data from a CSV file into a SQL Server database using Python. Here is my code snippet: sql = " BULK INSERT " + str(tbl_name) + " FROM '" + fl_path + "' WITH (...
0 votes
1 answer
82 views

I am trying to do bulk insert and/or update into an Oracle table, using data from a datatable. When I attempt to execute "command.ExecuteNonQuery()" it throws "Unable to cast object of ...
0 votes
0 answers
571 views

The question is meant to be very pointed. I am not interested in knowing the general strengths and weaknesses of the two solutions. I just need to analyze the performance of the two solutions in the ...