Description
After spending some time figuring out what the issue was, I found that the method
`private func readBody(_ socket: Socket, size: Int) throws -> [UInt8]`
was using this line to read the body data: `for _ in 0..<size { body.append(try socket.read()) }`
Here is the read() method from the Socket class:
```swift
open func read() throws -> UInt8 {
    var buffer = [UInt8](repeating: 0, count: 1)
    #if os(Linux)
    let next = recv(self.socketFileDescriptor as Int32, &buffer, Int(buffer.count), Int32(MSG_NOSIGNAL))
    #else
    let next = recv(self.socketFileDescriptor as Int32, &buffer, Int(buffer.count), 0)
    #endif
    if next <= 0 {
        throw SocketError.recvFailed(Errno.description())
    }
    return buffer[0]
}
```
I can see that you're trying to load one byte at a time. As a result, the combination of these two methods works really slowly when sending a photo (the read method has to be called approximately 6,000,000 times).
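As a rough back-of-the-envelope sketch (the 6 MB photo size and 64 KB chunk size here are my own assumptions, not values from the library), the difference in the number of recv() calls looks like this:

```swift
// Hypothetical numbers, for illustration only.
let photoSize = 6 * 1024 * 1024                        // ~6.3 million bytes
let perByteCalls = photoSize                           // one recv() per byte with the old loop
let chunkSize = 64 * 1024                              // assumed chunk size
let chunkedCalls = (photoSize + chunkSize - 1) / chunkSize
print(perByteCalls, chunkedCalls)                      // 6291456 vs. 96 calls
```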
I've added a variant of the read method where you can specify the number of bytes to read:
```swift
open func read(length: Int) throws -> [UInt8] {
    var buffer = [UInt8](repeating: 0, count: length)
    #if os(Linux)
    let next = recv(self.socketFileDescriptor as Int32, &buffer, Int(buffer.count), Int32(MSG_NOSIGNAL))
    #else
    let next = recv(self.socketFileDescriptor as Int32, &buffer, Int(buffer.count), 0)
    #endif
    if next <= 0 {
        throw SocketError.recvFailed(Errno.description())
    }
    return Array(buffer.prefix(next))
}
```
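As a usage sketch (the `socket` instance here is hypothetical; only the `read(length:)` signature above is from my change): a single call may return fewer bytes than requested, because recv() is not obliged to fill the whole buffer, so the caller has to check the count of the returned array.

```swift
// Hypothetical caller; `socket` is assumed to be an already-connected Socket.
do {
    let chunk = try socket.read(length: 4096)
    print("received \(chunk.count) bytes")   // may be anywhere from 1 to 4096
} catch {
    print("read failed: \(error)")
}
```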
I also changed `private func readBody(_ socket: Socket, size: Int) throws -> [UInt8]`
to read chunks as big as possible (pasting it with the modifications that were made; please pay attention to the signs on the left side):
```diff
 private func readBody(_ socket: Socket, size: Int) throws -> [UInt8] {
     var body = [UInt8]()
-    for _ in 0..<size { body.append(try socket.read()) }
+    var length = size
+    while length > 0 {
+        let buffer = try socket.read(length: length)
+        body.append(contentsOf: buffer)
+        length -= buffer.count
+    }
     return body
 }
```
I'm wondering why the reading was done one byte at a time. Did I do this correctly, or were you perhaps preventing another issue that may now happen with this new way of getting the data?
So far it works fast and I haven't run into any problems with it yet.
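For reference, here is a minimal sketch of a further tweak I'd consider, assuming the `read(length:)` method above (this is only a sketch, not what is in the diff): preallocating the body array with `reserveCapacity` to avoid repeated reallocations on large bodies, while keeping the same partial-read loop.

```swift
private func readBody(_ socket: Socket, size: Int) throws -> [UInt8] {
    var body = [UInt8]()
    body.reserveCapacity(size)           // avoid repeated reallocations for large bodies
    var remaining = size
    while remaining > 0 {
        // recv() may return fewer bytes than requested,
        // so keep looping until the whole body has arrived.
        let buffer = try socket.read(length: remaining)
        body.append(contentsOf: buffer)
        remaining -= buffer.count
    }
    return body
}
```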