Published Fri, 10 Jan 2020 15:51:25 GMT by Raymond Papa
Is there a setting I need to change when uploading a large file?

When the file reaches 50 MB, the web service's UploadDocument function throws a 404 Not Found exception.
Published Fri, 10 Jan 2020 16:10:17 GMT by Joe Kaufman, Bell Laboratories Inc, Senior System Architect

According to the documentation found here:


there are two different upload methods you can use, and one is meant for larger files.

Here is C# code I use to upload files, and I set the threshold at 10 MB:

    public static Document UploadFileToFileCabinet(FileCabinet fileCabinet, string uploadFile, Document indexInfo = null)
    {
        LastError = "";
        try
        {
            // Use the standard upload method for smaller files, but the "Easy" version for larger files;
            // otherwise large file uploads might fail (the "Easy" version is meant for huge file uploads,
            // according to the documentation).
            int largeFileThresholdInBytes = 10 * 1024 * 1024;    // 10 MB
            Document uploadedDoc = null;
            FileInfo fileInfo = new FileInfo(uploadFile);
            if (fileInfo.Length < largeFileThresholdInBytes)
            {
                // Smaller file.
                uploadedDoc = fileCabinet.UploadDocument(indexInfo, fileInfo);
            }
            else
            {
                // Larger file.
                uploadedDoc = fileCabinet.EasyUploadSingleDocument(fileInfo, indexInfo);
            }
            return uploadedDoc;
        }
        catch (Exception ex)
        {
            LastError = ex.Message;
            return null;
        }
    }

The "Easy" method is part of some extensions, though, so I am not sure it has a direct resource endpoint when making API calls from non-.NET environments.

Are you using .NET to access the API? If not, what are you using? I see in my non-.NET code (Visual FoxPro) that if I need to upload a large file, I shell out to the .NET program I wrote to upload files instead of using the straight HTTP endpoint. So I must have run into the same issue you are facing. The only truly reliable way I have found to upload huge files is with the .NET code; with it I have uploaded files greater than 1 GB in size.
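As a rough illustration of that workaround, a non-.NET script can hand the large file off to an external .NET uploader via a shell call. The executable name and its flags below are hypothetical stand-ins, not a real DocuWare tool:

import subprocess

def upload_via_dotnet(uploader_exe, cabinet, path):
    # Hand the file off to an external helper program; "Uploader.exe",
    # "--cabinet", and "--file" are hypothetical names for illustration.
    cmd = [uploader_exe, "--cabinet", cabinet, "--file", path]
    return subprocess.run(cmd, capture_output=True, text=True)

# e.g. result = upload_via_dotnet("Uploader.exe", "Invoices", "bigscan.pdf")
# then check result.returncode to see whether the helper succeeded.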

Joe Kaufman

Published Fri, 10 Jan 2020 16:16:34 GMT by Raymond Papa
I found an upload-with-chunks function, but I wasn't sure what unit the chunk-size setting uses: KB, MB, or just bytes.
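For what it's worth, chunk-size settings in upload APIs are usually plain bytes, so a value intended as megabytes has to be converted explicitly. A small sketch, not specific to DocuWare:

def mb_to_bytes(mb):
    # Chunk-size parameters are typically raw bytes; convert from MB explicitly.
    return mb * 1024 * 1024

print(mb_to_bytes(5))  # 5242880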

I also found: but the setting mentioned isn't in the web.config.

I will give EasyUploadSingleDocument a try.

I'm coding in, but I can convert the code above to VB.
Published Fri, 10 Jan 2020 16:44:31 GMT by Raymond Papa
The easy upload fixed my issue.

I found an IIS setting, #2 in:

Is there more overhead when using EasyUploadSingleDocument vs. UploadDocument?
Published Fri, 10 Jan 2020 16:59:11 GMT by Joe Kaufman, Bell Laboratories Inc, Senior System Architect

I don't know about overhead; all I ever did was time the uploads. The "Easy" method is a bit slower (though not by much, as I recall), which is why I added a threshold so it is only used when a document is over a certain size.

If there is overhead, I imagine it would only be on the client, since that is where the file is split up and sent in smaller chunks. Though I suppose the server has to use a few more resources to wait for the chunks and put the file back together before saving. My knowledge of HTTP and IIS is woefully inadequate to determine whether or not there is a perceptible difference.
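The round trip described above can be sketched in a few lines: the client slices the payload into fixed-size chunks and the server concatenates them back, so the extra work is essentially the slicing and the final join. A toy example with no HTTP involved:

payload = b"x" * 2500          # stand-in for the file body
chunk_size = 1000

# Client side: slice the payload into fixed-size chunks.
chunks = [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]

# Server side: concatenate the chunks back into the original bytes.
reassembled = b"".join(chunks)

assert reassembled == payload
print(len(chunks))  # 3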

Glad it worked for you, in any case, since even if it does take more resources it has to be done to get these large files uploaded!

Joe Kaufman
Published Tue, 21 Apr 2020 05:30:09 GMT by Frédéric Escudero
I have received information about a 30 MB limit with DocuWare Cloud, so if you want to upload larger files you need to do a chunked upload.
The process for sending chunks to DocuWare was quite difficult to understand.
You need to use different headers for your request (see code below). After each chunk is sent, DocuWare's XML response contains a link for sending the next chunk.
Python example:

import os
import time
import xml.etree.ElementTree as ET

def read_in_chunks(file_object, chunk_size=3000000):
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

def UploadBigFile(file, queryurl, session, url):
    content_name, file_extension = os.path.splitext(file)
    content_path = os.path.abspath(file)
    content_size = os.stat(content_path).st_size
    mtime = os.stat(content_path).st_mtime
    print('Transferred file: ' + content_name + file_extension)
    print('File size: ' + str(content_size))
    # Namespace prefixes used in the findall() below; adjust the URIs
    # if they do not match your DocuWare platform's XML schema.
    namespaces = {'s': 'http://dev.docuware.com/schema/public/services',
                  'ns': 'http://dev.docuware.com/schema/public/services/platform'}
    f = open(content_path, 'rb')
    index = 0
    offset = 0
    headers = {}
    for chunk in read_in_chunks(f):
        offset = index + len(chunk)
        headers['X-File-Name'] = content_name + file_extension
        headers['X-File-Type'] = 'application/' + file_extension.lstrip('.')
        headers['X-File-Size'] = str(content_size)
        headers['X-File-ModifiedDate'] = time.strftime('%Y-%m-%dT%H:%M:%SZ', time.gmtime(mtime))
        # Content-Range uses the "bytes" unit and an inclusive end index.
        headers['Content-Range'] = 'bytes %s-%s/%s' % (str(index), str(offset - 1), str(content_size))
        index = offset
        r = session.request("POST", queryurl, data=chunk, headers=headers)
        print("r: %s, data range: %s" % (r, headers['Content-Range']))
        # The XML response to each chunk contains the link for the next one.
        chunkroot = ET.fromstring(r.content)
        for member in chunkroot.findall('ns:FileChunk/s:Links/s:Link', namespaces):
            queryurl = url + member.get('href')
    f.close()
    return r
