【Question Title】: Writing a large number of records to a CSV file - Not Working
【Posted】: 2016-10-24 18:40:58
【Question Description】:

Background: We are building an application in MuleSoft, and as part of the requirements we have to write a large number of records (around 30K) to a CSV file. Before that, we need to extract the data from DB2. We then apply some transformation/mapping rules, and finally we write the data to a CSV file and transfer that file via FTP. I am attaching the XML configuration.

Problem: After processing only about 2500-2600 records, the process hangs somewhere. It does not throw any error; it just sits there doing nothing. We have tried options like:
1. Running the flow as part of a Mule batch process - no difference observed.
2. Setting max error count = -1, as we found this suggested on a blog somewhere.

Any suggestions would be very helpful. Is there a limit on the number of records when writing to a file?

    <?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:batch="http://www.mulesoft.org/schema/mule/batch" xmlns:db="http://www.mulesoft.org/schema/mule/db"
    xmlns:file="http://www.mulesoft.org/schema/mule/file"
    xmlns:dw="http://www.mulesoft.org/schema/mule/ee/dw" xmlns:metadata="http://www.mulesoft.org/schema/mule/metadata"
    xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
    xmlns:spring="http://www.springframework.org/schema/beans" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/dw http://www.mulesoft.org/schema/mule/ee/dw/current/dw.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd">
    <db:generic-config name="Generic_Database_Configuration1" url="jdbc:db2://faadbcdd0017:60004/MATIUT:user=mat_adm;password=q1w2e3r4;" driverClassName="com.ibm.db2.jcc.DB2Driver" doc:name="Generic Database Configuration"/>
    <file:connector name="File" outputPattern="Carfax.csv" writeToDirectory="C:\opt\CCM\Output\IUT" autoDelete="false" outputAppend="true" streaming="true" validateConnections="true" doc:name="File"/>
    <file:connector name="File1" outputPattern="sample.txt" readFromDirectory="C:\opt\CCM" autoDelete="true" streaming="true" validateConnections="true" doc:name="File"/>
    <batch:job name="batch2Batch">
        <batch:input>
            <logger message="Startr&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;" level="INFO" doc:name="Logger"/>
            <foreach doc:name="For Each">
                <db:select config-ref="Generic_Database_Configuration1" doc:name="Database">
                    <db:parameterized-query><![CDATA[select MSG_ID,TEMPL_ID,MSG_DATA,EMAIL_CHNL_IND,PUSH_CHNL_IND, INSERT_TMSP,UID FROM IUT.message_master WHERE INSERT_TMSP between 
(CURRENT TIMESTAMP- HOUR (CURRENT TIMESTAMP) HOURS- MINUTE(CURRENT TIMESTAMP) MINUTES- SECOND(CURRENT TIMESTAMP) SECONDS
- MICROSECOND(CURRENT TIMESTAMP) MICROSECONDS) and ((CURRENT TIMESTAMP- HOUR (CURRENT TIMESTAMP) HOURS
- MINUTE(CURRENT TIMESTAMP) MINUTES- SECOND(CURRENT TIMESTAMP) SECONDS- MICROSECOND(CURRENT TIMESTAMP) MICROSECONDS) + 1 DAY) 
and SOURCE_SYS='CSS' and  ONLINE_BATCH_IND IN('Y','E') AND APPL_PROCESS_IND = 'N' with UR]]></db:parameterized-query>
                </db:select>
            </foreach>
            <logger message="#[payload]" level="INFO" doc:name="Logger"/>
        </batch:input>
        <batch:process-records>
            <batch:step name="Batch_Step">
                <component class="com.mule.object.transformer.Mapper" doc:name="Java"/>
                <dw:transform-message metadata:id="9bd2e755-065a-4208-95cf-1277f5643ee9" doc:name="Transform Message">
                    <dw:input-payload mimeType="application/java"/>
                    <dw:set-payload><![CDATA[%dw 1.0
%output application/csv separator = "|" , header = false , ignoreEmptyLine = true
---
[{
    Timestamp: payload.timeStamp,
    NotificationType: payload.notificationType,
    UID: payload.UID,
    Name: payload.messageData.firstName,
    MiddleName: payload.messageData.middleName,
    LastName: payload.messageData.lastName,
    Email: payload.messageData.email,
    HHNumber: payload.messageData.cssDataRequest.householdNumber,
    PolicyNumber: payload.messageData.cssDataRequest.policyContractNumber,
    SentDate: payload.messageData.cssDataRequest.sendDate,
    PinNumber: payload.messageData.cssDataRequest.pin,
    AOR: payload.messageData.cssDataRequest.agentOfRecord

}]]]></dw:set-payload>
                </dw:transform-message>
                <file:outbound-endpoint path="C:\opt\CCM\Output\IUT" connector-ref="File" responseTimeout="10000" doc:name="File"/>
            </batch:step>
        </batch:process-records>
        <batch:on-complete>
            <logger message="Batch2 Completed" level="INFO" doc:name="Logger"/>
        </batch:on-complete>
    </batch:job>


</mule>

【Question Comments】:

Tags: file output mule-studio recordset


【Solution 1】:

Try using batch processing. Inside the Batch Step, add a Batch Commit, which can be used to accumulate all the records of the batch. Set the attribute streaming="true" on the Batch Commit block, and make sure your File connector is inside the Batch Commit. Let me know if this helps.
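A minimal sketch of the suggested change, assuming Mule 3.x batch syntax and reusing the step from the question (the Java mapper and DataWeave transform are abbreviated for brevity). The file outbound endpoint moves inside a streaming `<batch:commit>`, so records are accumulated and streamed to the file rather than each record opening the file endpoint individually:

```xml
<batch:step name="Batch_Step">
    <component class="com.mule.object.transformer.Mapper" doc:name="Java"/>
    <dw:transform-message doc:name="Transform Message">
        <!-- same DataWeave CSV transform as in the question -->
    </dw:transform-message>
    <!-- streaming commit: accumulates the step's records and streams them out -->
    <batch:commit streaming="true" doc:name="Batch Commit">
        <file:outbound-endpoint path="C:\opt\CCM\Output\IUT"
            connector-ref="File" responseTimeout="10000" doc:name="File"/>
    </batch:commit>
</batch:step>
```

Inside a streaming commit the payload is a collection of the accumulated records, so depending on where the CSV transformation runs it may need to handle a list rather than a single record.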

【Discussion】:
