AWS S3 Bucket

Data Streamer AWS S3 Integration Guide.

The 1NCE Data Streamer Service can be integrated with AWS S3, an object-based storage solution. The Data Streamer can push CSV files into an S3 bucket, allowing for easy, large-scale data collection and further processing by related AWS services. AWS S3 is integrated using AWS IAM trust relationships. The AWS integration can be set up through the 1NCE Portal.


S3 Filename and Data Format

The S3 integration provides the Event or Usage records through an S3 bucket, where they are uploaded as CSV files. The CSV filenames are "events_YYYYMMDD_HHmmss.csv" for events and "cdr_YYYYMMDD_HHmmss.csv" for usage records. Each file contains a collection of records over a short period. A sample event record file is provided below.

"id","event_start_timestamp","event_stop_timestamp","organisation_id","organisation_name","endpoint_id","sim_id","iccid","imsi","operator_id","operator_name","country_id","operator_country_name","traffic_type_id","traffic_type_description","volume","volume_tx","volume_rx","cost","currency_id","currency_code","currency_symbol","ratezone_tariff_id","ratezone_tariff_name","ratezone_id","ratezone_name","endpoint_name","endpoint_ip_address","endpoint_tags","endpoint_imei","msisdn_msisdn","sim_production_date","operator_mncs","country_mcc"
"4427264xxx","2021-05-11 11:17:25","2021-05-11 11:19:51","19xxx","8100xxxx","9673xxx","1500xxx","89882806660010xxxxx","9014051010xxxxx","4","EPlus","74","Germany","5","Data","0.000741","0.000395","0.000346","0.0007410000","1","EUR","€","442","1NCE Production 01 - 1Mbps","21xx","Rate Zone 1 (DE)","89882806660010xxxxx","x.x.x.x",,"35933907591xxxxx","8822851010xxxxx","2019-01-21 08:45:01","0x","2xx"
"4427320xxx","2021-05-11 11:17:29","2021-05-11 11:24:56","19xxx","8100xxxx","9673xxx","1500xxx","89882806660010xxxxx","9014051010xxxxx","4","EPlus","74","Germany","5","Data","0.003210","0.001803","0.001407","0.0032100000","1","EUR","€","442","1NCE Production 01 - 1Mbps","21xx","Rate Zone 1 (DE)","89882806660010xxxxx","x.x.x.x",,"35933907591xxxxx","8822851010xxxxx","2019-01-21 08:45:01","0x","2xx"

AWS S3 Configuration

The easiest way to set up the stream integration into AWS S3 is by using the CloudFormation template via the 1NCE Portal. For reference, the CloudFormation template used is provided on the 1NCE GitHub page. After completing the steps, the selected record type should show up in the AWS S3 bucket. Please note that this may take some time, and events/usage records need to be generated by the SIMs first. If there are any issues with the setup, please feel free to contact our support.

  1. Open the Portal and navigate to Configuration → Data Streams → Add New Data Stream.
  2. In the popup, select AWS S3 as API Type and select the desired Stream Type.
  3. Click on Create IAM Role to open the CloudFormation template in a separate window.

Pop up in the Portal for creating a new Data Streamer integration.

  4. Adapt the CFN template parameters (Stack Name, S3BucketName). Do NOT change AllowedExternalID and DatastreamerRoleARN.
  5. Set the IAM Creation checkbox.
  6. Execute the CFN stack by clicking on Create Stack.

CloudFormation template used to create the AWS IAM permissions and S3 bucket.
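For automated setups, the same stack can be launched with the AWS SDK instead of clicking through the console. The sketch below is illustrative only: the template URL and all values are placeholders, and the parameter names (S3BucketName, AllowedExternalID, DatastreamerRoleARN) are taken from the steps above.

```python
# Hypothetical values -- replace with the real template and the values
# shown in the Portal popup.
STACK_NAME = "nce-datastreamer-s3"
TEMPLATE_URL = "https://example.com/datastreamer-template.yaml"  # placeholder

def build_parameters(bucket_name, external_id, role_arn):
    """Assemble the CFN parameter list; only S3BucketName should be customized."""
    return [
        {"ParameterKey": "S3BucketName", "ParameterValue": bucket_name},
        # Do NOT change these two values (they come from the Portal popup):
        {"ParameterKey": "AllowedExternalID", "ParameterValue": external_id},
        {"ParameterKey": "DatastreamerRoleARN", "ParameterValue": role_arn},
    ]

def create_stack(bucket_name, external_id, role_arn):
    """Launch the stack; equivalent to clicking Create Stack in the console."""
    import boto3  # deferred so the helpers above work without the AWS SDK installed
    cfn = boto3.client("cloudformation")
    return cfn.create_stack(
        StackName=STACK_NAME,
        TemplateURL=TEMPLATE_URL,
        Parameters=build_parameters(bucket_name, external_id, role_arn),
        Capabilities=["CAPABILITY_NAMED_IAM"],  # the stack creates IAM resources
    )
```

The IAM acknowledgement capability passed to `create_stack` corresponds to the IAM Creation checkbox in the console.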

  7. Wait until the CloudFormation process has ended and all resources have been created. Once the stack has finished successfully, proceed with the following steps.

Finished CloudFormation stack execution.

  8. Go to the Outputs tab of the created CFN stack.
  9. Copy the shown parameters to the popup in the 1NCE Portal.
  10. Click on Save in the popup. The Data Streamer integration will be set up. Please note that this might take a few minutes.

Copy the values from the Outputs tab of the CloudFormation stack to the Portal popup.
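The Outputs values can also be read programmatically once the stack has finished, for example to feed them into automation instead of copying them by hand. A sketch, assuming the stack name chosen earlier:

```python
def outputs_to_dict(outputs):
    """Flatten CloudFormation's Outputs list into a simple key -> value dict."""
    return {o["OutputKey"]: o["OutputValue"] for o in outputs}

def get_stack_outputs(stack_name):
    """Fetch the values shown on the Outputs tab of a finished stack."""
    import boto3  # deferred so outputs_to_dict is usable without the AWS SDK
    cfn = boto3.client("cloudformation")
    stack = cfn.describe_stacks(StackName=stack_name)["Stacks"][0]
    return outputs_to_dict(stack.get("Outputs", []))
```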


Testing AWS S3 Data Streamer

For testing an AWS S3 integration, an IoT device or another mobile device (e.g., a smartphone) with an active 1NCE SIM has to be used to generate Event and Usage records.

Event Records

  1. Place a 1NCE SIM into an IoT device or any other mobile device.
  2. Ensure that the mobile device allows roaming and data connections and that the 1NCE APN is set up correctly.
  3. The attachment to a mobile network will cause a few Event records to be transmitted over the Data Streamer integration.

Usage Records

  1. Place and configure (roaming, APN, data roaming) the 1NCE SIM in a capable mobile device.
  2. For testing the two Usage record types, data and SMS, the following procedures can be executed:
  • SMS usage: Send an MO-SMS from the SIM device, or send an MT-SMS to an active 1NCE SIM using the SMS Console or API.
  • Data usage: Allow data roaming, configure the APN and create a data session. Smartphones will create a data session automatically. Use some data service (e.g., ICMP ping, TCP/UDP traffic, opening a website). Close the data session by deactivating the PDP session or disconnecting the device from the network.
  3. Usage records are only written once data or SMS volume has been actively used. Ensure that the SMS is finalized and delivered. For data usage, the current data session needs to be closed to get an immediate usage record in the Data Streamer.
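On a device that can run scripts, a short burst of UDP traffic is an easy way to generate a small, deterministic amount of data usage. A sketch (host and port are placeholders for any reachable endpoint):

```python
import socket

def generate_data_usage(host, port, payload=b"x" * 512, count=5):
    """Send a few UDP datagrams to produce a known amount of uplink traffic.

    Note: the usage record is only written once the data session is closed,
    so detach or disable data on the device after sending.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sent = 0
        for _ in range(count):
            sent += sock.sendto(payload, (host, port))
        return sent  # total bytes handed to the network stack
    finally:
        sock.close()
```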

AWS S3 Results

Depending on the Data Streamer configuration, the S3 bucket will be filled with CSV files containing the Event or Usage record data from the Data Stream. The CSV filenames are "events_YYYYMMDD_HHmmss.csv" for events and "cdr_YYYYMMDD_HHmmss.csv" for usage records. Each file contains a collection of records over a short period. From the S3 bucket, the received data can be processed using available AWS processing tools.
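The uploaded files can be classified by the filename convention above and fetched with the AWS SDK. A sketch (the bucket name is whatever was chosen during the CloudFormation setup):

```python
import csv
import io

def record_type(key):
    """Classify a Data Streamer CSV by the filename convention described above."""
    name = key.rsplit("/", 1)[-1]
    if name.startswith("events_"):
        return "event"
    if name.startswith("cdr_"):
        return "usage"
    return None

def read_records(bucket, key):
    """Download one CSV file from the bucket and parse it into a list of dicts."""
    import boto3  # deferred; needs credentials with s3:GetObject on the bucket
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))
```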

