Apache Fineract API Integration
Executive Summary
Apache Fineract provides a comprehensive and flexible API integration architecture that allows the platform to interact with external applications through RESTful APIs, WebSocket connections, and batch processing interfaces. The architecture is designed to support integration patterns ranging from simple data synchronization to complex event-driven integrations.
API Integration Architecture
Overall Integration Architecture
1. REST API Integration
Core API Architecture
API Design Patterns
@Slf4j
@RestController
@RequestMapping("/api/v1")
@Validated
public class ApiIntegrationController {
@Autowired
private IntegrationService integrationService;
@Autowired
private ApiClientService apiClientService;
@Autowired
private DataTransformationService transformationService;
@Autowired
private ValidationService validationService;
// Backs the async endpoint below; the service type name is assumed
@Autowired
private AsyncProcessingService asyncProcessingService;
/**
* Generic API endpoint for external integrations
*/
@PostMapping("/integration/{resourceType}")
public ResponseEntity<ApiResponse> processExternalRequest(
@PathVariable String resourceType,
@RequestBody IntegrationRequest request,
@RequestHeader("X-Tenant-ID") String tenantId,
@RequestHeader("X-Api-Key") String apiKey,
@RequestHeader("X-Request-ID") String requestId) {
// Step 1: Validate API key and permissions
if (!validateApiKey(apiKey, tenantId)) {
return ResponseEntity.status(HttpStatus.UNAUTHORIZED)
.body(ApiResponse.error("Invalid API key"));
}
// Step 2: Validate request
ValidationResult validation = validationService.validateRequest(request, resourceType);
if (!validation.isValid()) {
return ResponseEntity.badRequest()
.body(ApiResponse.error("Validation failed", validation.getErrors()));
}
// Step 3: Transform data if needed
TransformedData transformedData = transformationService.transform(request, resourceType);
// Step 4: Process request
try {
IntegrationResult result = integrationService.processExternalRequest(
resourceType, transformedData, tenantId, requestId);
return ResponseEntity.ok()
.header("X-Request-ID", requestId)
.body(ApiResponse.success(result));
} catch (IntegrationException e) {
log.error("Integration error for resource: {}", resourceType, e);
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
.body(ApiResponse.error("Integration processing failed", e.getMessage()));
}
}
/**
* Batch API for high-volume operations
*/
@PostMapping("/integration/batch")
public ResponseEntity<BatchApiResponse> processBatchRequest(
@RequestBody BatchIntegrationRequest batchRequest,
@RequestHeader("X-Tenant-ID") String tenantId,
@RequestHeader("X-Api-Key") String apiKey) {
if (!validateApiKey(apiKey, tenantId)) {
return ResponseEntity.status(HttpStatus.UNAUTHORIZED)
.body(BatchApiResponse.error("Invalid API key"));
}
BatchApiResponse response = new BatchApiResponse();
List<IntegrationResult> results = new ArrayList<>();
List<IntegrationError> errors = new ArrayList<>();
// Process each request in batch
for (IntegrationRequest request : batchRequest.getRequests()) {
try {
IntegrationResult result = integrationService.processExternalRequest(
batchRequest.getResourceType(), request, tenantId);
results.add(result);
} catch (Exception e) {
// Use the integration error code when available, otherwise fall back to a generic code
String errorCode = (e instanceof IntegrationException)
? ((IntegrationException) e).getErrorCode()
: "PROCESSING_ERROR";
IntegrationError error = new IntegrationError(
request.getCorrelationId(),
errorCode,
e.getMessage()
);
errors.add(error);
}
}
response.setResults(results);
response.setErrors(errors);
response.setTotalProcessed(batchRequest.getRequests().size());
response.setSuccessful(results.size());
response.setFailed(errors.size());
return ResponseEntity.ok(response);
}
/**
* Async API for long-running operations
*/
@PostMapping("/integration/async/{resourceType}")
public ResponseEntity<AsyncResponse> processAsyncRequest(
@PathVariable String resourceType,
@RequestBody IntegrationRequest request,
@RequestHeader("X-Tenant-ID") String tenantId,
@RequestHeader("X-Api-Key") String apiKey) {
if (!validateApiKey(apiKey, tenantId)) {
return ResponseEntity.status(HttpStatus.UNAUTHORIZED)
.body(AsyncResponse.error("Invalid API key"));
}
// Submit for async processing
String jobId = asyncProcessingService.submitJob(
resourceType, request, tenantId);
AsyncResponse response = new AsyncResponse();
response.setJobId(jobId);
response.setStatus("SUBMITTED");
response.setEstimatedProcessingTime(300); // 5 minutes
return ResponseEntity.ok()
.header("Location", "/api/v1/integration/async/status/" + jobId)
.body(response);
}
}
// Request/Response Models
@Data
@Builder
public class IntegrationRequest {
private String action; // CREATE, UPDATE, DELETE, QUERY
private Map<String, Object> data;
private String externalId;
private String correlationId;
private Map<String, String> metadata;
private LocalDateTime timestamp;
}
@Data
@Builder
public class ApiResponse<T> {
private boolean success;
private T data;
private String message;
private List<String> errors;
private String requestId;
private LocalDateTime timestamp;
public static <T> ApiResponse<T> success(T data) {
return ApiResponse.<T>builder()
.success(true)
.data(data)
.timestamp(LocalDateTime.now())
.build();
}
public static <T> ApiResponse<T> error(String message) {
return ApiResponse.<T>builder()
.success(false)
.message(message)
.timestamp(LocalDateTime.now())
.build();
}
}
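To illustrate how an external system would call the generic endpoint above, the following sketch posts an IntegrationRequest-style payload with the headers the controller expects. The host name, tenant id, and API key are placeholders, and the snippet assumes the plain JDK HTTP client rather than any Fineract-provided SDK.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.UUID;

public class GenericIntegrationCallExample {
    public static void main(String[] args) throws Exception {
        // Minimal CLIENT creation payload matching the IntegrationRequest model above
        String json = """
            {
              "action": "CREATE",
              "externalId": "ext-42",
              "correlationId": "corr-42",
              "data": { "first_name": "Jane", "last_name": "Doe" }
            }
            """;
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://fineract.example.com/api/v1/integration/CLIENT")) // placeholder host
            .header("Content-Type", "application/json")
            .header("X-Tenant-ID", "default")          // tenant header expected by the controller
            .header("X-Api-Key", "replace-with-key")   // placeholder API key
            .header("X-Request-ID", UUID.randomUUID().toString())
            .POST(HttpRequest.BodyPublishers.ofString(json))
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}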
API Client Implementation
@Service
public class FineractApiClient {
private final RestTemplate restTemplate;
private final IntegrationConfig config;
private final ObjectMapper objectMapper;
private final RetryTemplate retryTemplate;
private final CircuitBreaker circuitBreaker;
@Autowired
public FineractApiClient(RestTemplate restTemplate,
IntegrationConfig config,
ObjectMapper objectMapper,
RetryTemplate retryTemplate) {
this.restTemplate = restTemplate;
this.config = config;
this.objectMapper = objectMapper;
this.retryTemplate = retryTemplate;
this.circuitBreaker = CircuitBreaker.ofDefaults("fineract-api");
}
/**
* Generic API client method with comprehensive error handling
*/
public <T> ApiResult<T> makeApiCall(String endpoint, HttpMethod method,
Object requestData, Class<T> responseType,
String tenantId, String apiKey) {
return circuitBreaker.executeSupplier(() ->
retryTemplate.execute(context -> {
try {
// Build URL
String url = buildApiUrl(endpoint);
// Build headers
HttpHeaders headers = buildHeaders(tenantId, apiKey);
HttpEntity<?> entity = new HttpEntity<>(requestData, headers);
log.debug("Making API call: {} {}", method, url);
// Make request
ResponseEntity<T> response = restTemplate.exchange(
url, method, entity, responseType);
if (response.getStatusCode().is2xxSuccessful()) {
return ApiResult.<T>success(response.getBody())
.withStatusCode(response.getStatusCodeValue());
} else {
return ApiResult.<T>error("API call failed")
.withStatusCode(response.getStatusCodeValue())
.withMessage(response.getStatusCode().getReasonPhrase());
}
} catch (HttpClientErrorException e) {
log.warn("HTTP client error: {} - {}", e.getStatusCode(), e.getMessage());
return ApiResult.<T>error("Client error: " + e.getMessage())
.withStatusCode(e.getStatusCode().value());
} catch (Exception e) {
log.error("Unexpected error during API call", e);
throw new IntegrationException("API call failed", e);
}
})
);
}
/**
* Overload that appends query parameters to the endpoint before delegating to the base call;
* used by the command-style and report calls below
*/
public <T> ApiResult<T> makeApiCall(String endpoint, HttpMethod method,
Object requestData, Class<T> responseType,
String tenantId, String apiKey,
Map<String, String> queryParams) {
String query = queryParams.entrySet().stream()
.map(e -> e.getKey() + "=" + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
.collect(Collectors.joining("&"));
String endpointWithQuery = query.isEmpty() ? endpoint : endpoint + "?" + query;
return makeApiCall(endpointWithQuery, method, requestData, responseType, tenantId, apiKey);
}
/**
* Client management API calls
*/
public ApiResult<ClientData> createClient(CreateClientRequest request, String tenantId) {
return makeApiCall("/clients", HttpMethod.POST, request, ClientData.class,
tenantId, config.getApiKey());
}
public ApiResult<ClientData> updateClient(Long clientId, UpdateClientRequest request,
String tenantId) {
return makeApiCall("/clients/" + clientId, HttpMethod.PUT, request, ClientData.class,
tenantId, config.getApiKey());
}
public ApiResult<ClientData[]> searchClients(String searchText, String tenantId) {
Map<String, String> params = Map.of("searchText", searchText);
return makeApiCall("/clients", HttpMethod.GET, null, ClientData[].class,
tenantId, config.getApiKey(), params);
}
/**
* Loan management API calls
*/
public ApiResult<LoanData> createLoan(CreateLoanRequest request, String tenantId) {
return makeApiCall("/loans", HttpMethod.POST, request, LoanData.class,
tenantId, config.getApiKey());
}
public ApiResult<LoanData> approveLoan(Long loanId, ApproveLoanRequest request,
String tenantId) {
Map<String, String> params = Map.of("command", "approve");
return makeApiCall("/loans/" + loanId, HttpMethod.POST, request, LoanData.class,
tenantId, config.getApiKey(), params);
}
public ApiResult<LoanData> disburseLoan(Long loanId, DisburseLoanRequest request,
String tenantId) {
Map<String, String> params = Map.of("command", "disburse");
return makeApiCall("/loans/" + loanId, HttpMethod.POST, request, LoanData.class,
tenantId, config.getApiKey(), params);
}
public ApiResult<LoanData> processRepayment(Long loanId, LoanRepaymentRequest request,
String tenantId) {
return makeApiCall("/loans/" + loanId + "/transactions", HttpMethod.POST,
request, LoanData.class, tenantId, config.getApiKey());
}
/**
* Savings account API calls
*/
public ApiResult<SavingsAccountData> createSavingsAccount(CreateSavingsRequest request,
String tenantId) {
return makeApiCall("/savingsaccounts", HttpMethod.POST, request, SavingsAccountData.class,
tenantId, config.getApiKey());
}
public ApiResult<SavingsAccountData> depositSavings(Long accountId,
SavingsTransactionRequest request,
String tenantId) {
return makeApiCall("/savingsaccounts/" + accountId + "/transactions", HttpMethod.POST,
request, SavingsAccountData.class, tenantId, config.getApiKey());
}
/**
* Batch processing API calls
*/
public ApiResult<BatchResponse> executeBatch(BatchRequest batchRequest, String tenantId) {
return makeApiCall("/batch", HttpMethod.POST, batchRequest, BatchResponse.class,
tenantId, config.getApiKey());
}
/**
* Report generation API calls
*/
public ApiResult<String> generateReport(String reportName, Map<String, String> parameters,
String tenantId) {
return makeApiCall("/reports/" + reportName, HttpMethod.GET, null, String.class,
tenantId, config.getApiKey(), parameters);
}
private String buildApiUrl(String endpoint) {
String baseUrl = config.getBaseUrl();
if (!baseUrl.endsWith("/")) {
baseUrl += "/";
}
return baseUrl + "v1/" + endpoint.replaceFirst("^/", "");
}
private HttpHeaders buildHeaders(String tenantId, String apiKey) {
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.APPLICATION_JSON);
headers.set("X-Tenant-ID", tenantId);
headers.set("X-API-Key", apiKey);
headers.set("User-Agent", "Fineract-Integration-Client/1.0");
headers.set("Accept", "application/json");
// Add correlation ID for tracking
headers.set("X-Correlation-ID", generateCorrelationId());
return headers;
}
private String generateCorrelationId() {
return UUID.randomUUID().toString();
}
}
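The client above expects a RetryTemplate bean and wraps outbound calls in a Resilience4j CircuitBreaker created with defaults. A minimal configuration sketch, assuming Spring Retry and Resilience4j are on the classpath; the timeout, back-off, and threshold values are illustrative rather than Fineract defaults.
@Configuration
public class IntegrationClientConfig {

    // RetryTemplate with exponential back-off; values are illustrative
    @Bean
    public RetryTemplate retryTemplate() {
        return RetryTemplate.builder()
            .maxAttempts(3)
            .exponentialBackoff(500, 2.0, 5000)   // start at 500 ms, double, cap at 5 s
            .retryOn(ResourceAccessException.class)
            .build();
    }

    // Shared RestTemplate with explicit connect/read timeouts
    @Bean
    public RestTemplate restTemplate() {
        SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
        requestFactory.setConnectTimeout(5000);
        requestFactory.setReadTimeout(30000);
        return new RestTemplate(requestFactory);
    }

    // Custom circuit breaker settings as an alternative to CircuitBreaker.ofDefaults
    @Bean
    public CircuitBreaker fineractApiCircuitBreaker() {
        CircuitBreakerConfig config = CircuitBreakerConfig.custom()
            .failureRateThreshold(50)
            .waitDurationInOpenState(Duration.ofSeconds(30))
            .slidingWindowSize(20)
            .build();
        return CircuitBreaker.of("fineract-api", config);
    }
}
Defining the circuit breaker as a bean instead of calling CircuitBreaker.ofDefaults in the constructor makes the thresholds configurable and shareable across clients.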
Data Transformation Service
@Service
public class DataTransformationService {
private final ObjectMapper objectMapper;
private final Map<String, Transformer> transformers;
@Autowired
public DataTransformationService(ObjectMapper objectMapper) {
this.objectMapper = objectMapper;
this.transformers = new HashMap<>();
registerTransformers();
}
private void registerTransformers() {
transformers.put("CLIENT", new ClientDataTransformer());
transformers.put("LOAN", new LoanDataTransformer());
transformers.put("SAVINGS_ACCOUNT", new SavingsAccountTransformer());
transformers.put("PAYMENT", new PaymentTransformer());
transformers.put("REPORT", new ReportTransformer());
}
public TransformedData transform(IntegrationRequest request, String resourceType) {
Transformer transformer = transformers.get(resourceType.toUpperCase());
if (transformer == null) {
throw new IllegalArgumentException("No transformer found for resource type: " + resourceType);
}
try {
return transformer.transform(request);
} catch (Exception e) {
throw new DataTransformationException(
"Failed to transform data for resource type: " + resourceType, e);
}
}
/**
* Client Data Transformer
*/
public static class ClientDataTransformer implements Transformer {
@Override
public TransformedData transform(IntegrationRequest request) {
Map<String, Object> inputData = request.getData();
CreateClientRequest clientRequest = CreateClientRequest.builder()
.firstName(getStringValue(inputData, "first_name"))
.lastName(getStringValue(inputData, "last_name"))
.email(getStringValue(inputData, "email"))
.mobileNumber(getStringValue(inputData, "mobile_number"))
.address(getStringValue(inputData, "address"))
.city(getStringValue(inputData, "city"))
.state(getStringValue(inputData, "state"))
.country(getStringValue(inputData, "country"))
.nationalId(getStringValue(inputData, "national_id"))
.dateOfBirth(parseDate(getStringValue(inputData, "date_of_birth")))
.gender(getStringValue(inputData, "gender"))
.build();
return TransformedData.builder()
.transformedObject(clientRequest)
.sourceFields(inputData.keySet())
.targetFields(Arrays.asList("firstName", "lastName", "email", "mobileNumber"))
.build();
}
}
/**
* Loan Data Transformer
*/
public static class LoanDataTransformer implements Transformer {
@Override
public TransformedData transform(IntegrationRequest request) {
Map<String, Object> inputData = request.getData();
CreateLoanRequest loanRequest = CreateLoanRequest.builder()
.clientId(getLongValue(inputData, "client_id"))
.productId(getLongValue(inputData, "product_id"))
.principal(getBigDecimalValue(inputData, "principal_amount"))
.termInMonths(getIntegerValue(inputData, "term_in_months"))
.interestRatePerPeriod(getBigDecimalValue(inputData, "interest_rate_per_period"))
.interestCalculationPeriodType(getStringValue(inputData, "interest_calculation_period"))
.repaymentFrequencyType(getStringValue(inputData, "repayment_frequency_type"))
.repaymentEvery(getIntegerValue(inputData, "repayment_every"))
.expectedDisbursementDate(parseDate(getStringValue(inputData, "expected_disbursement_date")))
.graceOnPrincipalPayment(getIntegerValue(inputData, "grace_on_principal_payment"))
.graceOnInterestPayment(getIntegerValue(inputData, "grace_on_interest_payment"))
.build();
return TransformedData.builder()
.transformedObject(loanRequest)
.sourceFields(inputData.keySet())
.targetFields(Arrays.asList("clientId", "principal", "interestRatePerPeriod"))
.build();
}
}
// Utility methods
private static String getStringValue(Map<String, Object> data, String key) {
Object value = data.get(key);
return value != null ? value.toString() : null;
}
private static Long getLongValue(Map<String, Object> data, String key) {
String value = getStringValue(data, key);
return value != null ? Long.valueOf(value) : null;
}
private static Integer getIntegerValue(Map<String, Object> data, String key) {
String value = getStringValue(data, key);
return value != null ? Integer.valueOf(value) : null;
}
private static BigDecimal getBigDecimalValue(Map<String, Object> data, String key) {
String value = getStringValue(data, key);
return value != null ? new BigDecimal(value) : null;
}
private static LocalDate parseDate(String dateString) {
if (dateString == null) return null;
return LocalDate.parse(dateString, DateTimeFormatter.ISO_DATE);
}
}
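The Transformer contract and the TransformedData holder referenced above are not shown in the listing. A minimal sketch consistent with how they are used could look like this; any field beyond those actually referenced above is an assumption.
public interface Transformer {
    TransformedData transform(IntegrationRequest request);
}

@Data
@Builder
public class TransformedData {
    // The resource-specific request object produced by the transformer (e.g. CreateClientRequest)
    private Object transformedObject;
    // Field names read from the incoming payload
    private Set<String> sourceFields;
    // Field names populated on the target request
    private List<String> targetFields;
}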
2. WebSocket Integration
Real-time Communication Setup
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketIntegrationConfig implements WebSocketMessageBrokerConfigurer {
@Override
public void configureMessageBroker(MessageBrokerRegistry config) {
config.setApplicationDestinationPrefixes("/app");
config.setUserDestinationPrefix("/user");
// Simple in-memory broker for customer-facing clients
config.enableSimpleBroker("/topic", "/queue");
// For enterprise integrations, relay to an external STOMP broker instead:
// config.enableStompBrokerRelay("/topic", "/queue").setTcpClient(createTcpClient());
}
@Override
public void registerStompEndpoints(StompEndpointRegistry registry) {
registry.addEndpoint("/integration-ws")
.setAllowedOriginPatterns("*")
.withSockJS()
.setHeartbeatTime(30000)
.setDisconnectDelay(30000);
registry.addEndpoint("/enterprise-ws")
.setAllowedOriginPatterns("https://*.enterprise.com")
.addInterceptors(new EnterpriseIntegrationInterceptor());
}
@Override
public void configureClientInboundChannel(ChannelRegistration registration) {
registration.interceptors(new IntegrationClientInterceptor());
}
private ReactorNettyTcpClient<byte[]> createTcpClient() {
// Connects to an external STOMP broker (host and port are illustrative)
return new ReactorNettyTcpClient<>("message-broker", 61613, new StompReactorNettyCodec());
}
}
@Slf4j
@Component
public class IntegrationWebSocketHandler {
@Autowired
private SimpMessagingTemplate messagingTemplate;
@Autowired
private AuthenticationManager authenticationManager;
@Autowired
private TenantService tenantService;
// Tracks active subscriptions; used by the methods below
@Autowired
private SubscriptionManager subscriptionManager;
/**
* Handle WebSocket connections from external systems
*/
@MessageMapping("/integration/subscribe")
public void handleSubscription(Principal principal,
MessageHeaders headers,
SubscribeMessage message) {
String userId = ((Authentication) principal).getName();
String sessionId = getSessionId(headers);
String tenantId = getTenantId(headers);
// Validate tenant access
if (!tenantService.isUserAllowedInTenant(userId, tenantId)) {
throw new UnauthorizedException("User not authorized for tenant: " + tenantId);
}
// Register subscription
UserSubscription subscription = UserSubscription.builder()
.userId(userId)
.sessionId(sessionId)
.tenantId(tenantId)
.subscription(message.getDestination())
.filters(message.getFilters())
.build();
subscriptionManager.addSubscription(subscription);
// Send initial data
sendInitialData(subscription);
log.info("User {} subscribed to {} for tenant {}", userId,
message.getDestination(), tenantId);
}
/**
* Send real-time updates to external systems
*/
public void sendRealTimeUpdate(RealTimeUpdate update) {
String destination = "/topic/integration/" + update.getEntityType();
IntegrationMessage message = IntegrationMessage.builder()
.eventType(update.getEventType())
.entityId(update.getEntityId())
.entityType(update.getEntityType())
.data(update.getData())
.tenantId(update.getTenantId())
.timestamp(Instant.now())
.correlationId(update.getCorrelationId())
.build();
// Send to all subscribed sessions for this entity
Set<UserSubscription> subscriptions = subscriptionManager
.getSubscriptions(update.getEntityType(), update.getEntityId());
for (UserSubscription subscription : subscriptions) {
if (matchesFilters(subscription.getFilters(), update)) {
messagingTemplate.convertAndSendToUser(
subscription.getUserId(), destination, message);
}
}
}
/**
* Handle loan status changes
*/
@EventListener
public void handleLoanStatusChange(LoanStatusChangeEvent event) {
RealTimeUpdate update = RealTimeUpdate.builder()
.eventType("LOAN_STATUS_CHANGED")
.entityType("LOAN")
.entityId(event.getLoanId())
.data(Map.of(
"oldStatus", event.getOldStatus(),
"newStatus", event.getNewStatus(),
"changeDate", event.getChangeDate().toString(),
"changedBy", event.getChangedBy()
))
.tenantId(event.getTenantId())
.correlationId(event.getCorrelationId())
.build();
sendRealTimeUpdate(update);
}
/**
* Handle payment processing
*/
@EventListener
public void handlePaymentProcessed(PaymentProcessedEvent event) {
RealTimeUpdate update = RealTimeUpdate.builder()
.eventType("PAYMENT_PROCESSED")
.entityType("PAYMENT")
.entityId(event.getPaymentId())
.data(Map.of(
"loanId", event.getLoanId(),
"clientId", event.getClientId(),
"amount", event.getAmount(),
"paymentDate", event.getPaymentDate().toString(),
"paymentType", event.getPaymentType()
))
.tenantId(event.getTenantId())
.correlationId(event.getCorrelationId())
.build();
sendRealTimeUpdate(update);
}
}
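The handler relies on a SubscriptionManager and a UserSubscription model that are not shown above. A minimal in-memory sketch consistent with that usage; the keying scheme and the map-based filter representation are assumptions.
@Data
@Builder
public class UserSubscription {
    private String userId;
    private String sessionId;
    private String tenantId;
    private String subscription;           // STOMP destination subscribed to
    private Map<String, Object> filters;   // optional client-side filters
}

@Component
public class SubscriptionManager {
    // Keyed by destination; values are the active subscriptions for that destination
    private final Map<String, Set<UserSubscription>> subscriptions = new ConcurrentHashMap<>();

    public void addSubscription(UserSubscription subscription) {
        subscriptions
            .computeIfAbsent(subscription.getSubscription(), k -> ConcurrentHashMap.newKeySet())
            .add(subscription);
    }

    public Set<UserSubscription> getSubscriptions(String entityType, Object entityId) {
        // Entity-level filtering is left to matchesFilters(...) in the handler
        return subscriptions.getOrDefault("/topic/integration/" + entityType, Set.of());
    }

    public void removeSession(String sessionId) {
        subscriptions.values().forEach(set ->
            set.removeIf(sub -> sub.getSessionId().equals(sessionId)));
    }
}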
WebSocket Client Implementation
// JavaScript WebSocket client for external integration
class FineractWebSocketClient {
constructor(baseUrl, apiKey, tenantId) {
this.baseUrl = baseUrl;
this.apiKey = apiKey;
this.tenantId = tenantId;
this.socket = null;
this.subscriptions = new Map();
this.handlers = {};
this.reconnectAttempts = 0;
this.maxReconnectAttempts = 5;
this.reconnectDelay = 1000;
}
connect() {
const url = `${this.baseUrl}/integration-ws?tenant=${this.tenantId}`;
this.socket = new SockJS(url);
this.socket.onopen = () => {
console.log('Connected to Fineract WebSocket');
this.reconnectAttempts = 0;
this.authenticate();
};
this.socket.onmessage = (event) => {
this.handleMessage(JSON.parse(event.data));
};
this.socket.onclose = (event) => {
console.log('Disconnected from Fineract WebSocket');
this.attemptReconnect();
};
this.socket.onerror = (error) => {
console.error('WebSocket error:', error);
};
}
authenticate() {
const authMessage = {
command: 'AUTH',
apiKey: this.apiKey,
tenantId: this.tenantId
};
this.socket.send(JSON.stringify(authMessage));
}
subscribe(entityType, entityId, filters = {}) {
const subscriptionId = `${entityType}:${entityId}`;
const subscribeMessage = {
command: 'SUBSCRIBE',
destination: `/topic/integration/${entityType}`,
entityType: entityType,
entityId: entityId,
filters: filters
};
this.socket.send(JSON.stringify(subscribeMessage));
this.subscriptions.set(subscriptionId, { entityType, entityId, filters });
}
unsubscribe(entityType, entityId) {
const subscriptionId = `${entityType}:${entityId}`;
const unsubscribeMessage = {
command: 'UNSUBSCRIBE',
destination: `/topic/integration/${entityType}`,
entityType: entityType,
entityId: entityId
};
this.socket.send(JSON.stringify(unsubscribeMessage));
this.subscriptions.delete(subscriptionId);
}
handleMessage(message) {
switch (message.eventType) {
case 'AUTH_SUCCESS':
console.log('Authentication successful');
this.resubscribeAll();
break;
case 'AUTH_FAILED':
console.error('Authentication failed:', message.error);
this.disconnect();
break;
case 'LOAN_STATUS_CHANGED':
this.onLoanStatusChanged(message);
break;
case 'PAYMENT_PROCESSED':
this.onPaymentProcessed(message);
break;
case 'CLIENT_UPDATED':
this.onClientUpdated(message);
break;
case 'SAVINGS_ACCOUNT_UPDATED':
this.onSavingsAccountUpdated(message);
break;
default:
console.log('Unknown event type:', message.eventType);
}
}
onLoanStatusChanged(message) {
// Handle loan status change
console.log('Loan status changed:', message);
// Emit event to application code (emit already invokes the registered handlers)
this.emit('loanStatusChanged', message);
}
onPaymentProcessed(message) {
// Handle payment processing
console.log('Payment processed:', message);
this.emit('paymentProcessed', message);
}
// Event emitter pattern
on(eventName, handler) {
if (!this.handlers[eventName]) {
this.handlers[eventName] = [];
}
this.handlers[eventName].push(handler);
}
emit(eventName, data) {
if (this.handlers[eventName]) {
this.handlers[eventName].forEach(handler => handler(data));
}
}
attemptReconnect() {
if (this.reconnectAttempts < this.maxReconnectAttempts) {
this.reconnectAttempts++;
setTimeout(() => {
console.log(`Reconnecting... Attempt ${this.reconnectAttempts}`);
this.connect();
}, this.reconnectDelay * this.reconnectAttempts);
} else {
console.error('Max reconnection attempts reached');
this.emit('maxReconnectAttemptsReached');
}
}
resubscribeAll() {
this.subscriptions.forEach((subscription, id) => {
this.subscribe(subscription.entityType, subscription.entityId, subscription.filters);
});
}
disconnect() {
if (this.socket) {
this.socket.close();
this.socket = null;
}
this.subscriptions.clear();
}
}
// Usage example
const client = new FineractWebSocketClient(
'https://api.fineract.com',
'your-api-key',
'tenant-123'
);
// Connect
client.connect();
// Set up event handlers
client.on('loanStatusChanged', (message) => {
console.log('Loan status changed:', message);
// Update UI
updateLoanStatus(message.entityId, message.data.newStatus);
// Send notification
showNotification(`Loan ${message.entityId} status changed to ${message.data.newStatus}`);
});
client.on('paymentProcessed', (message) => {
console.log('Payment processed:', message);
// Update balance
updateLoanBalance(message.data.loanId, message.data.newBalance);
// Log transaction
logTransaction(message);
});
// Subscribe to specific entities
client.subscribe('LOAN', 12345);
client.subscribe('CLIENT', 67890);
// Subscribe with filters
client.subscribe('PAYMENT', null, {
loanId: 12345,
amountGreaterThan: 1000
});
3. Batch Processing Integration
Batch Operations Implementation
@Service
public class BatchIntegrationService {
@Autowired
private ExecutorService batchExecutor;
@Autowired
private BatchRepository batchRepository;
@Autowired
private IntegrationMetrics metrics;
// Used by processBatchAsync below
@Autowired
private DataTransformationService transformationService;
@Autowired
private IntegrationService integrationService;
/**
* Process large batch of integration requests
*/
public BatchJobResult processBatch(BatchIntegrationRequest request) {
BatchJob job = BatchJob.builder()
.requestId(request.getRequestId())
.tenantId(request.getTenantId())
.totalItems(request.getItems().size())
.status(BatchStatus.PENDING)
.startedAt(Instant.now())
.build();
batchRepository.save(job);
// Assign a job id and mark the job as processing before submitting it for async work
job.setJobId(UUID.randomUUID().toString());
job.setStatus(BatchStatus.PROCESSING);
batchRepository.save(job);
CompletableFuture.supplyAsync(() -> processBatchAsync(job, request), batchExecutor);
return BatchJobResult.builder()
.jobId(job.getJobId())
.status("SUBMITTED")
.estimatedDuration(Duration.ofSeconds(request.getItems().size() / 100))
.build();
}
private BatchJobResult processBatchAsync(BatchJob job, BatchIntegrationRequest request) {
List<BatchItemResult> results = new ArrayList<>();
List<BatchItemError> errors = new ArrayList<>();
int processed = 0;
int successes = 0;
int failures = 0;
// Process items in chunks for better performance
List<List<IntegrationRequest>> chunks = splitIntoChunks(request.getItems(), 100);
for (List<IntegrationRequest> chunk : chunks) {
// Process chunk in parallel
List<CompletableFuture<BatchItemResult>> futures = chunk.stream()
.map(item -> CompletableFuture.supplyAsync(() -> processBatchItem(item, request)))
.collect(Collectors.toList());
// Wait for chunk to complete
CompletableFuture<Void> allOf = CompletableFuture.allOf(
futures.toArray(new CompletableFuture[0]));
allOf.join();
// Collect results
for (CompletableFuture<BatchItemResult> future : futures) {
BatchItemResult result = future.join();
if (result.isSuccess()) {
results.add(result);
successes++;
} else {
errors.add(result.getError());
failures++;
}
processed++;
// Update progress
updateBatchProgress(job.getJobId(), processed, job.getTotalItems());
}
}
// Finalize job
job.setStatus(BatchStatus.COMPLETED);
job.setCompletedAt(Instant.now());
job.setTotalProcessed(processed);
job.setSuccessful(successes);
job.setFailed(failures);
batchRepository.save(job);
// Update metrics
metrics.recordBatchProcessed(job.getTotalItems(),
Duration.between(job.getStartedAt(), job.getCompletedAt()));
return BatchJobResult.builder()
.jobId(job.getJobId())
.status("COMPLETED")
.totalItems(job.getTotalItems())
.successful(successes)
.failed(failures)
.processed(processed)
.duration(Duration.between(job.getStartedAt(), job.getCompletedAt()))
.results(results)
.errors(errors)
.build();
}
private BatchItemResult processBatchItem(IntegrationRequest item, BatchIntegrationRequest request) {
try {
// Transform data
TransformedData transformedData = transformationService
.transform(item, request.getResourceType());
// Process request
IntegrationResult result = integrationService
.processExternalRequest(request.getResourceType(), transformedData,
request.getTenantId(), item.getCorrelationId());
return BatchItemResult.builder()
.correlationId(item.getCorrelationId())
.success(true)
.result(result)
.build();
} catch (Exception e) {
BatchItemError error = BatchItemError.builder()
.correlationId(item.getCorrelationId())
.errorCode("PROCESSING_ERROR")
.errorMessage(e.getMessage())
.itemData(item)
.build();
return BatchItemResult.builder()
.correlationId(item.getCorrelationId())
.success(false)
.error(error)
.build();
}
}
/**
* Check batch job status
*/
public BatchJobStatus getBatchStatus(String jobId) {
Optional<BatchJob> job = batchRepository.findByJobId(jobId);
if (job.isEmpty()) {
throw new JobNotFoundException("Batch job not found: " + jobId);
}
return BatchJobStatus.builder()
.jobId(jobId)
.status(job.get().getStatus().name())
.totalItems(job.get().getTotalItems())
.processed(job.get().getTotalProcessed())
.successful(job.get().getSuccessful())
.failed(job.get().getFailed())
.startedAt(job.get().getStartedAt())
.completedAt(job.get().getCompletedAt())
.progress(calculateProgress(job.get()))
.estimatedTimeRemaining(estimateTimeRemaining(job.get()))
.build();
}
/**
* Download batch results
*/
public BatchDownloadResult downloadResults(String jobId, String format) {
Optional<BatchJob> job = batchRepository.findByJobId(jobId);
if (job.isEmpty() || job.get().getStatus() != BatchStatus.COMPLETED) {
throw new JobNotReadyException("Batch job not completed");
}
List<BatchItemResult> results = batchRepository.getBatchResults(jobId);
return switch (format) {
case "CSV" -> exportToCsv(results);
case "JSON" -> exportToJson(results);
case "EXCEL" -> exportToExcel(results);
default -> throw new UnsupportedOperationException("Format not supported: " + format);
};
}
private <T> List<List<T>> splitIntoChunks(List<T> list, int chunkSize) {
List<List<T>> chunks = new ArrayList<>();
for (int i = 0; i < list.size(); i += chunkSize) {
chunks.add(list.subList(i, Math.min(i + chunkSize, list.size())));
}
return chunks;
}
}
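A caller-side sketch of the batch flow above: submit the batch, then poll getBatchStatus until the job completes. The polling interval, resource type, and request construction are illustrative, not prescribed values.
@Service
public class BatchSubmissionExample {

    @Autowired
    private BatchIntegrationService batchIntegrationService;

    public BatchJobStatus submitAndWait(List<IntegrationRequest> items, String tenantId)
            throws InterruptedException {
        BatchIntegrationRequest request = BatchIntegrationRequest.builder()
            .requestId(UUID.randomUUID().toString())
            .resourceType("CLIENT")
            .items(items)
            .tenantId(tenantId)
            .build();

        // Submission returns immediately with a job id
        BatchJobResult submission = batchIntegrationService.processBatch(request);

        // Simplified polling loop: waits only for COMPLETED; a real caller would also handle failures
        BatchJobStatus status;
        do {
            Thread.sleep(2000);
            status = batchIntegrationService.getBatchStatus(submission.getJobId());
        } while (!"COMPLETED".equals(status.getStatus()));
        return status;
    }
}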
Bulk Data Import/Export
@RestController
@RequestMapping("/api/v1/integration/bulk")
public class BulkIntegrationController {
@Autowired
private BulkImportService bulkImportService;
@Autowired
private BulkExportService bulkExportService;
@PostMapping("/import/{dataType}")
public ResponseEntity<ImportJobResult> importData(
@PathVariable String dataType,
@RequestParam("file") MultipartFile file,
@RequestParam(value = "format", defaultValue = "CSV") String format,
@RequestHeader("X-Tenant-ID") String tenantId,
@RequestHeader("X-Api-Key") String apiKey) {
if (!validateApiKey(apiKey, tenantId)) {
return ResponseEntity.status(HttpStatus.UNAUTHORIZED)
.body(ImportJobResult.error("Invalid API key"));
}
try {
ImportJobResult result = bulkImportService.importData(
dataType, file, format, tenantId);
return ResponseEntity.accepted()
.body(result);
} catch (Exception e) {
log.error("Bulk import failed for data type: {}", dataType, e);
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
.body(ImportJobResult.error("Import failed: " + e.getMessage()));
}
}
@GetMapping("/export/{dataType}")
public ResponseEntity<Resource> exportData(
@PathVariable String dataType,
@RequestParam Map<String, String> parameters,
@RequestParam(defaultValue = "CSV") String format,
@RequestParam(defaultValue = "10000") int limit,
@RequestHeader("X-Tenant-ID") String tenantId,
@RequestHeader("X-Api-Key") String apiKey) {
if (!validateApiKey(apiKey, tenantId)) {
return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build();
}
try {
ExportResult result = bulkExportService.exportData(
dataType, parameters, format, limit, tenantId);
return ResponseEntity.ok()
.header("Content-Disposition", "attachment; filename=\"" + result.getFilename() + "\"")
.contentType(MediaType.parseMediaType(result.getContentType()))
.body(result.getResource());
} catch (Exception e) {
log.error("Bulk export failed for data type: {}", dataType, e);
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
}
}
}
@Service
public class BulkImportService {
@Autowired
private FileProcessorFactory fileProcessorFactory;
@Autowired
private DataValidationService validationService;
@Autowired
private BatchProcessor batchProcessor;
public ImportJobResult importData(String dataType, MultipartFile file,
String format, String tenantId) {
ImportJob job = ImportJob.builder()
.id(generateJobId())
.dataType(dataType)
.filename(file.getOriginalFilename())
.format(format)
.tenantId(tenantId)
.status(ImportStatus.PROCESSING)
.startedAt(Instant.now())
.build();
try {
// Process file based on format
FileProcessor processor = fileProcessorFactory.getProcessor(format);
List<IntegrationRequest> records = processor.parseFile(file);
job.setTotalRecords(records.size());
// Validate records
ValidationResult validation = validationService.validateBulkData(records, dataType);
if (!validation.isValid()) {
job.setStatus(ImportStatus.FAILED);
job.setErrorMessage("Validation failed: " + validation.getErrorMessage());
return ImportJobResult.error("Validation failed", validation);
}
job.setValidRecords(validation.getValidRecords().size());
job.setInvalidRecords(validation.getInvalidRecords().size());
// Process valid records
if (!validation.getValidRecords().isEmpty()) {
BatchJobResult result = batchProcessor.processBatch(
BatchIntegrationRequest.builder()
.resourceType(dataType)
.items(validation.getValidRecords())
.tenantId(tenantId)
.build());
job.setProcessedRecords(result.getSuccessful());
job.setFailedRecords(result.getFailed());
}
job.setStatus(ImportStatus.COMPLETED);
job.setCompletedAt(Instant.now());
return ImportJobResult.builder()
.jobId(job.getId())
.status("COMPLETED")
.totalRecords(job.getTotalRecords())
.processedRecords(job.getProcessedRecords())
.failedRecords(job.getFailedRecords())
.duration(Duration.between(job.getStartedAt(), job.getCompletedAt()))
.build();
} catch (Exception e) {
job.setStatus(ImportStatus.FAILED);
job.setErrorMessage(e.getMessage());
throw new ImportException("Import failed", e);
}
}
}
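The FileProcessor abstraction used above is not shown; the factory is assumed to map a format string such as "CSV" to one of these processors. A minimal CSV-oriented sketch consistent with that usage, with header handling and column mapping deliberately simplified.
public interface FileProcessor {
    List<IntegrationRequest> parseFile(MultipartFile file) throws IOException;
}

@Component
public class CsvFileProcessor implements FileProcessor {

    @Override
    public List<IntegrationRequest> parseFile(MultipartFile file) throws IOException {
        List<IntegrationRequest> records = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(file.getInputStream(), StandardCharsets.UTF_8))) {
            String headerLine = reader.readLine();
            if (headerLine == null) {
                return records;
            }
            String[] headers = headerLine.split(",");
            String line;
            while ((line = reader.readLine()) != null) {
                String[] values = line.split(",");
                // Map each column onto the generic data map consumed by the transformers
                Map<String, Object> data = new LinkedHashMap<>();
                for (int i = 0; i < headers.length && i < values.length; i++) {
                    data.put(headers[i].trim(), values[i].trim());
                }
                records.add(IntegrationRequest.builder()
                    .action("CREATE")
                    .data(data)
                    .correlationId(UUID.randomUUID().toString())
                    .timestamp(LocalDateTime.now())
                    .build());
            }
        }
        return records;
    }
}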
4. Message Queue Integration
Event-Driven Integration
@Configuration
@EnableJms
public class IntegrationMessageQueueConfig {
@Bean
public JmsListenerContainerFactory<?> jmsListenerContainerFactory() {
DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
factory.setConnectionFactory(connectionFactory());
factory.setConcurrency("5-10");
factory.setSessionAcknowledgeMode(Session.CLIENT_ACKNOWLEDGE);
return factory;
}
@Bean
public JmsTemplate jmsTemplate() {
return new JmsTemplate(connectionFactory());
}
@Bean
public Queue integrationQueue() {
return new ActiveMQQueue("fineract.integration.queue");
}
@Bean
public Topic integrationTopic() {
return new ActiveMQTopic("fineract.integration.topic");
}
}
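The configuration above references a connectionFactory() bean that is not shown. A minimal sketch of such a bean inside IntegrationMessageQueueConfig, assuming the classic ActiveMQ (javax.jms) client that the ActiveMQQueue/ActiveMQTopic beans imply; the broker URL and trusted package are placeholders.
@Bean
public ConnectionFactory connectionFactory() {
    // Placeholder broker URL; in practice this comes from configuration
    ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory("tcp://message-broker:61616");
    // Needed when POJO payloads are sent as ObjectMessage; package name is illustrative
    factory.setTrustedPackages(List.of("org.apache.fineract.integration"));
    return new CachingConnectionFactory(factory);
}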
@Component
public class IntegrationMessageProducer {
@Autowired
private JmsTemplate jmsTemplate;
@Autowired
private IntegrationConfig config;
public void sendIntegrationMessage(IntegrationMessage message) {
try {
jmsTemplate.convertAndSend("fineract.integration.queue", message);
log.debug("Sent integration message: {}", message.getType());
} catch (Exception e) {
log.error("Failed to send integration message", e);
throw new MessageSendingException("Failed to send message", e);
}
}
public void publishIntegrationEvent(IntegrationEvent event) {
try {
jmsTemplate.convertAndSend("fineract.integration.topic", event);
log.debug("Published integration event: {}", event.getType());
} catch (Exception e) {
log.error("Failed to publish integration event", e);
throw new MessagePublishingException("Failed to publish event", e);
}
}
/**
* Send notification to external CRM system
*/
public void notifyCRM(CRMNotification notification) {
IntegrationMessage message = IntegrationMessage.builder()
.type("CRM_NOTIFICATION")
.targetSystem("CRM")
.tenantId(notification.getTenantId())
.payload(notification)
.timestamp(Instant.now())
.correlationId(notification.getCorrelationId())
.priority(Priority.HIGH)
.build();
sendIntegrationMessage(message);
}
/**
* Publish loan creation event
*/
public void publishLoanCreatedEvent(LoanCreatedEvent event) {
IntegrationEvent integrationEvent = IntegrationEvent.builder()
.type("LOAN_CREATED")
.source("fineract")
.tenantId(event.getTenantId())
.entityType("LOAN")
.entityId(event.getLoanId())
.data(event)
.timestamp(Instant.now())
.correlationId(event.getCorrelationId())
.build();
publishIntegrationEvent(integrationEvent);
}
}
@Slf4j
@Component
public class IntegrationMessageConsumer {
@Autowired
private ExternalSystemService externalSystemService;
@Autowired
private IntegrationMetrics metrics;
// Collaborators used by the failure handling below (type names assumed)
@Autowired
private RetryScheduler retryScheduler;
@Autowired
private DeadLetterQueueService deadLetterQueue;
@JmsListener(destination = "fineract.integration.queue")
public void handleIntegrationMessage(IntegrationMessage message) {
try {
log.debug("Received integration message: {} for system: {}",
message.getType(), message.getTargetSystem());
// Process based on target system
switch (message.getTargetSystem()) {
case "CRM":
processCRMMessage(message);
break;
case "PAYMENT_GATEWAY":
processPaymentGatewayMessage(message);
break;
case "CORE_BANKING":
processCoreBankingMessage(message);
break;
case "ANALYTICS":
processAnalyticsMessage(message);
break;
default:
log.warn("Unknown target system: {}", message.getTargetSystem());
}
metrics.recordMessageProcessed(message.getType());
} catch (Exception e) {
log.error("Failed to process integration message", e);
throw new MessageProcessingException("Message processing failed", e);
}
}
private void processCRMMessage(IntegrationMessage message) {
CRMNotification notification = (CRMNotification) message.getPayload();
try {
switch (notification.getAction()) {
case "CLIENT_CREATED":
externalSystemService.syncClientToCRM(notification.getClientData());
break;
case "LOAN_DISBURSED":
externalSystemService.updateLoanInCRM(notification.getLoanData());
break;
case "PAYMENT_RECEIVED":
externalSystemService.recordPaymentInCRM(notification.getPaymentData());
break;
}
log.info("Successfully processed CRM notification: {}", notification.getAction());
} catch (ExternalSystemException e) {
log.error("Failed to sync with CRM system", e);
// Implement retry logic or route to the dead letter queue
handleExternalSystemFailure(message, e);
}
}
private void processPaymentGatewayMessage(IntegrationMessage message) {
PaymentGatewayNotification notification = (PaymentGatewayNotification) message.getPayload();
// Process payment gateway notifications
try {
externalSystemService.processPaymentGatewayNotification(notification);
metrics.recordPaymentProcessed(notification.getAmount());
} catch (Exception e) {
log.error("Failed to process payment gateway message", e);
}
}
private void handleExternalSystemFailure(IntegrationMessage message, ExternalSystemException e) {
// Retry logic
if (message.getRetryCount() < 3) {
message.setRetryCount(message.getRetryCount() + 1);
message.setNextRetryAt(Instant.now().plusSeconds((long) Math.pow(2, message.getRetryCount())));
// Schedule retry
retryScheduler.scheduleRetry(message);
} else {
// Move to dead letter queue
deadLetterQueue.add(message);
log.error("Message moved to dead letter queue after 3 retries: {}", message);
}
}
}
5. API Rate Limiting and Quotas
Rate Limiting Implementation
@Component
public class ApiRateLimitService {
private final Map<String, RateLimitTracker> userRateLimits = new ConcurrentHashMap<>();
private final Map<String, RateLimitTracker> apiKeyRateLimits = new ConcurrentHashMap<>();
private final Map<String, RateLimitTracker> tenantRateLimits = new ConcurrentHashMap<>();
private static final int DEFAULT_REQUESTS_PER_MINUTE = 1000;
private static final int DEFAULT_REQUESTS_PER_HOUR = 10000;
private static final int DEFAULT_REQUESTS_PER_DAY = 100000;
public RateLimitResult checkRateLimit(String userId, String apiKey, String tenantId,
String endpoint) {
RateLimitTracker userTracker = getOrCreateTracker(userRateLimits, userId);
RateLimitTracker apiKeyTracker = getOrCreateTracker(apiKeyRateLimits, apiKey);
RateLimitTracker tenantTracker = getOrCreateTracker(tenantRateLimits, tenantId);
// Check multiple rate limits
RateLimitResult userResult = checkTracker(userTracker, endpoint);
RateLimitResult apiKeyResult = checkTracker(apiKeyTracker, endpoint);
RateLimitResult tenantResult = checkTracker(tenantTracker, endpoint);
// Return most restrictive result
return Arrays.asList(userResult, apiKeyResult, tenantResult).stream()
.filter(result -> !result.isAllowed())
.findFirst()
.orElse(RateLimitResult.allowed());
}
private RateLimitTracker getOrCreateTracker(Map<String, RateLimitTracker> trackerMap, String key) {
return trackerMap.computeIfAbsent(key, k -> {
RateLimitConfig config = getRateLimitConfig(key);
return new RateLimitTracker(config);
});
}
private RateLimitResult checkTracker(RateLimitTracker tracker, String endpoint) {
RateLimitResult result = tracker.checkLimit();
if (!result.isAllowed()) {
log.warn("Rate limit exceeded for endpoint: {} - {}", endpoint, result.getMessage());
}
return result;
}
private RateLimitConfig getRateLimitConfig(String identifier) {
// Get rate limit configuration from the database or a configuration service
return RateLimitConfig.builder()
.requestsPerMinute(getConfiguredLimit(identifier, "per_minute", DEFAULT_REQUESTS_PER_MINUTE))
.requestsPerHour(getConfiguredLimit(identifier, "per_hour", DEFAULT_REQUESTS_PER_HOUR))
.requestsPerDay(getConfiguredLimit(identifier, "per_day", DEFAULT_REQUESTS_PER_DAY))
.build();
}
}
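One way to apply the rate-limit check is in a Spring HandlerInterceptor that runs before the integration controllers. A sketch, assuming the API key and tenant are carried in the headers used elsewhere in this document; the X-User-ID header for per-user limits is an assumption.
@Component
public class RateLimitInterceptor implements HandlerInterceptor {

    @Autowired
    private ApiRateLimitService rateLimitService;

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response,
                             Object handler) throws Exception {
        String apiKey = request.getHeader("X-Api-Key");
        String tenantId = request.getHeader("X-Tenant-ID");
        String userId = request.getHeader("X-User-ID"); // assumed header for per-user limits

        RateLimitResult result = rateLimitService.checkRateLimit(
            userId, apiKey, tenantId, request.getRequestURI());

        if (!result.isAllowed()) {
            response.setStatus(HttpStatus.TOO_MANY_REQUESTS.value());
            response.getWriter().write(result.getMessage());
            return false; // stop processing the request
        }
        return true;
    }
}
The interceptor would then be registered for the /api/v1/integration/** paths through a WebMvcConfigurer's addInterceptors method.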
public class RateLimitTracker {
private final RateLimitConfig config;
private final Map<TimeWindow, Counter> counters = new ConcurrentHashMap<>();
private final Instant windowStart = Instant.now();
public RateLimitTracker(RateLimitConfig config) {
this.config = config;
initializeCounters();
}
public RateLimitResult checkLimit() {
// Check current minute
if (checkWindow(TimeWindow.MINUTE, config.getRequestsPerMinute())) {
return RateLimitResult.rejected("Rate limit exceeded: too many requests per minute");
}
// Check current hour
if (checkWindow(TimeWindow.HOUR, config.getRequestsPerHour())) {
return RateLimitResult.rejected("Rate limit exceeded: too many requests per hour");
}
// Check current day
if (checkWindow(TimeWindow.DAY, config.getRequestsPerDay())) {
return RateLimitResult.rejected("Rate limit exceeded: too many requests per day");
}
// Increment counters
incrementCounters();
return RateLimitResult.allowed();
}
private boolean checkWindow(TimeWindow window, int limit) {
Counter counter = counters.get(window);
return counter.getCount() >= limit;
}
private void incrementCounters() {
counters.values().forEach(Counter::increment);
}
private void initializeCounters() {
counters.put(TimeWindow.MINUTE, new Counter());
counters.put(TimeWindow.HOUR, new Counter());
counters.put(TimeWindow.DAY, new Counter());
}
}
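As written, the counters above are never reset when a minute, hour, or day rolls over, so a tracker would stay exhausted after its first full window. A window-aware counter sketch that the tracker could hold per TimeWindow instead of the plain Counter; the mapping of TimeWindow values to durations is an assumption.
public class WindowedCounter {

    private final Duration windowLength;
    private volatile Instant windowStart = Instant.now();
    private final AtomicLong count = new AtomicLong();

    public WindowedCounter(Duration windowLength) {
        this.windowLength = windowLength;
    }

    // Resets the count whenever the current window has elapsed
    private void rollWindowIfNeeded() {
        Instant now = Instant.now();
        if (Duration.between(windowStart, now).compareTo(windowLength) >= 0) {
            synchronized (this) {
                if (Duration.between(windowStart, now).compareTo(windowLength) >= 0) {
                    windowStart = now;
                    count.set(0);
                }
            }
        }
    }

    public long getCount() {
        rollWindowIfNeeded();
        return count.get();
    }

    public void increment() {
        rollWindowIfNeeded();
        count.incrementAndGet();
    }
}
RateLimitTracker could then keep one WindowedCounter per window (Duration.ofMinutes(1), Duration.ofHours(1), Duration.ofDays(1)) and reuse the same checkWindow/incrementCounters logic.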
API Quota Management
@Service
public class ApiQuotaService {
@Autowired
private QuotaRepository quotaRepository;
@Autowired
private UsageTracker usageTracker;
// Used by the quota alerting below
@Autowired
private NotificationService notificationService;
/**
* Check API quota before processing request
*/
public QuotaResult checkQuota(String apiKey, String tenantId, String quotaType) {
QuotaConfig quota = getQuotaConfig(apiKey, tenantId, quotaType);
UsageStats usage = usageTracker.getCurrentUsage(apiKey, tenantId, quotaType);
if (usage.isQuotaExceeded()) {
return QuotaResult.exceeded(quota, usage);
}
return QuotaResult.withinLimits(quota, usage);
}
/**
* Record API usage
*/
public void recordUsage(String apiKey, String tenantId, String quotaType,
int requestCount, BigDecimal dataVolume) {
UsageStats usage = UsageStats.builder()
.apiKey(apiKey)
.tenantId(tenantId)
.quotaType(quotaType)
.requestCount(requestCount)
.dataVolume(dataVolume)
.periodStart(getCurrentPeriodStart(quotaType))
.periodEnd(getCurrentPeriodEnd(quotaType))
.build();
usageTracker.recordUsage(usage);
// Update quotas in database
updateQuotaUsage(apiKey, tenantId, quotaType, requestCount, dataVolume);
}
/**
* Get quota usage report
*/
public QuotaUsageReport getQuotaUsageReport(String apiKey, String tenantId,
String period) {
List<UsageStats> usageHistory = usageTracker.getUsageHistory(
apiKey, tenantId, period);
QuotaConfig currentQuota = getCurrentQuota(apiKey, tenantId);
return QuotaUsageReport.builder()
.apiKey(apiKey)
.tenantId(tenantId)
.period(period)
.quotaConfig(currentQuota)
.usageHistory(usageHistory)
.currentUsage(calculateCurrentUsage(usageHistory))
.projectedUsage(calculateProjectedUsage(usageHistory))
.recommendations(generateRecommendations(currentQuota, usageHistory))
.build();
}
/**
* Alert when approaching quota limits
*/
@Scheduled(fixedRate = 300000) // Every 5 minutes
public void checkQuotaAlerts() {
List<UsageStats> approachingLimits = usageTracker.getApproachingQuotaLimits();
for (UsageStats usage : approachingLimits) {
if (usage.getUsagePercentage() >= 80) {
sendQuotaAlert(usage);
}
}
}
private void sendQuotaAlert(UsageStats usage) {
QuotaAlert alert = QuotaAlert.builder()
.apiKey(usage.getApiKey())
.tenantId(usage.getTenantId())
.quotaType(usage.getQuotaType())
.usagePercentage(usage.getUsagePercentage())
.requestCount(usage.getRequestCount())
.dataVolume(usage.getDataVolume())
.period(usage.getPeriod())
.build();
notificationService.sendQuotaAlert(alert);
}
}
@Entity
@Table(name = "api_quota_config")
public class ApiQuotaConfig {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@Column(name = "api_key", nullable = false)
private String apiKey;
@Column(name = "tenant_id", nullable = false)
private String tenantId;
@Column(name = "quota_type", nullable = false)
private String quotaType;
@Column(name = "monthly_requests_limit")
private Integer monthlyRequestsLimit;
@Column(name = "monthly_data_volume_limit")
private BigDecimal monthlyDataVolumeLimit;
@Column(name = "concurrent_requests_limit")
private Integer concurrentRequestsLimit;
@Column(name = "rate_limit_per_minute")
private Integer rateLimitPerMinute;
@Column(name = "is_active", nullable = false)
private boolean active = true;
@Column(name = "created_at", nullable = false)
private LocalDateTime createdAt;
@Column(name = "updated_at", nullable = false)
private LocalDateTime updatedAt;
}
6. API Documentation and Testing
OpenAPI/Swagger Integration
@Configuration
@OpenAPIDefinition(
info = @Info(
title = "Apache Fineract Integration API",
version = "1.0",
description = "API untuk integrasi dengan sistem eksternal",
contact = @Contact(
name = "Fineract Integration Team",
email = "integration@fineract.org"
)
),
servers = {
@Server(url = "https://api.fineract.com", description = "Production server"),
@Server(url = "https://staging-api.fineract.com", description = "Staging server"),
@Server(url = "https://dev-api.fineract.com", description = "Development server")
},
security = {
@SecurityRequirement(name = "ApiKeyAuth"),
@SecurityRequirement(name = "BearerAuth")
}
)
@SecurityScheme(
name = "ApiKeyAuth",
type = SecuritySchemeType.APIKEY,
in = SecuritySchemeIn.HEADER,
paramName = "X-API-Key"
)
@SecurityScheme(
name = "BearerAuth",
type = SecuritySchemeType.HTTP,
scheme = "bearer",
bearerFormat = "JWT"
)
public class OpenApiConfig {
@Bean
public OpenAPI customOpenAPI() {
return new OpenAPI()
.components(new Components()
.addSecuritySchemes("ApiKeyAuth",
new SecurityScheme()
.type(SecurityScheme.Type.APIKEY)
.in(SecurityScheme.In.HEADER)
.name("X-API-Key")
.description("API key used for authentication"))
.addSecuritySchemes("BearerAuth",
new SecurityScheme()
.type(SecurityScheme.Type.HTTP)
.scheme("bearer")
.bearerFormat("JWT")
.description("JWT Bearer token authentication")))
.addSecurityItem(new SecurityRequirement().addList("ApiKeyAuth"))
.addSecurityItem(new SecurityRequirement().addList("BearerAuth"));
}
}
@RestController
@RequestMapping("/api/v1/integration")
@Api(value = "Integration API", tags = {"Integration"})
@Validated
public class IntegrationApiDocumentation {
@PostMapping("/clients")
@ApiOperation(
value = "Create Client",
notes = "Creates a new client in the Fineract system",
response = ClientData.class
)
@ApiResponses(value = {
@ApiResponse(code = 201, message = "Client created successfully"),
@ApiResponse(code = 400, message = "Invalid client data"),
@ApiResponse(code = 401, message = "Invalid API key"),
@ApiResponse(code = 409, message = "A client with that identifier already exists")
})
public ResponseEntity<ApiResponse<ClientData>> createClient(
@ApiParam(value = "Client data to create", required = true)
@Valid @RequestBody CreateClientRequest request,
@ApiParam(value = "Tenant ID", required = true)
@RequestHeader("X-Tenant-ID") String tenantId) {
// Implementation
return ResponseEntity.ok(ApiResponse.success(clientData));
}
@GetMapping("/clients/{clientId}")
@ApiOperation(
value = "Get Client Details",
notes = "Retrieves client details by ID",
response = ClientData.class
)
@ApiResponses(value = {
@ApiResponse(code = 200, message = "Client details retrieved successfully"),
@ApiResponse(code = 404, message = "Client not found"),
@ApiResponse(code = 401, message = "Invalid API key")
})
public ResponseEntity<ApiResponse<ClientData>> getClient(
@ApiParam(value = "Client ID", required = true)
@PathVariable Long clientId,
@ApiParam(value = "Tenant ID", required = true)
@RequestHeader("X-Tenant-ID") String tenantId) {
// Implementation
return ResponseEntity.ok(ApiResponse.success(clientData));
}
@PostMapping("/loans")
@ApiOperation(
value = "Create Loan Application",
notes = "Creates a new loan application",
response = LoanData.class
)
@ApiResponses(value = {
@ApiResponse(code = 201, message = "Loan application created successfully"),
@ApiResponse(code = 400, message = "Invalid application data"),
@ApiResponse(code = 404, message = "Client or loan product not found"),
@ApiResponse(code = 409, message = "Loan application already exists")
})
public ResponseEntity<ApiResponse<LoanData>> createLoan(
@ApiParam(value = "Loan application data", required = true)
@Valid @RequestBody CreateLoanRequest request,
@ApiParam(value = "Tenant ID", required = true)
@RequestHeader("X-Tenant-ID") String tenantId) {
// Implementation
return ResponseEntity.ok(ApiResponse.success(loanData));
}
}
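The controller above uses Swagger 2 (Springfox) annotations, while the OpenApiConfig earlier targets OpenAPI 3. If the project standardizes on springdoc-openapi, the equivalent annotations would look roughly like the sketch below; note that the annotation-level @ApiResponse from io.swagger.v3.oas.annotations.responses would clash with this document's ApiResponse DTO and therefore needs a fully qualified name or import alias in real code.
@Operation(summary = "Create Client", description = "Creates a new client in the Fineract system")
@ApiResponses(value = {
    @ApiResponse(responseCode = "201", description = "Client created successfully"),
    @ApiResponse(responseCode = "400", description = "Invalid client data"),
    @ApiResponse(responseCode = "401", description = "Invalid API key"),
    @ApiResponse(responseCode = "409", description = "A client with that identifier already exists")
})
@PostMapping("/clients")
public ResponseEntity<ApiResponse<ClientData>> createClient(
    @Parameter(description = "Client data to create", required = true)
    @Valid @RequestBody CreateClientRequest request,
    @Parameter(description = "Tenant ID", required = true)
    @RequestHeader("X-Tenant-ID") String tenantId) {
    // clientData produced by the service layer, as in the original stub
    return ResponseEntity.status(HttpStatus.CREATED).body(ApiResponse.success(clientData));
}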
API Testing Framework
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@TestMethodOrder(OrderAnnotation.class)
public class IntegrationApiTest {
@Autowired
private TestRestTemplate restTemplate;
@Autowired
private IntegrationConfig config;
private static final String API_KEY = "test-api-key";
private static final String TENANT_ID = "test-tenant";
private static Long testClientId;
@Test
@Order(1)
@DisplayName("Test Client Creation API")
public void testCreateClient() {
CreateClientRequest request = CreateClientRequest.builder()
.firstName("John")
.lastName("Doe")
.email("john.doe@example.com")
.mobileNumber("+1234567890")
.build();
HttpHeaders headers = createHeaders();
HttpEntity<CreateClientRequest> entity = new HttpEntity<>(request, headers);
ResponseEntity<ApiResponse<ClientData>> response = restTemplate.exchange(
"/api/v1/integration/clients", HttpMethod.POST, entity,
new ParameterizedTypeReference<ApiResponse<ClientData>>() {});
assertThat(response.getStatusCode()).isEqualTo(HttpStatus.CREATED);
assertThat(response.getBody()).isNotNull();
assertThat(response.getBody().isSuccess()).isTrue();
assertThat(response.getBody().getData()).isNotNull();
assertThat(response.getBody().getData().getFirstName()).isEqualTo("John");
testClientId = response.getBody().getData().getId();
}
@Test
@Order(2)
@DisplayName("Test Get Client API")
public void testGetClient() {
HttpHeaders headers = createHeaders();
HttpEntity<Void> entity = new HttpEntity<>(headers);
ResponseEntity<ApiResponse<ClientData>> response = restTemplate.exchange(
"/api/v1/integration/clients/{clientId}", HttpMethod.GET, entity,
new ParameterizedTypeReference<ApiResponse<ClientData>>() {},
testClientId);
assertThat(response.getStatusCode()).isEqualTo(HttpStatus.OK);
assertThat(response.getBody()).isNotNull();
assertThat(response.getBody().isSuccess()).isTrue();
assertThat(response.getBody().getData()).isNotNull();
}
@Test
@DisplayName("Test Rate Limiting")
public void testRateLimiting() {
HttpHeaders headers = createHeaders();
HttpEntity<Void> entity = new HttpEntity<>(headers);
// Make multiple requests to trigger rate limiting
List<ResponseEntity<ApiResponse>> responses = new ArrayList<>();
for (int i = 0; i < 1010; i++) { // More than default limit
responses.add(restTemplate.exchange(
"/api/v1/integration/clients/1", HttpMethod.GET, entity,
new ParameterizedTypeReference<ApiResponse>() {}));
}
// Last requests should be rate limited
assertThat(responses.get(1009).getStatusCode()).isEqualTo(HttpStatus.TOO_MANY_REQUESTS);
}
@Test
@DisplayName("Test Batch Processing")
public void testBatchProcessing() {
List<CreateClientRequest> clients = new ArrayList<>();
for (int i = 0; i < 10; i++) {
clients.add(CreateClientRequest.builder()
.firstName("Test" + i)
.lastName("Client" + i)
.email("test" + i + "@example.com")
.mobileNumber("+123456789" + i)
.build());
}
BatchIntegrationRequest batchRequest = BatchIntegrationRequest.builder()
.resourceType("CLIENT")
.items(clients)
.tenantId(TENANT_ID)
.build();
HttpHeaders headers = createHeaders();
HttpEntity<BatchIntegrationRequest> entity = new HttpEntity<>(batchRequest, headers);
ResponseEntity<BatchApiResponse> response = restTemplate.postForEntity(
"/api/v1/integration/batch", entity, BatchApiResponse.class);
assertThat(response.getStatusCode()).isEqualTo(HttpStatus.OK);
assertThat(response.getBody()).isNotNull();
assertThat(response.getBody().getTotalProcessed()).isEqualTo(10);
}
@Test
@DisplayName("Test API Documentation Endpoints")
public void testApiDocumentation() {
// Test OpenAPI JSON
ResponseEntity<String> jsonResponse = restTemplate.getForEntity(
"/api/v1/integration/v3/api-docs", String.class);
assertThat(jsonResponse.getStatusCode()).isEqualTo(HttpStatus.OK);
assertThat(jsonResponse.getBody()).contains("openapi");
// Test Swagger UI
ResponseEntity<String> uiResponse = restTemplate.getForEntity(
"/api/v1/integration/swagger-ui/", String.class);
assertThat(uiResponse.getStatusCode()).isEqualTo(HttpStatus.OK);
assertThat(uiResponse.getBody()).contains("swagger");
}
private HttpHeaders createHeaders() {
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.APPLICATION_JSON);
headers.set("X-API-Key", API_KEY);
headers.set("X-Tenant-ID", TENANT_ID);
return headers;
}
}
Conclusion
The Apache Fineract API integration architecture provides a comprehensive foundation for a wide range of integration needs:
Strengths of the Integration Architecture:
- Multi-Protocol Support: REST APIs, WebSocket, and batch processing
- Flexible Data Transformation: Support for multiple data formats and mappings
- Robust Error Handling: Comprehensive error handling with retry mechanisms
- Rate Limiting & Quotas: Advanced rate limiting and quota management
- Real-time Communication: WebSocket support for real-time updates
- Event-Driven Architecture: Message queue integration for loosely coupled systems
Integration Patterns:
- RESTful Integration: Standard HTTP-based API integration
- Real-time Integration: WebSocket connections for live updates
- Batch Integration: Bulk data import/export operations
- Event-Driven Integration: Message-queue-based integration
- File-Based Integration: CSV, Excel, and JSON file processing
- Webhook Integration: Push notifications to external systems
Security Features:
- API Key Authentication: Secure API key management
- Tenant Isolation: Multi-tenant access control
- Rate Limiting: Prevent API abuse
- Request Validation: Comprehensive input validation
- Audit Logging: Complete integration audit trail
Performance Optimizations:
- Connection Pooling: Efficient HTTP connection management
- Caching: Response caching for frequent requests
- Batch Processing: Optimized bulk operations
- Async Processing: Non-blocking integration operations
- Queue-based Processing: Decoupled message processing
Monitoring & Observability:
- Integration Metrics: Track integration performance
- Error Tracking: Comprehensive error logging
- Usage Analytics: API usage statistics
- Health Checks: Integration system monitoring
- Alerting: Real-time failure notifications
This architecture ensures that Apache Fineract can integrate with a wide range of external systems in a reliable, secure, and performant way.
This documentation describes the API integration implementation in detail. Specific configurations can be adjusted to the integration requirements and the external systems involved.