Search results for: calibration data requirements
26976 Development and Validation of a Carbon Dioxide TDLAS Sensor for Studies on Fermented Dairy Products
Authors: Lorenzo Cocola, Massimo Fedel, Dragiša Savić, Bojana Danilović, Luca Poletto
Abstract:
An instrument for the detection and evaluation of gaseous carbon dioxide in the headspace of closed containers has been developed in the context of the Packsensor Italian-Serbian joint project. The device is based on Tunable Diode Laser Absorption Spectroscopy (TDLAS) with a Wavelength Modulation Spectroscopy (WMS) technique to accomplish a non-invasive measurement inside closed containers of fermented dairy products (yogurts and fermented cheese in cups and bottles). The purpose of this instrument is the continuous monitoring of carbon dioxide concentration during incubation and storage over the whole shelf life of the product, in the presence of different microorganisms. The instrument's optical front end has been designed to be integrated into a thermally stabilized incubator. An embedded computer provides processing of spectral artifacts and storage of an arbitrary set of calibration data, allowing properly calibrated measurements on many samples (cups and bottles) of the different shapes and sizes commonly found in retail distribution. A calibration protocol has been developed to allow the instrument to be calibrated in the field, including on containers that are notoriously difficult to seal properly. This calibration protocol is described and evaluated against reference measurements obtained through an industry-standard (sampling) carbon dioxide metering technique. Several sets of validation test measurements on different containers are reported. Two test recordings of carbon dioxide concentration evolution are shown as examples of instrument operation. The first demonstrates the ability to monitor rapid yeast growth in a contaminated sample through the increase of headspace carbon dioxide. The other shows the dissolution transient of a non-saturated liquid medium in the presence of a carbon-dioxide-rich headspace atmosphere.
Keywords: TDLAS, carbon dioxide, cups, headspace, measurement
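As a rough illustration of the WMS principle mentioned in this abstract, the numpy sketch below modulates a laser detuning across a Lorentzian absorption line and lock-in demodulates the detector signal at twice the modulation frequency. All waveform parameters, the line shape, and the 5% peak absorbance are assumptions for illustration, not values from the Packsensor instrument.

```python
import numpy as np

# Minimal WMS-2f sketch: the laser wavelength is modulated at f_mod; absorption
# converts this into intensity harmonics, and the 2f component tracks line depth.
fs, f_mod, duration = 100_000, 1_000, 0.1          # Hz, Hz, s (assumed values)
t = np.arange(0, duration, 1 / fs)
mod_depth = 0.6                                    # normalized detuning units
nu = mod_depth * np.sin(2 * np.pi * f_mod * t)     # instantaneous detuning

absorbance = 0.05 / (1.0 + nu**2)                  # Lorentzian line, ~5% peak
detector = np.exp(-absorbance)                     # Beer-Lambert transmission

# Lock-in demodulation at 2f: multiply by a reference and low-pass by averaging.
ref_2f = np.cos(2 * np.pi * 2 * f_mod * t)
wms_2f = 2 * np.mean(detector * ref_2f)            # 2f Fourier coefficient
print(f"WMS-2f amplitude: {wms_2f:.5f}")           # scales with gas concentration
```

In a real instrument, the 2f amplitude would then be mapped to concentration through stored calibration data such as the container-specific sets described above.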
Procedia PDF Downloads 324
26975 Cross-border Data Transfers to and from South Africa
Authors: Amy Gooden, Meshandren Naidoo
Abstract:
Genetic research and transfers of big data are not confined to a particular jurisdiction, but there is a lack of clarity regarding the legal requirements for importing and exporting such data. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA laws cover the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from which it is sent, making the legal position less clear.
Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa
Procedia PDF Downloads 125
26974 Calibration of Contact Model Parameters and Analysis of Microscopic Behaviors of Cuxhaven Sand Using the Discrete Element Method
Authors: Anjali Uday, Yuting Wang, Andres Alfonso Pena Olare
Abstract:
The Discrete Element Method is a promising approach to modeling the microscopic behaviors of granular materials. The quality of the simulations, however, depends on the model parameters utilized. The present study focuses on the calibration and validation of the discrete element parameters for Cuxhaven sand based on experimental data from triaxial and oedometer tests. A sensitivity analysis was conducted during the sample preparation stage and the shear stage of the triaxial tests. The influence of parameters such as rolling resistance, inter-particle friction coefficient, confining pressure, and effective modulus on the void ratio of the generated sample was investigated. During the shear stage, the effects of parameters such as inter-particle friction coefficient, effective modulus, rolling resistance friction coefficient, and normal-to-shear stiffness ratio were examined. The parameters were calibrated such that the simulations reproduce macromechanical characteristics such as dilation angle, peak stress, and stiffness. The calibrated parameters were then validated by simulating an oedometer test on the sand. The oedometer test results are in good agreement with experiments, which confirms the suitability of the calibrated parameters. In the next step, the calibrated and validated model parameters were applied to forecast the micromechanical behavior, including the evolution of contact force chains, buckling of columns of particles, non-coaxiality, and sample inhomogeneity, during a simple shear test. The evolution of contact force chains vividly shows the distribution and alignment of strong contact forces. The changes in coordination number are in good agreement with the volumetric strain exhibited during the simple shear test. The vertical inhomogeneity of void ratios is documented throughout the shearing phase, showing looser structures in the top and bottom layers. Buckling of columns is not observed due to the small rolling resistance coefficient adopted for the simulations. The non-coaxiality of principal stress and strain rate is also well captured. Thus, the micromechanical behaviors are well described using the calibrated and validated material parameters.
Keywords: discrete element model, parameter calibration, triaxial test, oedometer test, simple shear test
Procedia PDF Downloads 120
26973 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary. B. Wills, Andy. M. Gravell
Abstract:
Communicating and managing customers' requirements in software development projects plays a vital role in the software development process. While this is difficult to do locally, it is even more difficult to communicate requirements across distributed boundaries and convey them to multiple distributed customers. This paper discusses the communication of multiple distributed customers' requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). Thereafter, we evaluate that model by presenting the findings of a case study conducted with a company running customisation projects for 18 distributed customers. Then, we compare the outputs of the real case process with the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenge of communicating requirements across distributed organisational boundaries, as well as the delays in decision making and in the overall customisation process time.
Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 429
26972 Simultaneous Determination of Six Characterizing/Quality Parameters of Biodiesels via 1H NMR and Multivariate Calibration
Authors: Gustavo G. Shimamoto, Matthieu Tubino
Abstract:
The characterization and quality of biodiesel samples are checked by determining several parameters. Considering the large number of analyses to be performed, as well as the disadvantages of the use of toxic solvents and waste generation, multivariate calibration is suggested to reduce the number of tests. In this work, hydrogen nuclear magnetic resonance (1H NMR) spectra were used to build multivariate models, from partial least squares (PLS) regression, to simultaneously determine six important characterizing and/or quality parameters of biodiesels: density at 20 ºC, kinematic viscosity at 40 ºC, iodine value, acid number, oxidative stability, and water content. Biodiesels from twelve different oil sources were used in this study: babassu, brown flaxseed, canola, corn, cottonseed, macauba almond, microalgae, palm kernel, residual frying, sesame, soybean, and sunflower. 1H NMR reflects the structures of the compounds present in biodiesel samples and showed suitable correlations with the six parameters. The PLS models were constructed with between 5 and 7 latent variables, and the obtained values of r(cal) and r(val) were greater than 0.994 and 0.989, respectively. In addition, the models were considered suitable to predict all six parameters for external samples, taking into account the analytical speed of the procedure. Thus, the alliance between 1H NMR and PLS proved appropriate for characterizing and evaluating the quality of biodiesels, significantly reducing analysis time, the consumption of reagents/solvents, and waste generation. Therefore, the proposed methods can be considered to adhere to the principles of green chemistry.
Keywords: biodiesel, multivariate calibration, nuclear magnetic resonance, quality parameters
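A minimal sketch of this kind of spectra-to-parameter PLS calibration, using scikit-learn on synthetic placeholder data (the spectra, the 6-component model, and the single response are assumptions; the paper builds one model per quality parameter):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from scipy.stats import pearsonr

# Placeholder data: rows are biodiesel samples, columns are 1H NMR intensities.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))                 # 60 samples x 500 spectral points
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)  # e.g., viscosity

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=6)            # 5-7 latent variables, as reported
pls.fit(X_cal, y_cal)

r_cal = pearsonr(y_cal, pls.predict(X_cal).ravel())[0]
r_val = pearsonr(y_val, pls.predict(X_val).ravel())[0]
print(f"r(cal) = {r_cal:.3f}, r(val) = {r_val:.3f}")
```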
Procedia PDF Downloads 539
26971 Estimation of Ribb Dam Catchment Sediment Yield and Reservoir Effective Life Using Soil and Water Assessment Tool Model and Empirical Methods
Authors: Getalem E. Haylia
Abstract:
The Ribb dam is one of the irrigation projects in the Upper Blue Nile basin, Ethiopia, intended to irrigate the Fogera plain. Reservoir sedimentation is a major problem because it reduces the useful reservoir capacity through the accumulation of sediments coming from the watersheds. Estimates of sediment yield are needed for studies of reservoir sedimentation and for planning soil and water conservation measures. The objective of this study was to simulate the Ribb dam catchment sediment yield using the SWAT model and to estimate the Ribb reservoir effective life according to trap efficiency methods. The Ribb dam catchment is found in the north-western Ethiopian highlands and belongs to the Upper Blue Nile and Lake Tana basins. The Soil and Water Assessment Tool (SWAT) was selected to simulate flow and sediment yield in the Ribb dam catchment. The model sensitivity, calibration, and validation analyses at the Ambo Bahir site were performed with Sequential Uncertainty Fitting (SUFI-2). The flow data at this site were obtained by transforming the Lower Ribb gauge station (2002-2013) flow data using the Area Ratio Method. The sediment load was derived from the sediment concentration yield curve of the Ambo site. Streamflow results showed that the Nash-Sutcliffe efficiency coefficient (NSE) was 0.81 and the coefficient of determination (R²) was 0.86 in the calibration period (2004-2010), and 0.74 and 0.77, respectively, in the validation period (2011-2013). Using the same periods, the NSE and R² for the sediment load were 0.85 and 0.79 in calibration and 0.83 and 0.78 in validation, respectively. The simulated average daily flow rate and sediment yield generated from the Ribb dam watershed were 3.38 m³/s and 1772.96 tons/km²/yr, respectively. The effective life of the Ribb reservoir was estimated using the empirical methods of Brune (1953), Churchill (1948), and Brown (1958) and found to be 30, 38, and 29 years, respectively. To conclude, massive sediment comes from the steep-slope agricultural areas, and approximately 98-100% of the incoming annual sediment load is trapped by the Ribb reservoir. In the Ribb catchment and reservoir, systematic and thorough consideration of technical, social, environmental, and catchment management practices should be made to lengthen the useful life of the Ribb reservoir.
Keywords: catchment, reservoir effective life, reservoir sedimentation, Ribb, sediment yield, SWAT model
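The two goodness-of-fit statistics quoted above are straightforward to compute. A minimal sketch with placeholder flow values (not the study's data):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    r = np.corrcoef(obs, sim)[0, 1]
    return r ** 2

# Placeholder daily flows (m^3/s) for a short calibration window.
observed = np.array([2.1, 3.4, 5.0, 4.2, 3.1, 2.8])
simulated = np.array([2.3, 3.1, 4.7, 4.5, 3.0, 2.6])
print(f"NSE = {nash_sutcliffe(observed, simulated):.2f}, "
      f"R^2 = {r_squared(observed, simulated):.2f}")
```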
Procedia PDF Downloads 187
26970 A Hybrid MAC Protocol for Delay Constrained Mobile Wireless Sensor Networks
Authors: Hanefi Cinar, Musa Cibuk, Ismail Erturk, Fikri Aggun, Munip Geylani
Abstract:
Mobile Wireless Sensor Networks (MWSNs) carry heterogeneous data traffic with different urgency and quality of service (QoS) requirements. Many studies in the literature address energy efficiency, bandwidth, and communication methods, but delay, high throughput, and utility parameters are not well considered. The increasing demand for real-time data transfer makes these parameters more important. In this paper, we design a new delay-constrained MAC protocol that targets improving the delay, utility, and throughput performance of the network and finding solutions to collision and interference problems. The protocol improves QoS by using TDMA, FDM, and OFDMA hybrid communication methods with multi-channel communication.
Keywords: MWSN, delay, hybrid MAC, TDMA, FDM, OFDMA
Procedia PDF Downloads 480
26969 Extended Kalman Filter and Markov Chain Monte Carlo Method for Uncertainty Estimation: Application to X-Ray Fluorescence Machine Calibration and Metal Testing
Authors: S. Bouhouche, R. Drai, J. Bast
Abstract:
This paper is concerned with a method for uncertainty evaluation of steel sample content using the X-ray fluorescence method. The considered method of analysis is a comparative technique based on X-ray fluorescence; the calibration step assumes adequate knowledge of the chemical composition of the analyzed metallic sample. This work proposes a new combined approach using the Kalman filter and Markov chain Monte Carlo (MCMC) for uncertainty estimation of steel content analysis. The Kalman filter algorithm is extended to the model identification of the chemical analysis process using the main factors affecting the analysis results; in this case, the estimated states are reduced to the model parameters. MCMC is a stochastic method that computes the statistical properties of the considered states, such as the probability distribution function (PDF), according to the initial state and the target distribution, using a Monte Carlo simulation algorithm. The conventional approach is based on linear correlation; the uncertainty budget is established for steel Mn (wt%), Cr (wt%), Ni (wt%), and Mo (wt%) content, respectively. A comparative study between the conventional procedure and the proposed method is given. This kind of approach is applied to construct an accurate computing procedure for uncertainty measurement.
Keywords: Kalman filter, Markov chain Monte Carlo, x-ray fluorescence calibration and testing, steel content measurement, uncertainty measurement
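To illustrate the MCMC half of this approach, the following sketch samples the posterior of a single calibration slope (intensity versus content) with a Metropolis random walk and reports its mean and standard deviation as an uncertainty estimate. The toy linear model, noise level, and proposal width are assumptions, not the authors' process model:

```python
import numpy as np

# Toy calibration: measured XRF intensity ~ a * true_content + noise.
rng = np.random.default_rng(1)
content = np.array([0.5, 1.0, 1.5, 2.0])          # Mn wt% reference samples
intensity = 2.0 * content + rng.normal(scale=0.05, size=4)

def log_post(a):
    # Flat prior; Gaussian likelihood with known noise sigma = 0.05.
    resid = intensity - a * content
    return -0.5 * np.sum((resid / 0.05) ** 2)

# Metropolis random-walk sampling of the slope's posterior distribution.
samples, a = [], 1.0
for _ in range(20_000):
    prop = a + rng.normal(scale=0.02)
    if np.log(rng.uniform()) < log_post(prop) - log_post(a):
        a = prop
    samples.append(a)

samples = np.array(samples[5_000:])               # discard burn-in
print(f"slope = {samples.mean():.3f} +/- {samples.std():.3f} (posterior std)")
```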
Procedia PDF Downloads 283
26968 New Formula for Revenue Recognition Likely to Change the Prescription for Pharma Industry
Authors: Shruti Hajirnis
Abstract:
In May 2014, the FASB issued Accounting Standards Update (ASU) 2014-09, Revenue from Contracts with Customers (Topic 606), and the International Accounting Standards Board (IASB) issued International Financial Reporting Standard (IFRS) 15, Revenue from Contracts with Customers, which will supersede virtually all revenue recognition requirements in IFRS and US GAAP. The FASB and the IASB have essentially achieved convergence with these standards, with only some minor differences such as the collectability threshold, interim disclosure requirements, early application and effective date, impairment loss reversal, and nonpublic entity requirements. This paper discusses the impact of the five-step model prescribed in the new revenue standard on entities operating in the pharma industry. It also outlines considerations for these entities while implementing the new standard.
Keywords: revenue recognition, pharma industry, standard, requirements
Procedia PDF Downloads 444
26967 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO
Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky
Abstract:
The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data, types, and structures. Thus, data processing and preparation turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into a simpler data format that can be used in machine learning techniques while preserving the different logic and relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both the English and French languages were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of S1000D, and the results demonstrated its ability to effectively handle the applicability, requirements, references, and relationships across all files and at different levels.
Keywords: aeronautics, big data, data processing, machine learning, S1000D
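A simplified sketch of the core idea, parsing a folder of XML files and grouping extracted text by applicability, is shown below. The element names (`applic`, `displayText`, `para`) are a simplified stand-in for the much richer S1000D data module schema, and this is not the SALERNO implementation itself:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict
from pathlib import Path

def group_by_applicability(folder):
    """Parse every XML file in `folder` and bucket its text by applicability."""
    groups = defaultdict(list)
    for path in Path(folder).glob("*.xml"):
        root = ET.parse(path).getroot()
        # Hypothetical, simplified element paths for the applicability statement.
        applic = root.findtext(".//applic/displayText", default="ALL")
        text = " ".join(p.text or "" for p in root.iter("para"))
        groups[applic.strip()].append({"file": path.name, "text": text})
    return groups

# Example usage (assuming a folder of data modules):
# for applic, docs in group_by_applicability("manuals/").items():
#     print(applic, len(docs))
```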
Procedia PDF Downloads 156
26966 Optimized Road Lane Detection Through a Combined Canny Edge Detection, Hough Transform, and Scaleable Region Masking Toward Autonomous Driving
Authors: Samane Sharifi Monfared, Lavdie Rada
Abstract:
Nowadays, autonomous vehicles are developing rapidly toward facilitating human car driving. One of the main issues is road lane detection for suitable guidance direction and car accident prevention. This paper aims to improve and optimize road lane detection based on a combination of camera calibration, the Hough transform, and Canny edge detection. The video processing is implemented using the OpenCV library, with the novelty of a scalable region masking. The aim of the study is to introduce automatic road lane detection techniques with minimal manual intervention from the user.
Keywords: Hough transform, Canny edge detection, optimisation, scalable masking, camera calibration, improving the quality of image, image processing, video processing
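A minimal OpenCV sketch of the described pipeline (Canny edges, a trapezoidal region mask that scales with the frame size, then a probabilistic Hough transform). The mask fractions and Hough thresholds are assumed values, not the paper's tuned parameters:

```python
import cv2
import numpy as np

def detect_lanes(frame, mask_height=0.6, mask_width=0.8):
    """Canny + probabilistic Hough with a region mask scaled to the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

    # Scalable trapezoidal mask: fractions of height/width, not fixed pixels.
    h, w = edges.shape
    top_y = int(h * (1 - mask_height))
    half = int(w * mask_width / 2)
    poly = np.array([[(w // 2 - half, h), (w // 2 + half, h),
                      (w // 2 + half // 4, top_y),
                      (w // 2 - half // 4, top_y)]], dtype=np.int32)
    mask = np.zeros_like(edges)
    cv2.fillPoly(mask, poly, 255)

    lines = cv2.HoughLinesP(cv2.bitwise_and(edges, mask), rho=1,
                            theta=np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=100)
    return [] if lines is None else lines[:, 0]

# Example usage on a single frame:
# for x1, y1, x2, y2 in detect_lanes(cv2.imread("road.jpg")):
#     print(x1, y1, x2, y2)
```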
Procedia PDF Downloads 94
26965 Erosion Modeling of Surface Water Systems for Long Term Simulations
Authors: Devika Nair, Sean Bellairs, Ken Evans
Abstract:
Flow and erosion modeling provides an avenue for simulating the fine suspended sediment in surface water systems such as streams and creeks. Fine suspended sediment is highly mobile, and many contaminants that may have been released by any sort of catchment disturbance attach themselves to these sediments. Therefore, knowledge of fine suspended sediment transport is important in assessing contaminant transport. The CAESAR-Lisflood Landform Evolution Model, which includes a hydrologic model (TOPMODEL) and a hydraulic model (Lisflood), is being used to assess sediment movement in tropical streams resulting from a disturbance in the creek's catchment and to determine the dynamics of sediment quantity in the creek through the years by simulating future years. The accuracy of future simulations depends on the calibration and validation of the model against past and present events. Calibration and validation involve finding a combination of model parameters which, when applied and simulated, gives model outputs similar to those observed at the real site for the corresponding input data. Calibrating the sediment output of the CAESAR-Lisflood model at the catchment level and using it to study the equilibrium conditions of the landform is an area yet to be explored. Therefore, the aim of the study was to calibrate the CAESAR-Lisflood model and then validate it so that it could be run for future simulations to study how the landform evolves over time. To achieve this, the model was run for a rainfall event, with a set of parameters plus discharge and sediment data for the input point of the catchment, to analyze how closely the model output matched the discharge and sediment data at the output point of the catchment. The model parameters were then adjusted until the model closely approximated the real site values of the catchment. It was then validated by running the model for a different set of events and checking that the model gave similar results to the real site values. The outcomes demonstrated that while the model can be calibrated to a large extent for hydrology (discharge output) throughout the year, the sediment output calibration could be slightly improved by the ability to change parameters to account for seasonal vegetation growth at the start and end of the wet season. This study is important for assessing hydrology and sediment movement in seasonal biomes. The understanding of sediment-associated metal dispersion processes in rivers can be used in a practical way to help river basin managers more effectively control and remediate catchments affected by present and historical metal mining.
Keywords: erosion modelling, fine suspended sediments, hydrology, surface water systems
Procedia PDF Downloads 84
26964 Ecological Ice Hockey Butterfly Motion Assessment Using Inertial Measurement Unit Capture System
Authors: Y. Zhang, J. Perez, S. Marnier
Abstract:
To date, no study on goaltending butterfly motion has been completed in real conditions, during an ice hockey game or training practice, to the authors' best knowledge. This motion, performed to stop shots, is unnatural, intense, and repeated. The target of this research activity is to identify representative biomechanical criteria for this goaltender-specific movement pattern. Determining specific physical parameters may allow the identification of the risk of hip and groin injuries sustained by goaltenders. Four professional or academic goalies were instrumented during ice hockey training practices with five inertial measurement units. These devices were inserted in dedicated pockets located on each thigh and shank, with the fifth on the lumbar spine. A camera was also installed close to the ice to observe and record the goaltenders' activities, especially the butterfly motions, in order to synchronize the captured data with the behavior of the goaltender. Each recording began with a calibration of the inertial units and a calibration of the fully equipped goaltender on the ice. Three butterfly motions were recorded outside the training practice to define reference individual butterfly motions. Then, a data processing algorithm based on the Madgwick filter computed hip and knee joint ranges of motion as well as specific angular velocities. The developed software automatically identified and analyzed all the butterfly motions executed by the four goaltenders. To date, it is still too early to show that the analyzed criteria are representative of the trauma generated by the butterfly motion, as the research is only at its beginning. However, this descriptive research activity is promising in its ecological assessment, and once the criteria are found, the tools and protocols defined will allow the prevention of as many injuries as possible. It will thus be possible to build a specific training program for each goalie.
Keywords: biomechanics, butterfly motion, human motion analysis, ice hockey, inertial measurement unit
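Once a Madgwick-type filter has produced an orientation quaternion for each segment, a joint angle can be taken from the relative rotation between adjacent segments. A minimal numpy sketch of that step follows; the quaternion values are illustrative, and this is not the authors' processing code:

```python
import numpy as np

def quat_conj(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def joint_angle_deg(q_thigh, q_shank):
    """Angle of the relative rotation between two segment orientations."""
    q_rel = quat_mul(quat_conj(q_thigh), q_shank)
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return np.degrees(2.0 * np.arccos(w))

# Example: shank rotated 90 degrees about x relative to the thigh.
q_thigh = np.array([1.0, 0.0, 0.0, 0.0])
q_shank = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])
print(joint_angle_deg(q_thigh, q_shank))   # ~90.0
```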
Procedia PDF Downloads 125
26963 Going beyond Stakeholder Participation
Authors: Florian Engel
Abstract:
Only with a radical change to an intrinsically motivated project team, achieved by giving employees the freedom of autonomy, mastery, and purpose, is it possible to develop excellent products. With these changes, combined with a rapid application development approach, the group of users serves as an important indicator for testing market needs, rather than only as stakeholders for requirements.
Keywords: intrinsic motivation, requirements elicitation, self-directed work, stakeholder participation
Procedia PDF Downloads 342
26962 Defining a Holistic Approach for Model-Based System Engineering: Paradigm and Modeling Requirements
Authors: Hycham Aboutaleb, Bruno Monsuez
Abstract:
The complexity of current systems has reached a degree that requires addressing conception and design issues while taking into account all the necessary aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost, and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework, and an environment to handle system model complexity. For that, it is necessary to understand the expectations of the human user of the model and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and defines the refined functional as well as non-functional requirements that modeling tools need to meet to be useful in model-based system engineering.
Keywords: system modeling, modeling language, modeling requirements, framework
Procedia PDF Downloads 531
26961 Prompt Design for Code Generation in Data Analysis Using Large Language Models
Authors: Lu Song Ma Li Zhi
Abstract:
With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
Keywords: large language models, prompt design, data analysis, code generation
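A minimal sketch of a prompt template in the spirit of this strategy: describe the analysis requirement precisely, constrain the output to runnable code, and leave room for feedback-driven refinement. The template wording, the file/column names, and the `call_llm` function are all placeholders, not the paper's actual prompts or API:

```python
PROMPT_TEMPLATE = """You are a data-analysis assistant.
Dataset: a CSV file '{path}' with columns {columns}.
Task: {task}
Constraints:
- Return only runnable Python (pandas/matplotlib), no explanations.
- Handle missing values before computing statistics.
"""

def build_prompt(path, columns, task):
    """Fill the template with a precise natural-language analysis requirement."""
    return PROMPT_TEMPLATE.format(path=path, columns=columns, task=task)

prompt = build_prompt(
    path="sales.csv",
    columns=["date", "region", "revenue"],
    task="Compute monthly revenue per region and plot the top three regions.",
)
# code = call_llm(prompt)   # placeholder for whichever LLM endpoint is used
print(prompt)
```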
Procedia PDF Downloads 38
26960 Modeling and Analysis of DFIG Based Wind Power System Using Instantaneous Power Components
Authors: Jaimala Ghambir, Tilak Thakur, Puneet Chawla
Abstract:
As per the statistical data, the Doubly-Fed Induction Generator (DFIG) based wind turbine with variable speed and variable pitch control is the most common wind turbine in the growing wind market. This machine is usually used in grid-connected wind energy conversion systems to satisfy grid code requirements such as grid stability, fault ride-through (FRT), power quality improvement, grid synchronization, and power control. Though the requirements are not fulfilled directly by the machine, a control strategy is used on both the stator and rotor sides, along with power electronic converters, to fulfil the requirements stated above. To satisfy the grid code requirements of the wind turbine, the grid-side converter usually plays a major role. So, in order to improve the operating capacity of the wind turbine under critical situations, an intensive study of both machine-side converter control and grid-side converter control is necessary. In this paper, the DFIG is modeled using power components as variables, and the performance of the DFIG system is analysed under grid voltage fluctuations. The voltage fluctuations are created by intentionally lowering and raising the voltage values in the utility grid for the purpose of simulation, keeping different grid disturbances in view.
Keywords: DFIG, dynamic modeling, DPC, sag, swell, voltage fluctuations, FRT
Procedia PDF Downloads 462
26959 Electronic Raman Scattering Calibration for Quantitative Surface-Enhanced Raman Spectroscopy and Improved Biostatistical Analysis
Authors: Wonil Nam, Xiang Ren, Inyoung Kim, Masoud Agah, Wei Zhou
Abstract:
Despite its ultrasensitive detection capability, surface-enhanced Raman spectroscopy (SERS) faces challenges as a quantitative biochemical analysis tool due to the significant dependence of local field intensity in hotspots on nanoscale geometric variations of plasmonic nanostructures. Therefore, despite enormous progress in plasmonic nanoengineering of high-performance SERS devices, it is still challenging to quantitatively correlate the measured SERS signals with the actual molecule concentrations at hotspots. A significant effort has been devoted to developing SERS calibration methods by introducing internal standards. This has been achieved by placing Raman tags at plasmonic hotspots. Raman tags undergo similar SERS enhancement at the same hotspots, and ratiometric SERS signals for analytes of interest can be generated with reduced dependence on geometrical variations. However, using Raman tags still faces challenges for real-world applications, including spatial competition between the analyte and tags in hotspots, spectral interference, and laser-induced degradation/desorption due to plasmon-enhanced photochemical/photothermal effects. We show that electronic Raman scattering (ERS) signals from metallic nanostructures at hotspots can serve as the internal calibration standard to enable quantitative SERS analysis and improve biostatistical analysis. We perform SERS with Au-SiO₂ multilayered metal-insulator-metal nanolaminated plasmonic nanostructures. Since the ERS signal is proportional to the volume density of electron-hole occupation in hotspots, the ERS signals increase exponentially as the wavenumber approaches zero. Using a long-pass filter, generally employed in backscattered SERS configurations, to chop the ERS background continuum, we can observe an ERS pseudo-peak, I_ERS. Both ERS and SERS processes experience the |E|⁴ local enhancements during the excitation and inelastic scattering transitions. We calibrated I_MRS of 10 μM Rhodamine 6G in solution by I_ERS. The results show that ERS calibration generates a new analytical value, I_SERS/I_ERS, which is insensitive to variations from different hotspots and can thus quantitatively reflect the molecular concentration information. Given the calibration capability of ERS signals, we performed label-free SERS analysis of living biological systems using four different normal and cancerous breast cell lines cultured on nanolaminated SERS devices. 2D Raman mapping over 100 μm × 100 μm, containing several cells, was conducted. The SERS spectra were subsequently analyzed by multivariate analysis using partial least squares discriminant analysis. Remarkably, after ERS calibration, MCF-10A and MCF-7 cells are further separated, while the two triple-negative breast cancer cell lines (MDA-MB-231 and HCC-1806) are more overlapped, in good agreement with the well-known cancer categorization regarding the degree of malignancy. To assess the strength of ERS calibration, we further carried out a drug efficacy study using MDA-MB-231 and different concentrations of the anti-cancer drug paclitaxel (PTX). After ERS calibration, we can more clearly segregate the control/low-dosage groups (0 and 1.5 nM), the middle-dosage group (5 nM), and the group treated with the half-maximal inhibitory concentration (IC50, 15 nM). Therefore, we envision that ERS-calibrated SERS can find crucial opportunities in label-free molecular profiling of complicated biological systems.
Keywords: cancer cell drug efficacy, plasmonics, surface-enhanced Raman spectroscopy (SERS), SERS calibration
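The ratiometric idea can be illustrated with a toy numpy model: because the analyte SERS peak and the ERS pseudo-peak share the same |E|⁴ hotspot enhancement, their ratio cancels hotspot-to-hotspot variation. The lognormal enhancement distribution and noise levels below are illustrative assumptions, not measured values:

```python
import numpy as np

rng = np.random.default_rng(2)
enhancement = rng.lognormal(mean=0.0, sigma=0.5, size=1000)  # hotspot variation
concentration = 1.0                                          # arbitrary units

# Both signals ride on the same enhancement factor (plus small detector noise).
I_sers = enhancement * concentration * (1 + rng.normal(0, 0.02, 1000))
I_ers = enhancement * (1 + rng.normal(0, 0.02, 1000))        # internal standard

ratio = I_sers / I_ers
raw_cv = I_sers.std() / I_sers.mean()      # large: dominated by hotspot spread
cal_cv = ratio.std() / ratio.mean()        # small: enhancement cancels out
print(f"CV raw = {raw_cv:.2%}, CV ERS-calibrated = {cal_cv:.2%}")
```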
Procedia PDF Downloads 137
26958 Specification of Requirements to Ensure Proper Implementation of Security Policies in Cloud-Based Multi-Tenant Systems
Authors: Rebecca Zahra, Joseph G. Vella, Ernest Cachia
Abstract:
The notion of cloud computing is rapidly gaining ground in the IT industry and is appealing mostly due to making computing more adaptable and expedient whilst diminishing the total cost of ownership. This paper focuses on the software as a service (SaaS) architecture of cloud computing, which is used for the outsourcing of databases with their associated business processes. One approach to offering SaaS is basing the system's architecture on multi-tenancy. Multi-tenancy allows multiple tenants (users) to make use of the same single application instance. Their requests and configurations might then differ according to specific requirements met through tenant customisation of the software. Despite the known advantages, companies still feel uneasy about opting for multi-tenancy, with data security being a principal concern. The fact that multiple tenants, possibly competitors, would have their data located on the same server process and share the same database tables heightens the fear of unauthorised access. Security is a vital aspect which needs to be considered by application developers, database administrators, data owners, and end users. This is further complicated in cloud-based multi-tenant systems, where boundaries must be established between tenants and additional access control models must be in place to prevent unauthorised cross-tenant access to data. Moreover, when altering the database state, the transactions need to strictly adhere to the tenant's known business processes. This paper argues that security in cloud databases should not be considered an isolated issue. Rather, it should be included in the initial phases of the database design and monitored continuously throughout the whole development process. This paper aims to identify a number of the most common security risks and threats specifically in the area of multi-tenant cloud systems. Issues and bottlenecks relating to security risks in cloud databases are surveyed. Some techniques which might be utilised to overcome them are then listed and evaluated. After a description and evaluation of the main security threats, this paper produces a list of software requirements to ensure that proper security policies are implemented by a software development team when designing and implementing a multi-tenant based SaaS. This would then assist cloud service providers in defining, implementing, and managing security policies as per tenant customisation requirements whilst assuring security for the customers' data.
Keywords: cloud computing, data management, multi-tenancy, requirements, security
Procedia PDF Downloads 156
26957 Forecasting Materials Demand from Multi-Source Ordering
Authors: Hui Hsin Huang
Abstract:
Downstream manufacturers order their materials from different upstream suppliers to maintain a certain level of demand. This paper proposes a bivariate model to portray this phenomenon of material demand. We use empirical data to estimate the parameters of the model and evaluate the RMSD of the model calibration. The results show that the model has a better fit.
Keywords: recency, ordering time, materials demand quantity, multi-source ordering
Procedia PDF Downloads 534
26956 Subpixel Corner Detection for Monocular Camera Linear Model Research
Authors: Guorong Sui, Xingwei Jia, Fei Tong, Xiumin Gao
Abstract:
Camera calibration is a fundamental issue in high-precision noncontact measurement, and it is necessary to analyze and study the reliability and application range of the linear camera model that is often used in camera calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by the subpixel corner detection method. Without considering the aberration of the optical system, the feature extraction and linearity analysis of the line segment in the template are performed. Moreover, the experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is established by fitting the 11 groups of data. The camera model measurement results show that the relative error does not exceed 1%, and the repeated measurement error is no more than 0.1 mm in magnitude. Meanwhile, it is found that the model shows some measurement differences across different regions and object distances. The experimental results show that this linear model is simple and practical and has good linearity within a certain object distance range. They provide a powerful basis for the establishment of the linear camera model, and this work will have potential value for actual engineering measurement.
Keywords: camera linear model, geometric imaging relationship, image pixel coordinates, three dimensional space coordinates, sub-pixel corner detection
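A rough OpenCV sketch of the two steps described above: subpixel corner refinement on a template, then a least-squares linearity check of corner pixel coordinates against known physical positions. The synthetic checkerboard, the 5 mm pitch, and the row-selection heuristic are assumptions standing in for the authors' customized template and procedure:

```python
import cv2
import numpy as np

# Synthetic 8x8 checkerboard (40 px squares) stands in for the custom template.
img = (np.kron([[0, 1] * 4, [1, 0] * 4] * 4,
               np.ones((40, 40))).astype(np.uint8) * 255)

# Coarse corners, then subpixel refinement (typical termination criteria).
corners = cv2.goodFeaturesToTrack(img, maxCorners=49, qualityLevel=0.05,
                                  minDistance=20)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 40, 0.001)
refined = cv2.cornerSubPix(img, np.float32(corners), (5, 5), (-1, -1), criteria)

# Linearity check: take corners sharing one horizontal row and fit their pixel
# x-coordinates against assumed physical positions on a 5 mm pitch.
pts = refined[:, 0, :]
row = pts[np.abs(pts[:, 1] - pts[0, 1]) < 5.0]
u_px = np.sort(row[:, 0])
x_mm = np.arange(u_px.size, dtype=float) * 5.0
slope, intercept = np.polyfit(x_mm, u_px, 1)
resid = u_px - (slope * x_mm + intercept)
print(f"max deviation from the linear model: {np.abs(resid).max():.3f} px")
```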
Procedia PDF Downloads 277
26955 Stakeholder Mapping and Requirements Identification for Improving Traceability in the Halal Food Supply Chain
Authors: Laila A. H. F. Dashti, Tom Jackson, Andrew West, Lisa Jackson
Abstract:
Traceability systems are important in the agri-food and halal food sectors for monitoring ingredient movements, tracking sources, and ensuring food integrity. However, designing a traceability system for the halal food supply chain is challenging due to diverse stakeholder requirements and the complexity of their needs (including varying food ingredients, different sources, destinations, supplier processes, certifications, etc.). Achieving a halal food traceability solution tailored to stakeholders' requirements within the supply chain necessitates prior knowledge of these needs. Although attempts have been made to address design-related issues in traceability systems, literature on stakeholder mapping and identification of requirements specific to halal food supply chains is scarce. To address this gap, a pilot study was conducted to identify the objectives, requirements, and recommendations of stakeholders in the Kuwaiti halal food industry. The study collected data through semi-structured interviews with an international halal food manufacturer based in Kuwait. The aim was to gain an in-depth understanding of stakeholders' objectives, requirements, processes, and concerns pertaining to the design of a traceability system in the country's halal food sector. The stakeholder mapping results revealed that government entities, food manufacturers, retailers, and suppliers are key stakeholders in Kuwait's halal food supply chain. Lessons learned from this pilot study regarding requirements capture for traceability systems include the need to streamline communication, focus on communication at each level of the supply chain, leverage innovative technologies to enhance process structuring and operations, and reduce halal certification costs. The findings also emphasized the limitations of existing traceability solutions, such as limited cooperation and collaboration among stakeholders, the high cost of implementing traceability systems without government support, lack of clarity regarding product routes, and disrupted communication channels between stakeholders. These findings contribute to a broader research program aimed at developing a stakeholder requirements framework that utilizes business process modelling to establish a unified model for traceable stakeholder requirements.
Keywords: supply chain, traceability system, halal food, stakeholders' requirements
Procedia PDF Downloads 112
26954 A Framework on Data and Remote Sensing for Humanitarian Logistics
Authors: Vishnu Nagendra, Marten Van Der Veen, Stefania Giodini
Abstract:
Effective humanitarian logistics operations are a cornerstone of successful disaster relief operations. However, to be effective, they need to be demand-driven and supported by adequate data for prioritization. Without this data, operations are carried out in an ad hoc manner and eventually become chaotic. The current availability of geospatial data helps in creating models for predictive damage and vulnerability assessment, which can be of great advantage to logisticians in gaining an understanding of the nature and extent of the disaster damage. This translates into actionable information on the demand for relief goods, the state of the transport infrastructure, and subsequently the priority areas for relief delivery. However, due to the unpredictable nature of disasters, the accuracy of the models needs improvement, which can be done using remote sensing data from UAVs (Unmanned Aerial Vehicles) or satellite imagery, which again come with certain limitations. This research addresses the need for a framework to combine data from different sources to support humanitarian logistics operations and prediction models. The focus is on developing a workflow to combine data from satellites and UAVs after a disaster strike. A three-step approach is followed: first, the data requirements for logistics activities are made explicit by carrying out semi-structured interviews with in-field logistics workers. Second, the limitations of current data collection tools are analyzed to develop workaround solutions following a systems design approach. Third, the data requirements and the developed workaround solutions are fitted together into a coherent workflow. The outcome of this research will provide a new method for logisticians to have immediate access to accurate and reliable data to support data-driven decision making.
Keywords: unmanned aerial vehicles, damage prediction models, remote sensing, data driven decision making
Procedia PDF Downloads 378
26953 A Survey on Requirements and Challenges of Internet Protocol Television Service over Software Defined Networking
Authors: Esmeralda Hysenbelliu
Abstract:
Over recent years, the demand for high-bandwidth services, such as live (IPTV) and on-demand video streaming, has steadily and rapidly increased. It has been predicted that video traffic (IPTV, VoD, and WEB TV) will account for more than 90% of the global Internet Protocol traffic crossing the globe in 2016. Consequently, the requirements and challenges that service providers face today in supporting users' requests for entertainment video across the various IPTV services, through virtualization over Software Defined Networks (SDN), deserve the highest level of attention. What is required is to deliver optimized live and on-demand services such as IPTV at low cost and good quality while strictly fulfilling the essential requirements of both clients and ISPs (Internet Service Providers) at the same time. The aim of this study is to present an overview of the important requirements and challenges of the IPTV service, together with two network trends for solving these challenges through virtualization (SDN and Network Function Virtualization). This paper provides an overview of research published in the last five years.
Keywords: challenges, IPTV service, requirements, software defined networking (SDN)
Procedia PDF Downloads 271
26952 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack
Authors: Vincent Andrew Cappellano
Abstract:
In the early phases of critical infrastructure system design, translating distributed computing requirements to an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system's intended operations. However, architected systems may meet those availability requirements only during normal operations, and not during component failures or outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., normal ops all the way down to significant damage of communications or physical nodes). This increases the risk of poor selection of a candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric assessing these architectures over a spectrum of degradations to aid in selecting appropriate resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts is undertaken to process, in an automated way, a set of sample requirements into a set of potential architectures where system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the impacted functional availability of the system can be determined. A reinforcement learning-based agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other, producing a metric rating their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation, against the baseline requirements, in comparison to existing multi-factor computing architectural selection processes. It is intended that this additional data will improve the matching of resilient critical infrastructure system requirements to the correct architectures and implementations that will support improved operation during times of system degradation due to failures and infrastructure attacks.
Keywords: architecture, resiliency, availability, cyber-attack
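A toy version of the degradation sweep described above can be sketched with networkx, replacing the reinforcement learning attack agent with random node knockouts. The random graph model, the knockout fractions, and the pairwise-connectivity criterion are all illustrative assumptions, not the proposed process:

```python
import random
import networkx as nx

def availability_under_degradation(n_nodes=20, trials=200, seed=0):
    """Fraction of random source-sink pairs still connected as an increasing
    share of nodes is removed (a crude stand-in for functional availability)."""
    rng = random.Random(seed)
    base = nx.erdos_renyi_graph(n_nodes, 0.25, seed=seed)
    results = {}
    for frac in (0.0, 0.1, 0.2, 0.3, 0.4):
        ok = 0
        for _ in range(trials):
            g = base.copy()
            g.remove_nodes_from(rng.sample(list(g.nodes), int(frac * n_nodes)))
            if g.number_of_nodes() >= 2:
                a, b = rng.sample(list(g.nodes), 2)
                ok += nx.has_path(g, a, b)
        results[frac] = ok / trials
    return results   # averaging this curve yields one resilience-style score

print(availability_under_degradation())
```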
Procedia PDF Downloads 108
26951 Immobilized Iron Oxide Nanoparticles for Stem Cell Reconstruction in Magnetic Particle Imaging
Authors: Kolja Them, Johannes Salamon, Harald Ittrich, Michael Kaul, Tobias Knopp
Abstract:
Superparamagnetic iron oxide nanoparticles (SPIONs) are nanoscale magnets which can be biologically functionalized for biomedical applications. Stem cell therapies to repair damaged tissue, magnetic fluid hyperthermia for cancer therapy, and targeted drug delivery based on SPIONs are prominent examples where the visualization of a preferably low-concentrated SPION distribution is essential. In 2005, a new method for tomographic SPION imaging was introduced. The method, named magnetic particle imaging (MPI), takes advantage of the nanoparticles' magnetization change caused by an oscillating external magnetic field and allows direct imaging of the time-dependent nanoparticle distribution. The SPION magnetization can be changed by the electron spin dynamics as well as by a mechanical rotation of the nanoparticle. In this work, different calibration methods in MPI are investigated for image reconstruction of magnetically labeled stem cells. It is shown that a calibration using rotationally immobilized SPIONs provides higher-quality stem cell images with fewer artifacts than a calibration using mobile SPIONs. The enhancement of the image quality and the reduction of artifacts enable the localization and identification of a smaller number of magnetically labeled stem cells. This is important for future medical applications where low concentrations of functionalized SPIONs interacting with biological matter have to be localized.
Keywords: biomedical imaging, iron oxide nanoparticles, magnetic particle imaging, stem cell imaging
Procedia PDF Downloads 464
26950 UBCSAND Model Calibration for Generic Liquefaction Triggering Curves
Authors: Jui-Ching Chou
Abstract:
Numerical simulation is a popular method used to evaluate the effects of soil liquefaction on a structure or the effectiveness of a mitigation plan. Many constitutive models (the UBCSAND model, PM4 model, SANISAND model, etc.) have been presented to model the liquefaction phenomenon. In general, the inputs of a constitutive model need to be calibrated against the soil cyclic resistance before being applied to the numerical simulation model. Then, simulation results can be compared with results from simplified liquefaction potential assessment methods. In this article, the inputs of the UBCSAND model, a simple elastic-plastic stress-strain model, are calibrated against several popular generic liquefaction triggering curves of simplified liquefaction potential assessment methods via the FLAC program. The calibrated inputs allow engineers to perform a preliminary evaluation of an existing structure or a new design project.
Keywords: calibration, liquefaction, numerical simulation, UBCSAND Model
Procedia PDF Downloads 173
26949 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges
Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars
Abstract:
In the medical field, applications related to human experiments are frequently linked to reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore not very robust or generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One of the naive approaches usually applied by data scientists consists of transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady-State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting
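The leakage pattern described above is easy to reproduce and to fix with scikit-learn: fitting a scaler on the full dataset before cross-validation leaks test-fold statistics, whereas wrapping the scaler in a Pipeline refits it inside each training fold. A minimal sketch on synthetic placeholder data (not the authors' SSVEP dataset):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 64))          # e.g., 40 trials x 64 features
y = rng.integers(0, 2, size=40)        # placeholder class labels

# Leaky: scaling fitted on ALL trials before splitting leaks test statistics.
X_leaky = StandardScaler().fit_transform(X)
leaky = cross_val_score(SVC(), X_leaky, y, cv=KFold(5, shuffle=True, random_state=0))

# Leak-free: the scaler is re-fitted inside each training fold only.
pipe = make_pipeline(StandardScaler(), SVC())
clean = cross_val_score(pipe, X, y, cv=KFold(5, shuffle=True, random_state=0))

print(f"leaky CV accuracy:     {leaky.mean():.2f}")
print(f"leak-free CV accuracy: {clean.mean():.2f}")
```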
Procedia PDF Downloads 153
26948 A Survey of Attacks and Security Requirements in Wireless Sensor Networks
Authors: Vishnu Pratap Singh Kirar
Abstract:
A wireless sensor network (WSN) is a network of many interconnected networked systems, equipped with energy resources and used to detect physical characteristics of their environment. Much research on WSNs has been performed in past decades. WSNs are applicable in many security systems governed by the military and in many civilian applications. Thus, the security of WSNs attracts the attention of researchers and offers opportunities for many future developments. Still, there are many other issues related to deployment and overall coverage, scalability, size, energy efficiency, quality of service (QoS), computational power, and more. In this paper, we discuss various applications as well as the security-related issues and requirements of WSNs.
Keywords: wireless sensor network (WSN), wireless network attacks, wireless network security, security requirements
Procedia PDF Downloads 491
26947 Development of a Combustible Gas Detector with Two Sensor Modules to Enable Measuring Range of Low Concentration
Authors: Young Gyu Kim, Sangguk Ahn, Gyoutae Park, Hiesik Kim
Abstract:
In industrial gas fields, there are many problems in detecting extremely small amounts of combustible gas (CH₄) if a conventional semiconductor sensor is used: measurement is difficult at low concentration levels, the stabilization time is long, and the initial response time is slow. In this study, we propose a method to solve these issues using two specific sensors that overcome the effects of temperature and humidity. The idea is to combine a catalytic-type and a semiconductor-type sensor and to exploit the advantages of each sensor's characteristics. To achieve this goal, we reduced the fluctuations of the gas sensors with temperature and humidity by applying circuits designed for sensing temperature and humidity, and we derived the best calibration line for the gas sensors by adjusting a weight value corresponding to the changing patterns of temperature and humidity, using previously acquired and stored data. We proposed and developed a gas leak detector using two sensor modules: it first operates a semiconductor sensor for measuring small gas quantities and then switches to a catalytic-type sensor when the measuring range of the first sensor is exceeded. Experiments conclusively verified sharper sensitivity and faster response time than a conventional gas sensor, even at lower gas concentration levels. We believe our proposed idea would be very useful if another gas leak detector were developed to enable measuring extremely small quantities of toxic and flammable gases.
Keywords: gas sensor, leak detector, low concentration, calibration
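The dual-module readout logic can be sketched as follows: the semiconductor reading is used within its range and the catalytic reading beyond it, with a temperature/humidity weight selected from stored calibration patterns. All thresholds, calibration coefficients, and weight values here are assumptions for illustration, not the instrument's actual parameters:

```python
SEMI_MAX_PPM = 10_000            # assumed upper limit of the semiconductor module

def compensate(weight_table, temp_c, rh_pct):
    """Pick a calibration weight from previously stored temp/humidity patterns."""
    key = (round(temp_c / 5) * 5, round(rh_pct / 10) * 10)   # bin to a grid
    return weight_table.get(key, 1.0)

def read_ch4_ppm(semi_raw, cat_raw, temp_c, rh_pct, weight_table):
    w = compensate(weight_table, temp_c, rh_pct)
    semi_ppm = w * (2.5 * semi_raw)          # assumed linear calibration line
    if semi_ppm <= SEMI_MAX_PPM:
        return semi_ppm                      # low range: semiconductor module
    return w * (120.0 * cat_raw)             # high range: catalytic module

weights = {(20, 50): 1.00, (35, 80): 0.93}   # stored calibration weights (toy)
print(read_ch4_ppm(semi_raw=180.0, cat_raw=95.0, temp_c=21, rh_pct=48,
                   weight_table=weights))
```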
Procedia PDF Downloads 240