Search results for: calibration data requirements
26946 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises
Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto
Abstract:
The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler into an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. This knowledge is vital for organizations to ensure that data quality requirements are met and that data can be effectively utilized and sovereignly governed. As this specific knowledge has so far received little attention from academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights from various industry case studies and literature research.
Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel
26945 Astronomical Object Classification
Authors: Alina Muradyan, Lina Babayan, Arsen Nanyan, Gohar Galstyan, Vigen Khachatryan
Abstract:
We present a photometric method for identifying stars, galaxies and quasars in multi-color surveys, which uses a library of more than 65,000 color templates for comparison with observed objects. The method aims to extract the information content of object colors in a statistically correct way, and performs classification as well as redshift estimation for galaxies and quasars in a unified approach based on the same probability density functions. For the redshift estimation, we employ an advanced version of the Minimum Error Variance estimator, which determines the redshift error from the redshift-dependent probability density function itself. The method was originally developed for the Calar Alto Deep Imaging Survey (CADIS) but is now used in a wide variety of survey projects. We checked its performance by spectroscopy of CADIS objects, where the method provides high reliability (6 errors among 151 objects with R < 24), especially for the quasar selection, and redshifts accurate within σz ≈ 0.03 for galaxies and σz ≈ 0.1 for quasars. For an optimization of future survey efforts, a few model surveys are compared, which are designed to use the same total amount of telescope time but different sets of broad-band and medium-band filters. Their performance is investigated by Monte Carlo simulations as well as by analytic evaluation in terms of classification and redshift estimation. If photon noise were the only error source, broad-band surveys and medium-band surveys should perform equally well, as long as they provide the same spectral coverage. In practice, medium-band surveys show superior performance due to their higher tolerance for calibration errors and cosmic variance. Finally, we discuss the relevance of color calibration and derive important conclusions for the issues of library design and choice of filters. The calibration accuracy poses strong constraints on an accurate classification, which are most critical for surveys with few, broad and deeply exposed filters, but less severe for surveys with many, narrow and less deep filters.
Keywords: VO, ArVO, DFBS, FITS, image processing, data analysis
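For readers who want to experiment, the core of such template-based classification can be sketched as a chi-squared match against a color library. Everything below is an illustrative stand-in (synthetic templates, invented names); the estimator described in the abstract works with full probability density functions rather than a single best-fitting template.

```python
import numpy as np

# Hypothetical template library: one row of colors per template, each tagged
# with a class label and a redshift. All values are synthetic placeholders.
rng = np.random.default_rng(0)
template_colors = rng.normal(size=(65000, 4))                # ~65,000 templates
template_class = rng.choice(["star", "galaxy", "quasar"], size=65000)
template_z = rng.uniform(0.0, 5.0, size=65000)

def classify(observed_colors, color_errors):
    """Return the class and redshift of the chi-squared-best template."""
    chi2 = np.sum(((template_colors - observed_colors) / color_errors) ** 2,
                  axis=1)
    best = np.argmin(chi2)
    return template_class[best], template_z[best]

obj_colors = np.array([0.3, -0.1, 0.8, 0.2])
obj_errors = np.array([0.05, 0.05, 0.08, 0.10])
print(classify(obj_colors, obj_errors))
```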
26944 How to Perform Proper Indexing?
Authors: Watheq Mansour, Waleed Bin Owais, Mohammad Basheer Kotit, Khaled Khan
Abstract:
Efficient query processing is one of the utmost requisites in any business environment to satisfy consumer needs. This paper investigates the various types of indexing models, viz. primary, secondary, and multi-level. Each indexing model is examined against the types of queries it handles efficiently. This study also discusses the inherent advantages and disadvantages of each indexing model and how an indexing model can be chosen for a particular environment. The paper also draws parallels between various indexing models and provides recommendations that would help a database administrator to zero in on a particular indexing model suited to the needs and requirements of the production environment. In addition, to satisfy industry and consumer needs arising from the colossal data generation of today, this study proposes two novel indexing techniques that can be used to index highly unstructured and structured Big Data efficiently. The study also briefly discusses some best practices that the industry should follow in order to choose an indexing model appropriate to their prerequisites and requirements.
Keywords: indexing, hashing, latent semantic indexing, B-tree
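To make the primary/secondary distinction concrete, here is a minimal Python sketch of the lookup logic only; real DBMS indexes are typically B-trees or hash structures, and the table and column names below are invented for the example.

```python
import bisect

rows = [
    {"id": 1, "city": "Kuwait City"},
    {"id": 2, "city": "Doha"},
    {"id": 3, "city": "Doha"},
]

# Primary index: sorted list of primary keys with their row positions.
pk = sorted((r["id"], pos) for pos, r in enumerate(rows))
pk_keys = [k for k, _ in pk]

def lookup_pk(key):
    i = bisect.bisect_left(pk_keys, key)
    if i < len(pk_keys) and pk_keys[i] == key:
        return rows[pk[i][1]]
    return None

# Secondary index: non-key column value -> list of row positions.
secondary = {}
for pos, r in enumerate(rows):
    secondary.setdefault(r["city"], []).append(pos)

print(lookup_pk(2))                           # single-row lookup by key
print([rows[p] for p in secondary["Doha"]])   # all rows matching a value
```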
26943 Assessment of Pre-Processing Influence on Near-Infrared Spectra for Predicting the Mechanical Properties of Wood
Authors: Aasheesh Raturi, Vimal Kothiyal, P. D. Semalty
Abstract:
We studied the mechanical properties of Eucalyptus tereticornis using FT-NIR spectroscopy. Firstly, spectra were pre-processed to eliminate useless information. Then, a prediction model was constructed by partial least squares regression. To study the influence of pre-processing on the prediction of mechanical properties in NIR analysis of wood samples, we applied various pre-treatment methods such as straight line subtraction, constant offset elimination, vector normalization, min-max normalization, multiplicative scatter correction, first derivative, second derivative, and their combinations with other treatments, such as first derivative + straight line subtraction, first derivative + vector normalization, and first derivative + multiplicative scatter correction. For each combination of pre-processing method and NIR region, RMSECV, RMSEP and the optimum number of factors (rank) were obtained during the optimization process of model development. More than 350 combinations were evaluated in this process. More than one pre-processing method gave good calibration/cross-validation and prediction/test models, but only the best ones are reported here. The results show that one can safely use the NIR region between 4000 and 7500 cm-1 with straight line subtraction, constant offset elimination, first derivative or second derivative pre-processing, which were found to be the most appropriate for model development.
Keywords: FT-NIR, mechanical properties, pre-processing, PLS
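A minimal sketch of this workflow (first-derivative pre-processing followed by PLS regression with cross-validation) is shown below, using scikit-learn on synthetic stand-in spectra; the wavenumber grid, sample count and property values are all illustrative.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for NIR spectra: rows = wood samples, columns = wavenumbers.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 500))                         # hypothetical spectra
y = 3.0 * X[:, 100] + rng.normal(scale=0.1, size=60)   # hypothetical property

# First-derivative pre-processing (Savitzky-Golay), one of the treatments
# compared in the study.
X_d1 = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)

# PLS model; the number of factors (rank) is chosen by cross-validation
# in practice.
pls = PLSRegression(n_components=5)
rmse = -cross_val_score(pls, X_d1, y, cv=5,
                        scoring="neg_root_mean_squared_error")
print("RMSECV ≈", rmse.mean())
```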
26942 X-Ray Dosimetry by a Low-Cost Current Mode Ion Chamber
Authors: Ava Zarif Sanayei, Mustafa Farjad-Fard, Mohammad-Reza Mohammadian-Behbahani, Leyli Ebrahimi, Sedigheh Sina
Abstract:
The fabrication and testing of a low-cost air-filled ion chamber for X-ray dosimetry is studied. The chamber is made of a metal cylinder, a central wire, a BC517 Darlington transistor, a 9V DC battery, and a voltmeter, in order to provide a cost-effective means of measuring the dose. The output current of the dosimeter is amplified by the transistor and then fed to the large internal resistance of the voltmeter, producing a readable voltage signal. The dose-response linearity of the ion chamber is evaluated for different exposure scenarios of the X-ray tube: kVp values of 70, 90, and 120, and mAs values up to 20. In all experiments, a solid-state dosimeter (Solidose 400, Elimpex Medizintechnik) is used as a reference device for chamber calibration. Each exposure is repeated three times, the voltmeter and Solidose readings are recorded, and the mean and standard deviation values are calculated. The calibration curves, derived by plotting voltmeter readings against Solidose readings, were linear at all tube kVps, with linear relationships of 99%, 98%, and 100% for kVp values of 70, 90, and 120, respectively. The study shows the feasibility of achieving acceptable dose measurements with a simplified setup. Further enhancements to the proposed setup include solutions for limiting the leakage current, optimizing chamber dimensions, utilizing electronic microcontrollers for dedicated data readout, and minimizing the impact of stray electromagnetic fields on the system.
Keywords: dosimetry, ion chamber, radiation detection, X-ray
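The calibration step amounts to a least-squares line relating voltmeter readings to reference doses. The sketch below shows this with invented readings; the actual calibration data are those measured against the Solidose 400.

```python
import numpy as np

# Hypothetical calibration data: mean voltmeter readings (V) vs. reference
# dose from the solid-state dosimeter (mGy). Values are illustrative only.
voltage = np.array([0.12, 0.25, 0.49, 0.98, 1.95])
dose    = np.array([0.5, 1.0, 2.0, 4.0, 8.0])

# Least-squares linear fit: dose = a * voltage + b
a, b = np.polyfit(voltage, dose, deg=1)
r = np.corrcoef(voltage, dose)[0, 1]
print(f"dose ≈ {a:.2f} * V + {b:.3f}, R² = {r**2:.4f}")

def dose_from_reading(v):
    """Convert a voltmeter reading to dose using the calibration line."""
    return a * v + b
```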
26941 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models
Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo De Magalhães
Abstract:
This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. To face this difficult task, the proposed resolution method adopts a smoothing strategy using a special C∞ differentiable class of functions. The final estimate is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes it possible to apply the most powerful minimization algorithms and allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented to illustrate both the reliability and the efficiency of the proposed approach.
Keywords: rainfall-runoff models, automatic calibration, hyperbolic smoothing method
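The idea is easy to illustrate: the non-differentiable threshold max(0, x) is replaced by φ(x, τ) = (x + √(x² + τ²))/2, which is C∞ and tends to max(0, x) as τ → 0, and a sequence of smooth subproblems is solved for decreasing τ. The Python sketch below applies this to a toy objective; the paper applies the same strategy to threshold terms inside a conceptual rainfall-runoff model.

```python
import numpy as np
from scipy.optimize import minimize

def phi(x, tau):
    """Hyperbolic smoothing of max(0, x); exact in the limit tau -> 0."""
    return 0.5 * (x + np.sqrt(x * x + tau * tau))

def objective(p, tau):
    # Toy non-smooth objective: sum of squared thresholded residuals.
    residuals = np.array([p[0] - 1.0, p[0] + p[1] - 3.0, p[1] - 0.5])
    return np.sum(phi(residuals, tau) ** 2)

# Solve a sequence of smooth subproblems with shrinking tau, warm-starting
# each from the previous solution.
p = np.array([0.0, 0.0])
for tau in (1.0, 0.1, 0.01, 0.001):
    p = minimize(lambda q: objective(q, tau), p, method="BFGS").x
print("estimated parameters:", p)
```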
26940 Evaluation of Different Liquid Scintillation Counting Methods for 222Rn Determination in Waters
Authors: Jovana Nikolov, Natasa Todorovic, Ivana Stojkovic
Abstract:
Monitoring of 222Rn in drinking or surface waters, as well as in groundwater, has been performed in connection with geological, hydrogeological and hydrological surveys and health hazard studies. Liquid scintillation counting (LSC) is often the preferred analytical method for 222Rn measurement in waters because it allows multiple-sample automatic analysis. The LSC method implies mixing water samples with an organic scintillation cocktail, which triggers radon diffusion from the aqueous into the organic phase, for which it has a much greater affinity, thereby eliminating the possibility of radon emanation. Two direct LSC methods that assume different sample compositions are presented, optimized and evaluated in this study. The one-phase method involves directly mixing a 10 ml sample with 10 ml of an emulsifying cocktail (Ultima Gold AB scintillation cocktail is used). The two-phase method involves water-immiscible cocktails (High Efficiency Mineral Oil Scintillator, Opti-Fluor O and Ultima Gold F are used in this study). Calibration samples were prepared with an aqueous 226Ra standard in 20 ml glass vials and counted on the ultra-low background spectrometer Quantulus 1220TM, equipped with a PSA (Pulse Shape Analysis) circuit which discriminates alpha/beta spectra. Since the calibration procedure is carried out with a 226Ra standard, which has both alpha and beta progenies, the PSA discriminator is clearly of vital importance for reliable and precise spectra separation. Consequently, the calibration procedure investigated the influence of the PSA discriminator level on the 222Rn detection efficiency, using the 226Ra calibration standard over a wide range of activity concentrations. The evaluation of the presented methods was based on the obtained detection efficiencies and the achieved Minimum Detectable Activity (MDA). The accuracy and precision of the methods, as well as the performance of the different scintillation cocktails, were compared using measurements of 226Ra-spiked water samples of known activity and of environmental samples.
Keywords: 222Rn in water, Quantulus1220TM, scintillation cocktail, PSA parameter
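For reference, a common (Currie-style) formulation of the MDA used in LSC work is sketched below; whether the study uses exactly this expression is an assumption, and the numbers are illustrative.

```python
import math

# Currie-style minimum detectable activity:
#   MDA = (2.71 + 4.65 * sqrt(B)) / (t * eff * V)
# with B the background counts accumulated in counting time t (s), eff the
# detection efficiency, and V the sample volume (L).
def mda_bq_per_l(background_counts, count_time_s, efficiency, volume_l):
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (
        count_time_s * efficiency * volume_l
    )

# Illustrative values: 120 background counts in 1 h, 90% efficiency, 10 ml.
print(mda_bq_per_l(background_counts=120, count_time_s=3600,
                   efficiency=0.9, volume_l=0.01))
```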
26939 A Study of Blockchain Oracles
Authors: Abdeljalil Beniiche
Abstract:
The limitation of smart contracts is that they cannot access external data that might be required to control the execution of business logic. Oracles can be used to provide external data to smart contracts. An oracle is an interface that delivers data from external sources outside the blockchain to a smart contract to consume. Oracles can deliver different types of data depending on the industry and requirements. In this paper, we study and describe the widely used blockchain oracles. We then elaborate on their potential role, technical architecture, and design patterns. Finally, we discuss the human oracle and its key role in solving the truth problem by reaching a consensus about a certain inquiry or task.
Keywords: blockchain, oracles, oracles design, human oracles
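A typical request-response oracle design pattern can be sketched off-chain as follows. This is a hypothetical illustration, not code for any particular oracle network: on a real chain the consumer contract would emit a request event and the oracle node would answer via a signed callback transaction.

```python
import json
import urllib.request

class SmartContractStub:
    """Stand-in for the on-chain consumer of oracle data."""
    def callback(self, request_id, payload):
        print(f"request {request_id} fulfilled with {payload}")

class OracleNode:
    def __init__(self, data_url):
        self.data_url = data_url          # external source, e.g. a price API

    def fetch(self):
        # Bring off-chain data into the oracle node.
        with urllib.request.urlopen(self.data_url) as resp:
            return json.load(resp)

    def answer(self, request_id, contract):
        # Deliver the fetched data back to the requesting contract.
        contract.callback(request_id, self.fetch())

# Usage (the URL is a placeholder; any JSON endpoint would do):
# node = OracleNode("https://example.com/price.json")
# node.answer(42, SmartContractStub())
```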
26938 Identification and Classification of Fiber-Fortified Semolina by Near-Infrared Spectroscopy (NIR)
Authors: Amanda T. Badaró, Douglas F. Barbin, Sofia T. Garcia, Maria Teresa P. S. Clerici, Amanda R. Ferreira
Abstract:
Food fortification is the intentional addition of a nutrient to a food matrix and has been widely used to overcome nutrient deficiencies in the diet or to increase the nutritional value of food. Fortified food must meet the demands of the population, taking into account their habits and the risks these foods may pose. Wheat and its by-products, such as semolina, have been strongly indicated for use as food vehicles, since they are widely consumed and used in the production of other foods. These products have been strategically used to add nutrients such as fibers. Conventional methods for analyzing and quantifying such components are destructive and require lengthy sample preparation and analysis. The industry has therefore searched for faster and less invasive methods, such as near-infrared spectroscopy (NIR). NIR is a rapid and cost-effective method; however, it is based on indirect measurements and yields a large amount of data. NIR spectroscopy therefore requires calibration with mathematical and statistical tools (chemometrics) to extract analytical information from the corresponding spectra, such as principal component analysis (PCA) and linear discriminant analysis (LDA). PCA is well suited for NIR, since it can handle many spectra at a time and can be used for unsupervised classification. An advantage of PCA, which is also a data reduction technique, is that it reduces the spectra to a smaller number of latent variables for further interpretation. LDA, on the other hand, is a supervised method that searches for the canonical variables (CV) with the maximum separation among different categories; in LDA, the first CV is the direction of the maximum ratio between inter- and intra-class variances. The present work used a portable NIR spectrometer for the identification and classification of pure and fiber-fortified semolina samples. Fiber was added to semolina at two different concentrations, and after spectra acquisition the data were used in PCA and LDA to identify and discriminate the samples. The results showed that NIR spectroscopy combined with PCA was very effective in identifying pure and fiber-fortified semolina. Additionally, the classification rates of the samples using LDA were between 78.3% and 95% for calibration and between 75% and 95% for cross-validation. Thus, after multivariate analysis such as PCA and LDA, it was possible to verify that NIR combined with chemometric methods is able to identify and classify the different samples in a fast and non-destructive way.
Keywords: chemometrics, fiber, linear discriminant analysis, near-infrared spectroscopy, principal component analysis, semolina
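The PCA-then-LDA pipeline with cross-validation can be sketched in a few lines of scikit-learn. The spectra below are synthetic stand-ins with class structure injected for the demo; sample counts and class labels are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for NIR spectra of pure vs. fiber-fortified semolina.
rng = np.random.default_rng(2)
X = rng.normal(size=(90, 300))            # hypothetical spectra
y = np.repeat([0, 1, 2], 30)              # pure, low-fiber, high-fiber
X[y == 1] += 0.3                          # inject class structure for the demo
X[y == 2] += 0.6

# Unsupervised step: compress the spectra to a few latent variables.
scores = PCA(n_components=5).fit_transform(X)

# Supervised step: discriminate the classes on the PCA scores.
lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, scores, y, cv=5)
print("cross-validated accuracy:", acc.mean())
```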
26937 Near-Infrared Spectrometry as an Alternative Method for Determination of Oxidation Stability for Biodiesel
Authors: R. Velvarska, A. Vrablik, M. Fiedlerova, R. Cerny
Abstract:
Near-infrared spectrometry (NIR) was tested as a rapid alternative tool for determining biodiesel oxidation stability. The PetroOxy method is conventionally used for this determination, but it is hazardous due to the possibility of explosion and ignition of flammable fuels; its second disadvantage is that it is time-consuming. Near-infrared spectrometry served for the development of a calibration model composed of 133 real samples (calibration standards). The reference values of these standards were obtained by the PetroOxy method. Many chemometric diagnostics were used in the development of the final NIR model, with the aim of accurately predicting oxidation stability. The final NIR model was validated with 30 validation standards. Repeatability was determined as well, with an acceptable residual standard deviation (8.59%). NIR spectrometry has proved to be an accurate alternative method for determining biodiesel oxidation stability, with advantages such as time and cost savings, the non-destructive character of the analysis, and the possibility of online monitoring in safe mode.
Keywords: biodiesel, fatty acid methyl ester, NIR, oxidation stability
26936 A Six-Year Case Study Evaluating the Stakeholders’ Requirements and Satisfaction in Higher Educational Establishments
Authors: Ioannis I. Angeli
Abstract:
Worldwide, and mainly in the European Union, many standards, regulations, models and systems exist for the evaluation and identification of stakeholders’ requirements of individual universities and higher education (HE) in general. All these systems aim to measure or evaluate universities’ quality assurance systems and the services offered to the recipients of HE, mainly the students. Numerous surveys were conducted in the past, either by individual universities or by organized bodies, to identify students’ satisfaction or to evaluate to what extent these requirements are fulfilled. In this paper, the main results of an ongoing six-year joint research project are presented briefly. The research comprises an in-depth investigation of students’ satisfaction and students’ personal requirements, a gap analysis between these two parameters, and a comparison of different universities. Through this research, an attempt is made to address four very important questions in higher educational establishments (HEE): (1) Are there any common requirements, parameters, good practices or questions that apply to a large number of universities and will assure that students’ requirements are fulfilled? (2) To what extent do the individual programs of HEE fulfil the requirements of the stakeholders? (3) Are there any similarities in specific programs among European HEE? (4) To what extent is the knowledge acquired in a specific course program utilized in a specific country? For the execution of the research, internationally accepted questionnaires were used to evaluate to what extent students’ requirements and satisfaction were fulfilled in 2012 and five years later (2017). Samples of students and/or universities were taken from many European universities. The questionnaires used, the sampling method and methodology adopted, as well as the comparison tables and results, will be very valuable to any university that is willing to follow the same route and methodology or to compare the results with its own HEE. Apart from the unique methodology, valuable results emerge from the four case studies. There is a great difference between what students expect, or consider important, and what they actually get from their universities (on all parameters they get less). When there is a crisis or budget cut in HEE, there is a direct impact on students. There are many differences in the subjects taught across European universities.
Keywords: quality in higher education, students' requirements, education standards, student's survey, stakeholder's requirements, mechanical engineering courses
26935 Requirement Analysis for Emergency Management Software
Authors: Tomáš Ludík, Jiří Barta, Sabina Chytilová, Josef Navrátil
Abstract:
Emergency management is a discipline concerned with dealing with and avoiding risks. Appropriate emergency management software allows better management of these risks and has a direct influence on reducing potential negative impacts. Although there are several emergency management software products in the Czech Republic, they cover user requirements from the emergency management field only partially. Therefore, this paper focuses on the issues of requirement analysis within the development of emergency management software. An analysis of the current state describes the basic features and properties of user requirements for software development, as well as basic methods and approaches for gathering these requirements. The paper then presents more specific mechanisms for requirement analysis based on the chosen software development approach: structured, object-oriented or agile. Based on these experiences, a new methodology for requirement analysis is designed. The methodology describes how to map user requirements comprehensively in the field of emergency management and thus reduce misunderstanding between the software analyst and the emergency manager. The proposed methodology was consulted with a fire brigade department and has also been applied in the requirement analysis for their current emergency management software. The methodology has a general character and can also be used in other specific areas during requirement analysis.
Keywords: emergency software, methodology, requirement analysis, stakeholders, use case diagram, user stories
26934 Development and Validation of First Derivative Method and Artificial Neural Network for Simultaneous Spectrophotometric Determination of Two Closely Related Antioxidant Nutraceuticals in Their Binary Mixture
Authors: Mohamed Korany, Azza Gazy, Essam Khamis, Marwa Adel, Miranda Fawzy
Abstract:
Background: Two new, simple and specific methods were developed and validated in accordance with ICH guidelines: first, a zero-crossing first-derivative technique, and second, a chemometric-assisted spectrophotometric artificial neural network (ANN). Both methods were used for the simultaneous estimation of two closely related antioxidant nutraceuticals, Coenzyme Q10 (Q), also known as ubidecarenone or ubiquinone-10, and Vitamin E (E), alpha-tocopherol acetate, in their pharmaceutical binary mixture. Results: For the first method, by applying the first derivative, Q and E were determined alternately, each at the zero-crossing of the other. The D1 amplitudes of Q and E, at 285 nm and 235 nm respectively, were recorded and correlated to their concentrations. The calibration curves are linear over the concentration ranges of 10-60 and 5.6-70 μg mL-1 for Q and E, respectively. For the second method, an ANN (as a multivariate calibration method) was developed and applied for the simultaneous determination of both analytes. A training set (or concentration set) of 90 different synthetic mixtures containing Q and E, over wide concentration ranges of 0-100 µg/mL and 0-556 µg/mL respectively, was prepared in ethanol. The absorption spectra of the training set were recorded in the spectral region of 230-300 nm. A gradient-descent back-propagation ANN chemometric calibration was computed by relating the concentration set (x-block) to the corresponding absorption data (y-block). Another set of 45 synthetic mixtures of the two drugs, in a defined range, was used to validate the proposed network. Neither chemical separation, a preparation stage nor mathematical graphical treatment was required. Conclusions: The proposed methods were successfully applied to the assay of Q and E in laboratory-prepared mixtures and a combined pharmaceutical tablet, with excellent recoveries. The ANN method was superior to the derivative technique, as the former determined both drugs under the non-linear experimental conditions. It also offers rapidity, high accuracy, and savings in effort and cost; moreover, no specialist analyst is needed for its application. Although the ANN technique needed a large training set, it is the method of choice in the routine analysis of Q and E tablets. No interference was observed from common pharmaceutical additives. The results of the two methods were compared.
Keywords: coenzyme Q10, vitamin E, chemometry, quantitative analysis, first derivative spectrophotometry, artificial neural network
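A minimal sketch of such an ANN multivariate calibration is shown below, using scikit-learn's MLPRegressor (gradient-based back-propagation) on synthetic mixture spectra built from two overlapping hypothetical pure-component bands; all shapes, concentration ranges and network settings are illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the training set: binary mixtures whose absorbance
# is an additive combination of two overlapping hypothetical pure spectra.
rng = np.random.default_rng(3)
wavelengths = np.linspace(230, 300, 71)
spec_q = np.exp(-((wavelengths - 275) / 12) ** 2)
spec_e = np.exp(-((wavelengths - 250) / 15) ** 2)

conc = rng.uniform([0, 0], [100, 556], size=(90, 2))       # (Q, E) in ug/mL
absorbance = 0.01 * conc @ np.vstack([spec_q, spec_e])
absorbance += rng.normal(scale=0.002, size=absorbance.shape)
y = conc / np.array([100.0, 556.0])       # normalize targets for training

X_train, X_test, y_train, y_test = train_test_split(absorbance, y,
                                                    random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(X_train, y_train)                 # back-propagation training
print("R^2 on held-out mixtures:", ann.score(X_test, y_test))
```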
26933 Evaluating the Effect of Climate Change and Land Use/Cover Change on Catchment Hydrology of Gumara Watershed, Upper Blue Nile Basin, Ethiopia
Authors: Gashaw Gismu Chakilu
Abstract:
Climate and land cover change are very important issues in the global context, as are their responses to environmental and socio-economic drivers. The dynamics of these two factors are currently affecting the environment, including watershed hydrology, in an unbalanced way. In this paper, the individual and combined impacts of climate change and land use land cover change on hydrological processes were evaluated by applying the Soil and Water Assessment Tool (SWAT) model in the Gumara watershed, Upper Blue Nile basin, Ethiopia. Regional climate data (temperature and rainfall) for the past 40 years in the study area were prepared, and changes were detected by trend analysis using the Mann-Kendall test. The land use land cover data were obtained from Landsat images and processed with ERDAS IMAGINE 2010 software. Three land use land cover datasets (1973, 1986, and 2013) were prepared and used for the baseline, model calibration, and change study, respectively. The effects of these changes on the high flow and low flow of the catchment were also evaluated separately: the high flow was analyzed using an Annual Maximum (AM) model and the low flow using a seven-day sustained low-flow model. Both temperature and rainfall showed increasing trends; the extent of the changes was then evaluated on a monthly basis using two decadal periods, with 1973-1982 taken as the baseline and 2004-2013 used for the change study. The efficiency of the model was determined by the Nash-Sutcliffe (NS) coefficient and the relative volume error (RVe); their values were 0.65 and 0.032 for calibration and 0.62 and 0.0051 for validation, respectively. The impact of climate change on the stream flow of the catchment was higher than that of land use land cover change: the flow increased by 16.86% due to climate change and by 7.25% due to LULC change, while the combined change accounted for a 22.13% flow increment. The overall results indicate that climate change is more responsible for high flow than low flow; conversely, land use land cover change showed a more significant effect on low flow than on high flow. From the results, we conclude that the hydrology of the catchment has been altered by the changes in climate and land cover of the study area.
Keywords: climate, LULC, SWAT, Ethiopia
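The two efficiency criteria reported above are simple to compute; a minimal sketch follows (the RVe sign convention is an assumption, and the flow values are invented).

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

def relative_volume_error(observed, simulated):
    # RVe = (total simulated - total observed) / total observed
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return (simulated.sum() - observed.sum()) / observed.sum()

obs = [12.1, 15.4, 30.2, 22.8, 9.7]      # hypothetical monthly flows (m3/s)
sim = [11.5, 16.0, 28.4, 24.1, 10.3]
print(nash_sutcliffe(obs, sim), relative_volume_error(obs, sim))
```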
26932 Effects of Changes in LULC on Hydrological Response in Upper Indus Basin
Authors: Ahmad Ammar, Umar Khan Khattak, Muhammad Majid
Abstract:
Empirically based lumped hydrologic models have an extensive track record of use in watershed management and flood-related studies. This study focuses on the impact of LULC change over a 10-year period on watershed discharge, using the lumped model HEC-HMS. The Indus above Tarbela region acts as a source of the main flood events in the middle and lower portions of the Indus because of the amount of rainfall and the topographic setting of the region. The discharge pattern of the region is influenced by its associated LULC. In this study, Landsat TM images were used for the LULC analysis of the watershed. Daily TRMM satellite precipitation data were used as rainfall input. The input variables for model building in HEC-HMS were then calculated from the GIS data collected and pre-processed in HEC-GeoHMS. The SCS curve number (SCS-CN) was used as the loss model, the SCS unit hydrograph as the transform model, and Muskingum as the routing model. The years 2000 and 2010 were taken for discharge simulation. HEC-HMS was calibrated for the year 2000 and then validated for 2010. The performance of the model was assessed through the calibration and validation process, yielding R2 = 0.92 for both calibration and validation. The relative bias was -9% for 2000 and -14% for 2010. The results show that over the 10 years, the impact of LULC change on discharge has been negligible in the study area overall. One reason is that the proportion of built-up area in the watershed, which is the main causative factor of change in discharge, is less than 1% of the total area. However, locally, the impact of development was found to be significant in the built-up area of Mansehra city. The analysis was repeated for the Mansehra city sub-watershed, which has an area of about 16 km2 and had more than 13% built-up area in 2010. The results showed that with a 40% increase in built-up area in the city from 2000 to 2010, the discharge values increased by about 33 percent, indicating the impact of LULC change on discharge.
Keywords: LULC change, HEC-HMS, Indus Above Tarbela, SCS-CN
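The SCS curve number loss model reduces to one closed-form equation, sketched below; it also makes the built-up-area finding intuitive, since urbanization raises CN and hence the runoff produced by the same storm.

```python
# SCS Curve Number runoff: Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0,
# with Ia = 0.2 * S and S = 25400 / CN - 254 (P, Q, S in mm).
def scs_cn_runoff(rainfall_mm, curve_number):
    s = 25400.0 / curve_number - 254.0   # potential maximum retention
    ia = 0.2 * s                         # initial abstraction
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Higher CN (more built-up area) yields more runoff from the same storm:
for cn in (70, 85, 95):
    print(cn, round(scs_cn_runoff(50.0, cn), 1), "mm")
```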
26931 Digital Geomatics Trends for Production and Updating Topographic Map by Using Digital Generalization Procedures
Authors: O. Z. Jasim
Abstract:
An accurate digital map must satisfy two main user requirements: first, the map must be visually readable, and second, all map elements must be well represented. These two requirements hold especially true for map generalization, which aims at simplifying the representation of cartographic data. Maps at different scales are very important for decision-making, from master plans to the full range of infrastructure maps in civil engineering. A cartographer cannot simply project the data onto a piece of paper; its readability must also be considered. The map layout of any geodatabase is very important, as this layout helps the user read, analyze and extract information from the map. Many principles and guidelines for generalization can be found in the cartographic literature. A manual reduction method for generalization depends on the experience of the map maker and therefore produces inconsistent results. Digital generalization, rooted in conventional cartography, has become an increasing concern in both the Geographic Information System (GIS) and mapping fields. This project is intended to review the state of the art of the new technology, to help in understanding the needs and plans for the implementation of digital generalization capability, and to increase knowledge of topographic map production.
Keywords: cartography, digital generalization, mapping, GIS
26930 An Analysis of Privacy and Security for Internet of Things Applications
Authors: Dhananjay Singh, M. Abdullah-Al-Wadud
Abstract:
The Internet of Things is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as the things in the IoT: those which contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. To address the security requirements, challenges and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for message authentication before propagation into IoT networks.
Keywords: Internet of Things (IoT), message authentication, privacy, security
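A minimal sketch of such symmetric message authentication, using Python's standard-library HMAC, is shown below. Key distribution and replay protection are out of scope, and the payload is illustrative; the abstract does not specify which primitive its mechanism uses.

```python
import hmac
import hashlib
import os

# Sender appends an HMAC-SHA256 tag; receiver recomputes it and compares in
# constant time. The pre-shared key would be provisioned on the device.
key = os.urandom(32)

def authenticate(message):
    tag = hmac.new(key, message, hashlib.sha256).digest()
    return message + tag

def verify(packet):
    message, tag = packet[:-32], packet[-32:]
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return message if hmac.compare_digest(tag, expected) else None

packet = authenticate(b'{"sensor": "temp", "value": 21.5}')
print(verify(packet))   # the original message if the packet is untampered
```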
26929 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison
Authors: Xiangtuo Chen, Paul-Henry Cournéde
Abstract:
Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict corn yield from meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their modeling methodologies. The model-driven approaches are based on crop mechanistic modeling; they describe crop growth in interaction with the environment as dynamical systems. However, the calibration of such dynamical systems is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process, but it imposes strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso regression, principal components regression and partial least squares regression) and machine learning methods (random forest, k-nearest neighbor, artificial neural network and SVM regression). The dataset consists of 720 records of corn yield at county scale, provided by the United States Department of Agriculture (USDA), and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, random forest is the most robust and generally achieves the best prediction error (MAEP 4.27%); it also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to highlight the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest
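The data-driven baseline is easy to reproduce in outline: random forest under 5-fold cross-validation with the two reported error metrics. The features and yields below are synthetic stand-ins for the USDA records, and expressing MAEP relative to mean yield is an assumption about the metric's definition.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for county-level climate features and corn yields.
rng = np.random.default_rng(4)
X = rng.normal(size=(720, 12))
y = 80 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=3, size=720)

# Out-of-fold predictions from 5-fold cross-validation.
pred = cross_val_predict(
    RandomForestRegressor(n_estimators=300, random_state=0), X, y, cv=5
)
rmsep = np.sqrt(np.mean((y - pred) ** 2))
maep = np.mean(np.abs(y - pred)) / np.mean(y) * 100  # % of mean yield
print(f"RMSEP = {rmsep:.2f}, MAEP = {maep:.2f}%")
```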
26928 Web Service Architectural Style Selection in Multi-Criteria Requirements
Authors: Ahmad Mohsin, Syda Fatima, Falak Nawaz, Aman Ullah Khan
Abstract:
Selection of an appropriate architectural style is vital to the success of the target web service under development. The nature of architecture design and selection for service-oriented computing applications is quite different from that of traditional software: web services have complex and rigorous architectural styles to choose from. Because of this, selecting an appropriate architectural style for web service development has become a complex decision for architects. Architectural style selection is a multi-criteria decision that demands considerable experience in service-oriented computing. Decision support systems (DSS) are good solutions for simplifying the selection of a particular architectural style. Our research suggests a new DSS-based approach for selecting architectural styles while developing a web service, catering for functional requirements (FRs) and non-functional requirements (NFRs). The proposed DSS helps architects select the right web service architectural pattern according to the domain and non-functional requirements. In this paper, a rule-based DSS has been developed using CLIPS (C Language Integrated Production System) to support decisions based on multi-criteria requirements. The DSS takes as input architectural characteristics, domain requirements, and the software architect's preferences for NFRs, covering the architectural styles in use today in service-oriented computing. The weighted sum model is applied to prioritize quality attributes and domain requirements, and scores are calculated across multiple criteria to choose the final architectural style.
Keywords: software architecture, web-service, rule-based, DSS, multi-criteria requirements, quality attributes
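The weighted sum model at the core of the DSS is straightforward; below is a minimal sketch with invented styles, criteria and weights (the actual rule base and preference values live in the CLIPS system described above).

```python
# Criterion weights expressing the architect's NFR priorities (illustrative).
criteria_weights = {"scalability": 0.4, "security": 0.35, "latency": 0.25}

# Normalized per-criterion scores (0-1) for each candidate style.
style_scores = {
    "REST":        {"scalability": 0.9, "security": 0.6, "latency": 0.8},
    "SOAP/WS-*":   {"scalability": 0.6, "security": 0.9, "latency": 0.5},
    "Message bus": {"scalability": 0.8, "security": 0.7, "latency": 0.6},
}

def weighted_sum(scores):
    return sum(criteria_weights[c] * v for c, v in scores.items())

ranking = sorted(style_scores, key=lambda s: weighted_sum(style_scores[s]),
                 reverse=True)
for style in ranking:
    print(f"{style}: {weighted_sum(style_scores[style]):.3f}")
```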
26927 Emotional Artificial Intelligence and the Right to Privacy
Authors: Emine Akar
Abstract:
The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store “big data”, not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term “emotional artificial intelligence”, why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysis of such data but, concerningly, to exponential growth in the collection of such data. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.
Keywords: AI, privacy law, data protection, big data
26926 The Egyptian eGovernment Journey
Authors: Ali Abdelsattar Elshabrawy
Abstract:
The Egyptian government is struggling to build its eGovernment project. It has succeeded in building the Egyptian digital portal, which contains links to a number of services provided by different ministries. Achieving such success depends on requirements such as internet dissemination, IT literacy, a clear strategy, and the phasing out of paper-based services. This paper clarifies the main obstacles to the Egyptian eGovernment project from both the supply and demand sides, and it also identifies the most critical requirements in this phase of the project lifecycle. The paper should be of great value for the project team and also for many other developing countries that share the same obstacles.
Keywords: the Egyptian eGovernment project lifecycle, supply side barriers, demand side barriers, eGovernment project requirements
26925 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping), e.g. using ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access datasets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate visual odometry and/or SLAM approaches on data captured with the device for which the algorithm is targeted, for example a mobile phone, and to disseminate the data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on the Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
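On the evaluation side, ATE is typically computed as the RMSE of translational differences after rigid alignment of the estimated trajectory to the ground truth. The sketch below uses the closed-form Umeyama/Horn alignment on synthetic 3D positions and assumes timestamps are already associated.

```python
import numpy as np

def align_umeyama(gt, est):
    """Return rotation R and translation t minimizing ||gt - (R @ est + t)||."""
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, mu_g - R @ mu_e

def ate_rmse(gt, est):
    R, t = align_umeyama(gt, est)
    aligned = est @ R.T + t
    return np.sqrt(np.mean(np.sum((gt - aligned) ** 2, axis=1)))

# Synthetic ground-truth trajectory and a noisy estimate of it.
gt = np.cumsum(np.random.default_rng(5).normal(size=(100, 3)), axis=0)
est = gt + np.random.default_rng(6).normal(scale=0.05, size=gt.shape)
print("ATE RMSE:", ate_rmse(gt, est))
```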
26924 Provision Electronic Management Requirements in Libyan Oil Companies
Authors: Hitham Yami
Abstract:
This study focuses primarily on assessing the availability of the requirements for electronic management in Libyan oil companies. The main objectives of the research are to apply electronic management and to make recommendations and define steps for approaching it. There is limited research and statistical analysis supporting electronic management in Libyan companies. The groundwork for the proposed approach is to develop the independent and dependent variables, to be restructured after the field study and data collection, in order to achieve the desired results and solve the problem faced by the Libyan Oil Corporation. All these strategies are proposed to achieve the goal and to address the problems of Libyan oil installations.
Keywords: oil company's revenue, independent variables, electronic management, Libyan Oil Corporation
26923 University Building: Discussion about the Effect of Numerical Modelling Assumptions for Occupant Behavior
Authors: Fabrizio Ascione, Martina Borrelli, Rosa Francesca De Masi, Silvia Ruggiero, Giuseppe Peter Vanoli
Abstract:
The refurbishment of public buildings is one of the key factors of the energy efficiency policies of European states. Educational buildings account for the largest share of the oldest building stock, with interesting potential for demonstrating best practice with regard to high-performance, low- and zero-carbon design, and for becoming exemplary cases within the community. In this context, this paper discusses the critical issue of the energy refurbishment of a university building in the heating-dominated climate of South Italy. More in detail, the importance of using validated models is examined exhaustively through an analysis of the uncertainties due to modelling assumptions, mainly concerning the adoption of stochastic schedules for occupant behavior and equipment or lighting usage. Indeed, today most commercial tools provide designers with a library of predefined schedules with which thermal zones can be described. Very often, users do not pay close attention to diversifying thermal zones or to modifying or adapting the predefined profiles, and the results of the design are affected positively or negatively without any warning. Data such as occupancy schedules, internal loads and the interaction between people and windows or plant systems represent some of the largest sources of variability in energy modelling and in interpreting calibration results. This is mainly due to the adoption of discrete, standardized and conventional schedules, with important consequences for the prediction of energy consumption. The problem is certainly difficult to examine and to solve. In this paper, a sensitivity analysis is presented to understand the order of magnitude of the error committed by varying the deterministic schedules used for occupancy, internal loads, and the lighting system. This is a typical uncertainty for a case study such as the one presented, where there is no regulation system for the HVAC system and thus occupants cannot interact with it. More in detail, starting from the adopted schedules, created according to questionnaire responses, which allowed a good calibration of the energy simulation model, several different scenarios are tested. Two types of analysis are presented: the reference building is compared with these scenarios in terms of the percentage difference in the projected total electric energy need and natural gas request; then the different consumption entries are analyzed and, for the most interesting cases, the calibration indexes are also compared. Moreover, the same simulations are performed for the optimal refurbishment solution, and the variation in the predicted energy savings and global cost reduction is shown. This parametric study underlines the effect of the modelling assumptions made when describing thermal zones on the evaluation of performance indexes.
Keywords: energy simulation, modelling calibration, occupant behavior, university building
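The calibration indexes mentioned above can be computed as follows; the sketch uses the mean bias error and CV(RMSE) commonly adopted for building energy model calibration (e.g. in ASHRAE Guideline 14), which is an assumption about the exact indexes used here, and the monthly values are invented.

```python
import numpy as np

def mbe_percent(measured, simulated):
    """Mean bias error, as a percentage of total measured consumption."""
    measured, simulated = np.asarray(measured), np.asarray(simulated)
    return 100 * np.sum(measured - simulated) / np.sum(measured)

def cv_rmse_percent(measured, simulated):
    """Coefficient of variation of the RMSE, in percent."""
    measured, simulated = np.asarray(measured), np.asarray(simulated)
    rmse = np.sqrt(np.mean((measured - simulated) ** 2))
    return 100 * rmse / np.mean(measured)

measured  = [42.0, 39.5, 35.1, 28.7, 21.3, 18.9]   # hypothetical monthly MWh
simulated = [40.8, 41.0, 33.9, 29.5, 22.6, 18.1]
print(mbe_percent(measured, simulated), cv_rmse_percent(measured, simulated))
```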
26922 Early Requirement Engineering for Design of Learner Centric Dynamic LMS
Authors: Kausik Halder, Nabendu Chaki, Ranjan Dasgupta
Abstract:
We present a modelling framework that supports the engineering of early requirements specifications for the design of a learner-centric dynamic Learning Management System. The framework is based on the i* modelling tool and means-end analysis, which adopt primitive concepts for modelling early requirements (such as actor, goal, and strategic dependency). We show how pedagogical and computational requirements for designing a learner-centric Learning Management System can be adapted for automatic early requirement engineering specifications. Finally, we present a model of a Learner Quanta based adaptive courseware. Our early requirement analysis shows how means-end analysis reveals gaps and inconsistencies in early requirements specifications that are by no means trivial to discover without the help of a formal analysis tool.
Keywords: adaptive courseware, early requirement engineering, means end analysis, organizational modelling, requirement modelling
26921 Adjustment and Compensation Techniques for the Rotary Axes of Five-axis CNC Machine Tools
Authors: Tung-Hui Hsu, Wen-Yuh Jywe
Abstract:
Five-axis computer numerical control (CNC) machine tools (three linear and two rotary axes) are ideally suited to the fabrication of complex workpieces, such as dies, turbo blades, and cams. The locations of the axis average line and centerline of the rotary axes strongly influence the performance of these machines; however, techniques to compensate for eccentric error in the rotary axes remain weak. This paper proposes optical (Non-Bar) techniques capable of calibrating five-axis CNC machine tools and compensating for eccentric error in the rotary axes. The approach employs the measurement path in ISO/CD 10791-6 to determine the eccentric error in the two rotary axes, for which compensatory measures can then be implemented. Experimental results demonstrate that the proposed techniques can improve the performance of various five-axis CNC machine tools by more than 90%. Finally, the result of a cutting test using a B-type five-axis CNC machine tool confirmed the usefulness of the proposed compensation technique.
Keywords: calibration, compensation, rotary axis, five-axis computer numerical control (CNC) machine tools, eccentric error, optical calibration system, ISO/CD 10791-6
26920 Conducting Quality Planning, Assurance and Control According to GMP (Good Manufacturing Practices) Standards and Benchmarking Data for Kuwait Food Industries
Authors: Alaa Alateeqi, Sara Aldhulaiee, Sara Alibraheem, Noura Alsaleh
Abstract:
Over the past few decades, Kuwait's local food industry has grown remarkably due to the increased demand for processed or semi-processed food products in the market. It is important that the ever-increasing number of food manufacturing/processing units maintain the required quality standards as per regional and, to some extent, international quality requirements. All Kuwaiti food manufacturing units should understand and follow international standard practices, and a set of guidelines must be established for quality assurance so that any new business in this area is aware of the minimum requirements. The current study was undertaken to identify the gaps in Kuwait food industries in following Good Manufacturing Practices (GMP) in terms of quality planning, control and quality assurance. GMP refers to Good Manufacturing Practices, a set of rules, laws or regulations certifying that products are produced within quality standards and ensuring that they are safe, pure and effective. The present study therefore reports a case study in a reputed food manufacturing unit in Kuwait, starting from an assessment of current practices, followed by a diagnosis, a diagnostic report, and a road map with corrective measures for GMP implementation in the unit. The case study has also identified best practices and established benchmarking data that other companies can follow, by measuring the selected company's quality, policies, products and strategies and comparing them with the established benchmarks. A set of questionnaires and an assessment mechanism have been established for companies to identify their ‘benchmarking score’ in relation to the number of non-conformities and conformities with the GMP standard requirements.
Keywords: good manufacturing practices, GMP, benchmarking, Kuwait Food Industries, food quality
26919 Emerging Technology for 6G Networks
Authors: Yaseein S. Hussein, Victor P. Gil Jiménez, Abdulmajeed Al-Jumaily
Abstract:
Due to the rapid advancement of technology, there is an increasing demand for wireless connections that are both fast and reliable, with minimal latency. New wireless communication standards are developed every decade, and the year 2030 is expected to see the introduction of 6G. The primary objectives of 6G network and terminal designs are focused on sustainability and environmental friendliness. The International Telecommunication Union Radiocommunication Sector (ITU-R) has established the minimum requirements for 6G, with peak and user data rates of 1 Tbps and 10-100 Gbps, respectively. In this context, Light Fidelity (Li-Fi) technology is the most promising candidate to meet these requirements. This article explores the various advantages, features, and potential applications of Li-Fi technology and compares it with 5G networking, to showcase its potential impact among the other emerging technologies that aim to enable 6G networks.
Keywords: 6G networks, artificial intelligence (AI), Li-Fi technology, Terahertz (THz) communication, visible light communication (VLC)
26918 Challenges in Anti-Counterfeiting of Cyber-Physical Systems
Authors: Daniel Kliewe, Arno Kühn, Roman Dumitrescu, Jürgen Gausemeier
Abstract:
This paper examines system protection for cyber-physical systems (CPS). CPS are particularly characterized by their networked system components, which means they are able to adapt to the needs of their users and their environment. With this ability, CPS have new, specific requirements for protection against counterfeiting, know-how loss and manipulation. They increase the requirements on system protection because piracy attacks can be more diverse, for example because of the increasing number of interfaces or through the networking abilities. The new requirements were identified and, in a next step, matched with existing protective measures. Because of the gap found, the development of new protection measures has to be pushed forward to close it. Moreover, a comparison of the effectiveness of selected measures was carried out, and the first results are presented in the paper.
Keywords: anti-counterfeiting, cyber physical systems, intellectual property (IP), knowledge management, system protection
26917 Language and Study Skill Needs: A Case Study of ESP Learners at the Language Centre of Sultan Qaboos University, Oman
Authors: Ahmed Mohamed Al-Abdali
Abstract:
Providing English for Specific Purposes (ESP) courses that are closely geared to learners' needs and requirements in their fields of study undoubtedly enhances learners' interest and success in a highly academic environment. While needs analysis is crucial to the success of ESP courses, it has not received sufficient attention from researchers in the Arab world. Oman is no exception among the Arab countries, as this is reflected in ESP practices in the Omani higher education context. This presentation, however, discusses the perceptions of the Language Centre (LC) students at Sultan Qaboos University (SQU), Oman, in relation to the requirements of their science colleges. The discussion is based on a mixed-methods study, which included semi-structured interviews, questionnaires and document analyses. These mixed methods allowed closer investigation of the participants' views, backgrounds and experiences. It is hoped that the findings of this study will be used to recommend changes to the ESP curriculum at the LC of SQU so that it better meets the needs of its students and the requirements of the science colleges.
Keywords: curriculum, ESP, ELT, needs analysis, college requirements