Search results for: Verification and Validation (V&V)
1046 Comparison of Classical Computer Vision vs. Convolutional Neural Networks Approaches for Weed Mapping in Aerial Images
Authors: Paulo Cesar Pereira Junior, Alexandre Monteiro, Rafael da Luz Ribeiro, Antonio Carlos Sobieranski, Aldo von Wangenheim
Abstract:
In this paper, we present a comparison between convolutional neural networks and classical computer vision approaches for the specific precision agriculture problem of weed mapping in aerial images of sugarcane fields. A systematic literature review was conducted to find which computer vision methods are being used for this specific problem. The most cited methods were implemented, as well as four models of convolutional neural networks. All implemented approaches were tested using the same dataset, and their results were quantitatively and qualitatively analyzed. The obtained results were compared against a ground truth produced by a human expert for validation. The results indicate that the convolutional neural networks achieve better precision and generalize better than the classical models.
Keywords: convolutional neural networks, deep learning, digital image processing, precision agriculture, semantic segmentation, unmanned aerial vehicles
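As a minimal illustration of the kind of quantitative comparison against an expert-made ground truth described above, the sketch below computes pixel-wise precision and intersection-over-union for a predicted weed mask; the masks and values are synthetic placeholders, not the study's data.

import numpy as np

def precision_and_iou(pred: np.ndarray, truth: np.ndarray):
    # both inputs are boolean masks: True = weed pixel, False = background
    tp = np.logical_and(pred, truth).sum()    # correctly mapped weed pixels
    fp = np.logical_and(pred, ~truth).sum()   # false alarms
    fn = np.logical_and(~pred, truth).sum()   # missed weed pixels
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
    return precision, iou

pred = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)   # predicted weed mask
truth = np.array([[1, 0, 0], [0, 1, 1]], dtype=bool)  # expert ground truth
print(precision_and_iou(pred, truth))                 # (0.667, 0.5)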
Procedia PDF Downloads 261
1045 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria
Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova
Abstract:
Ambient air pollution with fine particulate matter (PM10) is a systematic, permanent problem in many countries around the world. The accumulation of a large number of measurements of both the PM10 concentrations and the accompanying atmospheric factors allows for their statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria, for a period of 5 years are used. Predictors in the models are seven meteorological variables and time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days ahead of the last date in the modeling procedure and show very accurate results.
Keywords: cross-validation, decision tree, lagged variables, short-term forecasting
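A minimal sketch of a CART model with lagged predictors of the kind described above; the synthetic columns and lag choices are illustrative assumptions, not the paper's exact variable set.

import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 5 * 365                                     # five years of daily data
df = pd.DataFrame({
    "pm10": rng.gamma(3.0, 15.0, n),            # synthetic daily PM10 (ug/m3)
    "temperature": rng.normal(12, 8, n),
    "humidity": rng.uniform(40, 95, n),
    "wind_speed": rng.gamma(2.0, 1.5, n),
})
df["pm10_lag1"] = df["pm10"].shift(1)           # PM10 delayed by 1 day
df["pm10_lag2"] = df["pm10"].shift(2)           # PM10 delayed by 2 days
df["temp_lag1"] = df["temperature"].shift(1)    # lagged meteorological predictor
df = df.dropna()

features = ["temperature", "humidity", "wind_speed", "pm10_lag1", "pm10_lag2", "temp_lag1"]
cart = DecisionTreeRegressor(max_depth=6, min_samples_leaf=10).fit(df[features], df["pm10"])

# feature importances indicate the degree of influence of each predictor
print(dict(zip(features, cart.feature_importances_.round(3))))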
Procedia PDF Downloads 196
1044 Scoring System for the Prognosis of Sepsis Patients in Intensive Care Units
Authors: Javier E. García-Gallo, Nelson J. Fonseca-Ruiz, John F. Duitama-Munoz
Abstract:
Sepsis is a syndrome that occurs with physiological and biochemical abnormalities induced by severe infection and carries high mortality and morbidity; therefore, the severity of the patient's condition must be interpreted quickly. After patient admission to an intensive care unit (ICU), it is necessary to synthesize the large volume of information that is collected from patients into a value that represents the severity of their condition. Traditional severity-of-illness scores seek to be applicable to all patient populations and usually assess in-hospital mortality. However, the use of machine learning techniques and the data of a population that shares a common characteristic could lead to the development of customized mortality prediction scores with better performance. This study presents the development of a score for the one-year mortality prediction of patients who are admitted to an ICU with a sepsis diagnosis. 5650 ICU admissions extracted from the MIMIC-III database were evaluated, divided into two groups: 70% to develop the score and 30% to validate it. Comorbidities, demographics and clinical information from the first 24 hours after ICU admission were used to develop the mortality prediction score. LASSO (least absolute shrinkage and selection operator) and SGB (stochastic gradient boosting) variable importance methodologies were used to select the set of variables that make up the developed score; each of these variables was dichotomized, and a cut-off point that divides the population into two groups with different mean mortalities was found; if the patient is in the group that presents the higher mortality, a one is assigned to the particular variable, otherwise a zero is assigned. These binary variables are used in a logistic regression (LR) model, and its coefficients were rounded to the nearest integer. The resulting integers are the point values that make up the score when multiplied by each binary variable and summed. The one-year mortality probability was estimated using the score as the only variable in an LR model. The predictive power of the score was evaluated using the 1695 admissions of the validation subset, obtaining an area under the receiver operating characteristic curve of 0.7528, which outperforms the results obtained with the Sequential Organ Failure Assessment (SOFA), Oxford Acute Severity of Illness Score (OASIS) and Simplified Acute Physiology Score II (SAPS II) scores on the same validation subset. Observed and predicted mortality rates within estimated probability deciles were compared graphically and found to be similar, indicating that the risk estimate obtained with the score is close to the observed mortality; it is also observed that the number of events (deaths) indeed increases as the outcome goes from the decile with the lowest probabilities to the decile with the highest probabilities. Sepsis is a syndrome that carries high mortality, 43.3% for the patients included in this study; therefore, tools that help clinicians to quickly and accurately predict a worse prognosis are needed. This work demonstrates the importance of customization of mortality prediction scores, since the developed score provides better performance than traditional scoring systems.
Keywords: intensive care, logistic regression model, mortality prediction, sepsis, severity of illness, stochastic gradient boosting
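A minimal sketch of the score construction described above: dichotomize the selected variables at cut-off points, fit a logistic regression on the binary indicators, round the coefficients to integer point values, then map the summed score to a probability with a second logistic regression. The variables, cut-offs, and data are synthetic placeholders, not the study's LASSO/SGB selections.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
age = rng.normal(65, 15, n)                    # hypothetical first-24-hour variables
lactate = rng.gamma(2.0, 1.2, n)
urine_output = rng.normal(900, 300, n)
y = (rng.random(n) < 0.4).astype(int)          # synthetic one-year mortality outcome

def dichotomize(x, cutoff, high_risk_above=True):
    # 1 if the patient falls in the higher-mortality group, 0 otherwise
    return (x > cutoff).astype(int) if high_risk_above else (x <= cutoff).astype(int)

X_bin = np.column_stack([
    dichotomize(age, 65),                      # hypothetical cut-off points
    dichotomize(lactate, 2.0),
    dichotomize(urine_output, 500, high_risk_above=False),
])

lr = LogisticRegression().fit(X_bin, y)
points = np.rint(lr.coef_[0]).astype(int)      # round coefficients to nearest integer
score = X_bin @ points                         # integer score per admission

risk_model = LogisticRegression().fit(score.reshape(-1, 1), y)
p_death = risk_model.predict_proba(score.reshape(-1, 1))[:, 1]
print(points, p_death[:5].round(3))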
Procedia PDF Downloads 224
1043 Design and Implementation of a Geodatabase and WebGIS
Authors: Sajid Ali, Dietrich Schröder
Abstract:
The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have provided easy accessing and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft - EKG) lacks a single platform for easy, multi-user access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and of a Web GIS using OpenGeo Suite for the fast sharing and distribution of the data over the internet. The characteristics of the required design for the geodatabase have been studied, and a specific methodology is given for the purpose of designing the Web GIS. At the end, validation of this Web-based geodatabase has been performed with two desktop GIS software packages and a web map application, and it is also discussed that the contribution has all the desired modules to expedite further research in the area as per the requirements.
Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application
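A minimal sketch of setting up a spatial table in such a PostgreSQL/PostGIS geodatabase from Python; the connection parameters, table name, and columns are illustrative assumptions, not the EKG schema.

import psycopg2

conn = psycopg2.connect(host="localhost", dbname="ekg_geodb", user="gis", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS postgis;")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS research_sites (
            id    SERIAL PRIMARY KEY,
            name  TEXT NOT NULL,
            geom  geometry(Point, 4326)          -- WGS84 point geometry
        );
    """)
    cur.execute("CREATE INDEX IF NOT EXISTS research_sites_geom_idx "
                "ON research_sites USING GIST (geom);")
    cur.execute("INSERT INTO research_sites (name, geom) "
                "VALUES (%s, ST_SetSRID(ST_MakePoint(%s, %s), 4326));",
                ("Sample site", -61.39, 15.30))
conn.close()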
Procedia PDF Downloads 341
1042 'Low Electronic Noise' Detector Technology in Computed Tomography
Authors: A. Ikhlef
Abstract:
Image noise in computed tomography is mainly caused by statistical noise, system noise, and the reconstruction algorithm filters. In the last few years, low-dose x-ray imaging has become more and more desired and is regarded as a technically differentiating technology among CT manufacturers. In order to achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low and ultra-low dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. Also, in this study, we will show that we can achieve this goal using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols with radiation as low as 10 mAs (<< 1 mGy). Clinical validation of such results has been performed as well.
Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector
Procedia PDF Downloads 127
1041 Investigation of Martensitic Transformation Zone at the Crack Tip of NiTi under Mode-I Loading Using Microscopic Image Correlation
Authors: Nima Shafaghi, Gunay Anlaş, C. Can Aydiner
Abstract:
A realistic understanding of the martensitic phase transition under complex stress states is key for accurately describing the mechanical behavior of shape memory alloys (SMAs). Particularly regarding the sharply changing stress fields at the tip of a crack, the size, nature, and shape of the transformed zones are of great interest. There is significant variation among analytical models in their predictions of the size and shape of the transformation zone. As the fully transformed region remains inside a very small boundary at the tip of the crack, experimental validation requires microscopic resolution. Here, the crack tip vicinity of a NiTi compact tension specimen has been monitored in situ with microscopic image correlation at 20x magnification. With nominally 15 micrometer grains and 0.2 micrometer per pixel optical resolution, the strains at the crack tip are mapped with intra-grain detail. The transformation regions are then deduced using an equivalent strain formulation.
Keywords: digital image correlation, fracture, martensitic phase transition, mode I, NiTi, transformation zone
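The abstract does not spell out the equivalent strain measure; a common reference form, which may differ from the formulation actually used in the study, is the von Mises equivalent strain built from the deviatoric strain tensor,

\[ \varepsilon_{\mathrm{eq}} = \sqrt{\tfrac{2}{3}\, e_{ij}\, e_{ij}}, \qquad e_{ij} = \varepsilon_{ij} - \tfrac{1}{3}\,\varepsilon_{kk}\,\delta_{ij}, \]

with the in-plane components obtained directly from the DIC displacement fields.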
Procedia PDF Downloads 354
1040 A Qualitative Study Examining the Process of EFL Course Design from the Perspectives of Teachers
Authors: Iman Al Khalidi
Abstract:
Recently, English has become the language of globalization and technology. In turn, this has resulted in a seemingly bewildering array of influences and trends in the domain of TESOL curriculum. In light of these changes, higher education has to provide a new and more powerful kind of education. It should prepare students to be more engaged citizens, more capable of solving complex problems at work, and well prepared to lead a meaningful life. In response to this, universities, colleges, schools, and departments have to work in light of the requirements and challenges of the global and technological era. Consequently, they have to focus on the adoption of a contemporary curriculum which goes in line with the pedagogical shift from a teaching-centered approach to a learning-centered approach. Ideally, there has been noticeable emphasis on the crucial importance of developing and professionalizing teachers in order to engage them in the process of curriculum development and action research. This is a qualitative study that aims at understanding and exploring the process of designing EFL courses by teachers at the tertiary level from the perspectives of the participants in a professional context in TESOL, the Department of English at a private college in Oman. It is a case study that stands on the philosophy of the qualitative approach. It employs multiple methods for collecting qualitative data: semi-structured interviews with teachers, focus group discussions with students, and document analysis. The collected data have been analyzed qualitatively by adopting Miles and Huberman's approach, using procedures of reduction, coding, displaying, and conclusion drawing and verification.
Keywords: course design, components of course design, case study, data analysis
Procedia PDF Downloads 546
1039 A Qualitative Study Examining the Process of Course Design from the Perspectives of Teachers
Authors: Iman Al Khalidi
Abstract:
Recently, English has become the language of globalization and technology. In turn, this has resulted in a seemingly bewildering array of influences and trends in the domain of TESOL curriculum. In light of these changes, higher education has to provide a new and more powerful kind of education. It should prepare students to be more engaged citizens, more capable of solving complex problems at work, and well prepared to lead a meaningful life. In response to this, universities, colleges, schools, and departments have to work in light of the requirements and challenges of the global and technological era. Consequently, they have to focus on the adoption of a contemporary curriculum which goes in line with the pedagogical shift from a teaching-centered approach to a learning-centered approach. Ideally, there has been noticeable emphasis on the crucial importance of developing and professionalizing teachers in order to engage them in the process of curriculum development and action research. This is a qualitative study that aims at understanding and exploring the process of designing EFL courses by teachers at the tertiary level from the perspectives of the participants in a professional context in TESOL, the Department of English at a private college in Oman. It is a case study that stands on the philosophy of the qualitative approach. It employs multiple methods for collecting qualitative data: semi-structured interviews with teachers, focus group discussions with students, and document analysis. The collected data have been analyzed qualitatively by adopting Miles and Huberman's approach, using procedures of reduction, coding, displaying, and conclusion drawing and verification.
Keywords: course design, components of course design, case study, data analysis
Procedia PDF Downloads 442
1038 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator
Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula
Abstract:
A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we were facing was the incompatibility of GCC compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads (dense or sparse) and debugging the failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional model-based scoreboard, or software model, using SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to. The model gives a lot of visibility and debug capability in the DUT, bringing up micro-steps of execution.
Keywords: ISA, neural network, Brain Float-16, DUT
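A minimal sketch of one way such a native BF-16 generator can work outside the compiler: IEEE float32 values are converted to bfloat16 bit patterns with round-to-nearest-even on the 16 truncated mantissa bits. This is an illustrative stand-in, not the IP's actual implementation; special cases such as NaN and infinity are ignored.

import numpy as np

def float32_to_bf16_bits(x) -> int:
    bits = int(np.frombuffer(np.float32(x).tobytes(), dtype=np.uint32)[0])
    # round-to-nearest-even on the 16 bits that bfloat16 drops
    bits = (bits + 0x7FFF + ((bits >> 16) & 1)) >> 16
    return bits & 0xFFFF

def bf16_bits_to_float32(b: int) -> float:
    return float(np.frombuffer(np.uint32(b << 16).tobytes(), dtype=np.float32)[0])

x = 3.1415926
b = float32_to_bf16_bits(x)
print(hex(b), bf16_bits_to_float32(b))   # BF-16 keeps 8 exponent bits and 7 mantissa bits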
Procedia PDF Downloads 95
1037 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State
Authors: Hamisu Idi
Abstract:
The present study focuses on the impact of assignable causes of variation on the quality of cement production. Exploratory research was done on a monthly basis, where data were obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all the records of the mills' downtime, which the process manager checks for validation and refers the fault (if any) to the department responsible for maintenance or measurement taking, so as to prevent future occurrence. The findings indicated that the products of Ashaka Cement Plc. were of acceptable quality, since all the production processes were found to be in control (within preset specifications), with the exception of the natural causes of variation, which are normal in the production process and do not affect the outcome of the product. They are reduced to the barest minimum, since they cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and the process manager in particular, at various levels in the monitoring and implementation of statistical process control. This study therefore contributes to knowledge in this regard, and it is hoped that it will open more research in that direction.
Keywords: cement, quality, variation, assignable cause, common cause
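A minimal sketch of separating common-cause from assignable-cause variation with an individuals control chart and plus/minus 3-sigma limits; the measurements below are synthetic, not the plant's records.

import numpy as np

rng = np.random.default_rng(1)
measurements = rng.normal(loc=42.5, scale=0.8, size=60)   # e.g., daily mill output values

center = measurements.mean()
mean_moving_range = np.abs(np.diff(measurements)).mean()
sigma_est = mean_moving_range / 1.128                      # d2 constant for subgroup size 2
ucl, lcl = center + 3 * sigma_est, center - 3 * sigma_est

out_of_control = np.where((measurements > ucl) | (measurements < lcl))[0]
print(f"UCL={ucl:.2f}, LCL={lcl:.2f}, out-of-control points: {out_of_control}")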
Procedia PDF Downloads 264
1036 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms
Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang
Abstract:
Bioassay is the measurement of the potency of a chemical substance by its effect on a living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases. Bioassay predictions are calculated accordingly to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is categorized into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy vs. precision. The third step is normalization, where data are normalized between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness with various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
Keywords: bioassay, machine learning, preprocessing, virtual screen
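A minimal sketch of the four preprocessing steps on a synthetic dataset: instance selection, discretization, normalization to [0, 1], and feature selection. The descriptor columns, bin counts, and number of selected features are illustrative assumptions.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                  # synthetic chemical descriptors
y = rng.integers(0, 2, size=1000)                # active / inactive bioassay outcome

# 1) instance selection: training / validation / testing split
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# 2) discretization (accuracy vs. precision trade-off controlled by n_bins)
disc = KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile").fit(X_train)
# 3) normalization of the data between 0 and 1
scaler = MinMaxScaler().fit(disc.transform(X_train))
# 4) feature selection of the most informative descriptors
selector = SelectKBest(f_classif, k=8).fit(scaler.transform(disc.transform(X_train)), y_train)

X_train_prep = selector.transform(scaler.transform(disc.transform(X_train)))
print(X_train_prep.shape)                        # (600, 8)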
Procedia PDF Downloads 276
1035 Side Effects of COVID-19 Vaccine Investigated by Radiology
Authors: Mahdi Farajzadeh Ajirlou
Abstract:
Reports of serious adverse effects have raised concerns about the safety of individuals who have received COVID-19 vaccines. Numerous lines of evidence indicate that infection with COVID-19 causes neurological dysfunction in a significant proportion of affected patients, where these side effects appear severely during the disease; still less is known about the potential long-term consequences for the brain, where the loss of olfaction is a neurological sign and a simple indication of COVID-19. Since the publication of effective clinical trial results for the mRNA coronavirus disease 2019 (COVID-19) vaccines and their injection into volunteers in 2020, numerous reports have emerged about cardiovascular complications following mRNA vaccination. Vaccination-associated adenopathy is a constant imaging finding after the administration of COVID-19 vaccines that can lead to a diagnostic problem in patients with proven or suspected cancer, in whom it may be indistinguishable from malignant nodal involvement. In spite of all the benefits and efficacy of the coronavirus disease 2019 (COVID-19) vaccines mentioned in recent clinical trials, some other post-vaccination side effects, such as lymphadenopathy (LAP), were observed. Also, numerous factors, including economic conditions, have played a critical part in increasing the number of people with COVID-19 infection, and also many more side effects, in a given country. During the coronavirus pandemic, Iran has been experiencing extreme sanctions, which have faced the nation with a severe economic crisis. Additionally, during the COVID-19 pandemic there was growing concern about the overuse of imaging exams, especially in the pediatric population, which highlights the issues pointed out by this review.
Keywords: radiology, vaccines, COVID-19, side effect
Procedia PDF Downloads 64
1034 Digital Platform of Crops for Smart Agriculture
Authors: Pascal François Faye, Baye Mor Sall, Bineta Dembele, Jeanne Ana Awa Faye
Abstract:
In agriculture, estimating crop yields is key to improving productivity and decision-making processes such as financial market forecasting and addressing food security issues. The main objective of this paper is to provide tools to predict and improve the accuracy of crop yield forecasts using machine learning (ML) algorithms such as CART, KNN, and SVM. We developed a mobile app and a web app that use these algorithms for practical use by farmers. The tests show that our system (collection and deployment architecture, web application and mobile application) is operational and validates empirical knowledge on agro-climatic parameters, in addition to providing proactive decision-making support. On the experimental agricultural data, the performance of the ML algorithms is compared using cross-validation in order to identify the most effective ones for the agricultural data. The proposed applications demonstrate that the proposed approach is effective in predicting crop yields and provides timely and accurate responses to farmers for decision support.
Keywords: prediction, machine learning, artificial intelligence, digital agriculture
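A minimal sketch of comparing CART, KNN, and SVM yield predictors with cross-validation, as described above; the synthetic features stand in for agro-climatic parameters and are not the project's data.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))        # e.g., rainfall, temperature, humidity, soil, sowing date
y = X @ np.array([2.0, 1.5, -0.5, 0.8, 0.3]) + rng.normal(scale=0.5, size=300)

models = {"CART": DecisionTreeRegressor(max_depth=5),
          "KNN": KNeighborsRegressor(n_neighbors=7),
          "SVM": SVR(kernel="rbf", C=10.0)}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name}: RMSE = {-scores.mean():.3f} +/- {scores.std():.3f}")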
Procedia PDF Downloads 80
1033 Technology Maps in Energy Applications Based on Patent Trends: A Case Study
Authors: Juan David Sepulveda
Abstract:
This article reflects the current stage of progress in the project "Determining technological trends in energy generation". At first, the project was oriented towards finding those trends by employing tools that the scientometrics community has proved and accepted as effective for getting reliable results. Because a documented methodological guide for this purpose could not be found, the decision was made to reorient the scope and aim of the project, changing the degree of interest in pursuing the objectives. Therefore, it was decided to propose and implement a novel guide built from the elements and techniques found in the available literature. This article begins by explaining the elements and considerations taken into account when implementing and applying this methodology, and the tools that led to the implementation of a software application for patent review. Univariate analysis helped recognize the technological leaders in the field of energy and paved the way for a multivariate analysis of this sample, which allowed for a graphical description of the techniques of mature technologies, as well as the detection of emerging technologies. The article ends with a validation of the methodology as applied to the case of fuel cells.
Keywords: energy, technology mapping, patents, univariate analysis
Procedia PDF Downloads 476
1032 Comparative Evaluation of EBT3 Film Dosimetry Using Flatbed Scanner, Densitometer and Spectrophotometer Methods and Its Applications in Radiotherapy
Authors: K. Khaerunnisa, D. Ryangga, S. A. Pawiro
Abstract:
Over the past few decades, film dosimetry has become a tool used in various radiotherapy modalities, either for clinical quality assurance (QA) or for dose verification. The response of the film to irradiation is usually expressed as optical density (OD) or net optical density (netOD). Since the film's response to radiation is not linear, the use of film as a dosimeter must go through a calibration process. This study aimed to compare the calibration curves obtained with various measurement methods and densitometers: a flatbed scanner, a point densitometer, and a spectrophotometer. For every response function, a radiochromic film calibration curve is generated for each method, and accuracy, precision, and sensitivity analyses are performed. netOD is obtained by measuring the change in the optical density (OD) of the film before and after irradiation: when using a film scanner, ImageJ is used to extract the pixel value of the film on the red channel of the three channels (RGB); when using a point densitometer, the change in OD before and after irradiation is calculated; and when using a spectrophotometer, the change in absorbance before and after irradiation is calculated. The results showed that the three calibration methods gave readings with a netOD precision below 3% at an uncertainty value of 1σ (one sigma). While the sensitivity of all three methods follows the same trend in responding to film readings against radiation, the magnitude of the sensitivity differs. The accuracy of the three methods is below 3% for doses above 100 cGy and 200 cGy, but for doses below 100 cGy it was found to be above 3% when using the point densitometer and the spectrophotometer. When all three methods are used for clinical implementation, the results of the study show accuracy and precision below 2% for the scanner and the spectrophotometer and above 3% for the point densitometer.
Keywords: calibration methods, film dosimetry EBT3, flatbed scanner, densitometer, spectrophotometer
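For reference, the net optical density is commonly computed from the transmitted signal of the same film region before and after irradiation (red-channel pixel value on the scanner, transmission on the densitometer, intensity on the spectrophotometer); a typical form, which may differ in detail from the exact protocol of this study, is

\[ \mathrm{netOD} = OD_{\text{after}} - OD_{\text{before}} = \log_{10}\!\left(\frac{PV_{\text{before}}}{PV_{\text{after}}}\right), \]

where PV denotes the background-corrected signal.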
Procedia PDF Downloads 135
1031 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management
Authors: Kenneth Harper
Abstract:
The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks using the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
Keywords: blockchain, Cosmos SDK, decentralized data platform, IPFS, ZK-Rollups
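A minimal sketch of cryptographic hashing for data lineage in such a platform: the SHA-256 digest of a record serves as its content identifier and is appended to a ledger. The record fields and the in-memory list standing in for the distributed ledger are illustrative assumptions.

import hashlib, json

ledger = []                                   # stand-in for a blockchain ledger

def ingest(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()
    content_id = hashlib.sha256(payload).hexdigest()
    ledger.append({"cid": content_id, "prev": ledger[-1]["cid"] if ledger else None})
    return content_id

def verify(record: dict, content_id: str) -> bool:
    payload = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest() == content_id

cid = ingest({"patient": "anon-001", "hb": 13.2})
print(cid, verify({"patient": "anon-001", "hb": 13.2}, cid))   # True: record unchanged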
Procedia PDF Downloads 29
1030 Experimental Study of the Behavior of Elongated Non-spherical Particles in Wall-Bounded Turbulent Flows
Authors: Manuel Alejandro Taborda Ceballos, Martin Sommerfeld
Abstract:
Transport phenomena and the dispersion of non-spherical particles in turbulent flows are found everywhere in industrial applications and processes. Powder handling, pollution control, pneumatic transport, and particle separation are just some examples where the particles encountered are not only spherical. These types of multiphase flows are wall-bounded and mostly highly turbulent. The particles found in these processes are rarely spherical but may have various shapes (e.g., fibers and rods). Although research related to the behavior of regular non-spherical particles in turbulent flows has been carried out for many years, it is still necessary to refine models, especially near walls, where the fiber-wall interaction completely changes the particle behavior. Imaging-based experimental studies on dispersed particle-laden flows have been applied for many decades for detailed experimental analysis. These techniques have the advantage that they provide field information in two or three dimensions, but they have a lower temporal resolution compared to point-wise techniques such as PDA (phase-Doppler anemometry) and derivations therefrom. The imaging techniques applied in dispersed two-phase flows are extensions of classical PIV (particle image velocimetry) and PTV (particle tracking velocimetry), and the main emphasis has been the simultaneous measurement of the velocity fields of both phases. In a similar way, such data should also provide adequate information for validating the proposed models. Available experimental studies on the behavior of non-spherical particles are uncommon and mostly based on planar light-sheet measurements. Especially for elongated non-spherical particles, however, three-dimensional measurements are needed to fully describe their motion and to provide sufficient information for the validation of numerical computations. To provide further detailed experimental results allowing a validation of numerical calculations of non-spherical particle dispersion in turbulent flows, a water channel test facility was built around a horizontal closed water channel. Into this horizontal main flow, a small cross-jet laden with fiber-like particles was injected, which was solely driven by gravity. The dispersion of the fibers was measured by applying imaging techniques based on an LED array for backlighting and high-speed cameras. For obtaining the fluid velocity fields, almost neutrally buoyant tracer particles were used. The discrimination between tracer and fibers was done based on image size, which was also the basis for determining the fiber orientation with respect to the inertial coordinate system. The synchronous measurement of fluid velocity and fiber properties also allows the collection of statistics of fiber orientation, velocity fields of tracer and fibers, the angular velocity of the fibers, and the orientation between the fiber and the instantaneous relative velocity. Consequently, an experimental study of the behavior of elongated non-spherical particles in wall-bounded turbulent flows was achieved. A comprehensive analysis was accomplished, especially in the near-wall region, where hydrodynamic wall interaction effects (e.g., collision or lubrication) and abrupt changes of particle rotational velocity exist. This allows us to predict numerically, afterwards, the behavior of non-spherical particles within the frame of the Euler/Lagrange approach, where the particles are treated as "point-particles".
Keywords: crossflow, non-spherical particles, particle tracking velocimetry, PIV
Procedia PDF Downloads 87
1029 Mitigating Denial of Service Attacks in Information Centric Networking
Authors: Bander Alzahrani
Abstract:
Information-centric networking (ICN) using architectures such as the Publish-Subscribe Internet Routing Paradigm (PSIRP) is one of the promising candidates for a future Internet and has recently been under the spotlight of the research community, which is investigating the possibility of redesigning the current Internet architecture to solve many issues such as routing scalability, security, and quality of service. Bloom filter-based forwarding is a source-routing approach that is used in the PSIRP architecture. This mechanism is vulnerable to brute-force attacks, which may lead to denial-of-service (DoS) attacks. In this work, we present a new forwarding approach that keeps the advantages of Bloom filter-based forwarding while mitigating attacks on the forwarding mechanism. In practice, we introduce a special type of forwarding node, called Edge-FW, to be placed at the edge of the network. The role of these nodes is to add an extra security layer by validating and inspecting packets at the edge of the network against brute-force attacks and checking whether the packet contains a legitimate forwarding identifier (FId) or not. We leverage a Certificateless Aggregate Signature (CLAS) scheme with a small size of 64 bits, which is used to sign the FId. Hence, this signature becomes bound to a specific FId. Therefore, malicious nodes that inject packets with random FIds will be easily detected and dropped at the Edge-FW node when the signature verification fails. Our preliminary security analysis suggests that, with the proposed approach, the forwarding plane is able to resist attacks such as DoS with very high probability.
Keywords: bloom filter, certificateless aggregate signature, denial-of-service, information centric network
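A minimal sketch of the underlying Bloom filter forwarding decision (the CLAS signature check performed at the Edge-FW node is omitted): each link is a sparse bit mask, the in-packet FId is the bitwise OR of the path's link IDs, and a node forwards on a link only when all of that link's bits are set in the FId. The filter length and link IDs below are illustrative; real filters also admit a small false-positive rate.

import random

M = 256                                            # Bloom filter length in bits
random.seed(7)

def random_link_id(bits_set: int = 5) -> int:
    lid = 0
    for pos in random.sample(range(M), bits_set):
        lid |= 1 << pos
    return lid

link_a, link_b, link_c = (random_link_id() for _ in range(3))
fid = link_a | link_b                              # FId encodes the path over links a and b

def should_forward(fid: int, link_id: int) -> bool:
    return fid & link_id == link_id

print(should_forward(fid, link_a), should_forward(fid, link_c))   # True False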
Procedia PDF Downloads 198
1028 Enhancing Organizational Performance through Adaptive Learning: A Case Study of ASML
Authors: Ramin Shadani
Abstract:
This study introduces adaptive performance as a key organizational performance dimension and explores the relationship between the dimensions of a learning organization and adaptive performance. A survey was therefore conducted using the Dimensions of the Learning Organization Questionnaire (DLOQ), followed by factor analysis and structural equation modeling, in order to investigate the dynamics between learning organization practices and adaptive performance. The results confirm that adaptive performance is indeed one important dimension of organizational performance. The study also shows that perceived knowledge and adaptive performance mediate the positive relationship between the practices of a learning organization and perceived financial performance. We extend existing DLOQ research by demonstrating that adaptive performance, as a non-financial organizational learning outcome, has a significant impact on financial performance. Our study also provides additional validation of the DLOQ's performance measures. Indeed, organizations need to look at how the activities of learning and development can provide better overall improvement in performance, especially in enhancing adaptive capability. The study has provided the requisite empirical support that learning and development activities within organizations allow much-improved intangible performance outcomes, especially through adaptive performance.
Keywords: adaptive performance, continuous learning, financial performance, leadership style, organizational learning, organizational performance
Procedia PDF Downloads 34
1027 Performance Analysis on the Smoke Management System of the Weiwuying Center for the Arts Using Hot Smoke Tests
Authors: K. H. Yang, T. C. Yeh, P. S. Lu, F. C. Yang, T. Y. Wu, W. J. Sung
Abstract:
In this study, a series of full-scale hot smoke tests has been conducted to validate the performance of the smoke management system in the WWY Center for the Arts before its grand opening. A total of 19 scenarios were established and tested with fire sizes ranging from 2 MW to 10 MW. The measured ASET data provided by the smoke management system experiments were compared with the computer-simulated RSET values for egress obtained during the design phase. The experimental results indicated that this system could successfully provide a safety margin of 200% and ensure a safe evacuation in case of fire in the WWY project, including worst-case and fail-safe scenarios. The methodology developed and the results obtained in this project can provide a useful reference for future applications, such as large-scale indoor sports domes and arenas, stadiums, shopping malls, airport terminals, and stations or tunnels for railway and subway systems.
Keywords: building hot smoke tests, performance-based smoke management system designs, full-scale experimental validation, tenable condition criteria
Procedia PDF Downloads 446
1026 A Statistical Approach to Rationalise the Number of Working Load Test for Quality Control of Pile Installation in Singapore Jurong Formation
Authors: Nuo Xu, Kok Hun Goh, Jeyatharan Kumarasamy
Abstract:
Pile load testing is significant during foundation construction due to its traditional role of design validation and routine quality control of the piling works. In order to verify whether piles can take loadings at specified settlements, piles have to undergo a working load test, where the test load should normally go up to 150% of the working load of a pile. Selection or sampling of piles for the working load test is done subject to the number specified in the Singapore National Annex to Eurocode 7, SS EN 1997-1:2010. This paper presents an innovative way to rationalize the number of pile load tests by adopting a statistical analysis approach and looking at the coefficient of variation of the pile elastic modulus, using a case study at the Singapore Tuas depot. The results are very promising and have shown that it is possible to reduce the number of working load tests without influencing the reliability of, and confidence in, the pile quality. Moving forward, it is suggested that more load test data from other geological formations be examined and compared with the findings from this paper.
Keywords: elastic modulus of pile under soil interaction, Jurong Formation, kentledge test, pile load test
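The coefficient of variation referred to above is the ratio of the standard deviation to the mean of the back-calculated pile elastic modulus,

\[ \mathrm{CoV}(E) = \frac{\sigma_E}{\mu_E}, \]

so that a low CoV across the tested piles supports accepting fewer working load tests at the same level of confidence.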
Procedia PDF Downloads 386
1025 Performance Investigation of Unmanned Aerial Vehicles Attitude Control Based on Modified PI-D and Nonlinear Dynamic Inversion
Authors: Ebrahim H. Kapeel, Ahmed M. Kamel, Hossam Hendy, Yehia Z. Elhalwagy
Abstract:
Interest in autopilot design has increased intensely as a result of recent advancements in Unmanned Aerial Vehicles (UAVs). Due to the enormous number of applications that UAVs can achieve, the number of control theories applied to them has increased in recent years. Small fixed-wing UAVs suffer from high non-linearity, sensitivity to disturbances, and coupling effects between their channels. In this work, a nonlinear dynamic inversion (NDI) control law is designed for a nonlinear small fixed-wing UAV model. NDI is preferable for varied operating conditions, as there is no need for a scheduling controller; moreover, it is applicable at high angles of attack. For validation of the designed flight controller, a nonlinear modified PI-D controller is also implemented with our model. A comparative study between both controllers is carried out to evaluate the NDI performance. Simulation results and analysis are presented to illustrate the effectiveness of the designed controller based on NDI.
Keywords: attitude control, nonlinear PID, dynamic inversion
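For reference, the generic form of the NDI law (the exact formulation used for the attitude channels in the paper may differ): for control-affine dynamics, the input cancels the modeled nonlinearities so that the closed loop follows a commanded virtual input,

\[ \dot{x} = f(x) + g(x)\,u, \qquad u = g(x)^{-1}\bigl(\nu - f(x)\bigr), \qquad \nu = K\,(x_{\mathrm{cmd}} - x), \]

which, with an exact model, reduces the closed-loop dynamics to \dot{x} = \nu.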
Procedia PDF Downloads 112
1024 Optimal Scheduling of Trains in Complex National Scale Railway Networks
Authors: Sanat Ramesh, Tarun Dutt, Abhilasha Aswal, Anushka Chandrababu, G. N. Srinivasa Prasanna
Abstract:
Optimal schedule generation for a large national railway network operating thousands of passenger trains over tens of thousands of kilometers of track is a grand computational challenge in itself. We present heuristics based on a Mixed Integer Program (MIP) formulation for local optimization. These methods provide flexibility in scheduling new trains with varying speeds and delays and improve the utilization of infrastructure. We propose methods that provide a robust solution, with hundreds of trains being scheduled over a portion of the railway network without significant increases in delay. We also provide techniques to validate the nominal schedules thus generated against globally correlated variations in travel times, thereby enabling us to detect conflicts arising due to delays. Our validation, which assumes only the support of the arrival and departure time distributions, takes on the order of a few minutes for a portion of the network and is computationally efficient enough to handle the entire network.
Keywords: mixed integer programming, optimization, railway network, train scheduling
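A minimal sketch of the kind of MIP used for local scheduling: two trains share a block, a binary variable decides their order, and disjunctive big-M constraints enforce a minimum headway while total delay is minimized. The train names, requested times, and headway are illustrative assumptions, and PuLP is used only for the illustration.

import pulp

requested = {"T1": 0, "T2": 4}               # requested departure minutes (hypothetical)
headway, bigM = 5, 1000

prob = pulp.LpProblem("local_schedule", pulp.LpMinimize)
dep = {t: pulp.LpVariable(f"dep_{t}", lowBound=requested[t]) for t in requested}
before = pulp.LpVariable("T1_before_T2", cat="Binary")        # ordering decision

prob += pulp.lpSum(dep[t] - requested[t] for t in requested)  # total delay
prob += dep["T2"] - dep["T1"] >= headway - bigM * (1 - before)
prob += dep["T1"] - dep["T2"] >= headway - bigM * before

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(before), {t: dep[t].value() for t in dep})   # e.g. 1.0 {'T1': 0.0, 'T2': 5.0}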
Procedia PDF Downloads 159
1023 Software Engineering Inspired Cost Estimation for Process Modelling
Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller
Abstract:
Up to this point, business process management projects in general, and business process modelling projects in particular, could not rely on a practical and scientifically validated method to estimate cost and effort. The model development phase, especially, is not covered by any cost estimation method or model. Later phases of business process modelling, starting with implementation, are covered by initial solutions which are discussed in the literature. This article proposes a method of filling this gap by deriving a cost estimation method from available methods in similar domains, namely software development and software engineering. As we show, software development is closely similar to process modelling. After the proposition of this method, different ideas for further analysis and validation of the method are proposed. We derive the method from COCOMO II and Function Points, which are established methods of effort estimation in the domain of software development. For this, we lay out similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
Keywords: COCOMO II, business process modelling, cost estimation method, BPM COCOMO
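For reference, the COCOMO II post-architecture effort equation from which such a method can be derived has the form below; the constants shown are the commonly cited COCOMO II.2000 calibration values and are stated here as an assumption, not as figures from the article:

\[ \mathrm{PM} = A \cdot \mathrm{Size}^{E} \cdot \prod_{i} EM_i, \qquad E = B + 0.01 \sum_{j=1}^{5} SF_j, \qquad A \approx 2.94, \; B \approx 0.91, \]

where Size is measured in KSLOC (possibly converted from Function Points), the EM_i are effort multipliers, and the SF_j are scale factors.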
Procedia PDF Downloads 441
1022 Development and Metrological Validation of a Control Strategy in Embedded Island Grids Using Battery-Hybrid-Systems
Authors: L. Wilkening, G. Ackermann, T. T. Do
Abstract:
This article presents an approach for the stand-alone and grid-connected operation of a German low-voltage grid with a high share of photovoltaics. For this purpose, suitable dynamic system models have been developed. These allow the simulation of dynamic events on very small time scales as well as operation management over longer periods of time. Using these simulations, suitable control parameters could be identified and their effects on the grid analyzed. In order to validate the simulation results, an LV-grid test bench has been implemented at the Hamburg University of Technology. The developed control strategies are to be validated using real inverters, generators, and different realistic loads. It is shown that a battery hybrid system installed next to a voltage transformer makes it possible to operate the LV-grid in stand-alone mode without using additional information and communication technology and without intervention in the existing grid units. By simulating critical days of the year, suitable control parameters for stable stand-alone operation are determined, and set-point specifications for different control strategies are defined.
Keywords: battery, e-mobility, photovoltaic, smart grid
Procedia PDF Downloads 143
1021 Right Solution of Geodesic Equation in Schwarzschild Metric and Overall Examination of Physical Laws
Authors: Kwan U. Kim, Jin Sim, Ryong Jin Jang, Sung Duk Kim
Abstract:
108 years have passed since a great number of physicists explained astronomical and physical phenomena by solving geodesic equations in the Schwarzschild metric. However, when solving the geodesic equations in the Schwarzschild metric, they did not correctly solve one branch of the spatial components, among the spatial and temporal components of the four-dimensional force, and did not correctly derive physical laws, by means of physical analysis, from the results obtained by solving the geodesic equations. In addition, they did not treat the astronomical and physical phenomena in a physical way based on the correct physical laws obtained from the solution of the geodesic equations in the Schwarzschild metric. Therefore, some former scholars mentioned that Einstein's theoretical basis for the general theory of relativity was obscure and incorrect, but they did not give a correct physical solution to the problems. Furthermore, since the general theory of relativity has not given a quantitative solution to these obscure and incorrect problems, the generalization of gravitational theory has not yet been successfully completed, although former scholars have thought of it and tried to achieve it. In order to solve the problems, it is necessary to explore the obscure and incorrect problems in the general theory of relativity on the basis of the physical laws and to find a methodology for solving the problems. Therefore, as the first step toward achieving this purpose, the right solution of the geodesic equation in the Schwarzschild metric is presented. Next, the correct physical laws found by making a physical analysis of the results are presented, the obscure and incorrect problems are shown, and an analysis of them is made based on the physical laws. In addition, the experimental verification of the physical laws found by us is presented.
Keywords: equivalence principle, general relativity, geometrodynamics, Schwarzschild, Poincaré
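For reference, the standard Schwarzschild line element and the geodesic equation that the abstract refers to are

\[ ds^{2} = -\left(1 - \frac{r_s}{r}\right) c^{2}\,dt^{2} + \left(1 - \frac{r_s}{r}\right)^{-1} dr^{2} + r^{2}\left(d\theta^{2} + \sin^{2}\theta\, d\varphi^{2}\right), \qquad r_s = \frac{2GM}{c^{2}}, \]

\[ \frac{d^{2}x^{\mu}}{d\tau^{2}} + \Gamma^{\mu}_{\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\frac{dx^{\beta}}{d\tau} = 0. \]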
Procedia PDF Downloads 16
1020 Analytical Modeling of Drain Current for DNA Biomolecule Detection in Double-Gate Tunnel Field-Effect Transistor Biosensor
Authors: Ashwani Kumar
Abstract:
This study presents an analytical modeling approach for analyzing the drain current behavior in Tunnel Field-Effect Transistor (TFET) biosensors used for the detection of DNA biomolecules. The proposed model focuses on elucidating the relationship between the drain current and the presence of DNA biomolecules, taking into account the impact of various device parameters and biomolecule characteristics. Through comprehensive analysis, the model offers insights into the underlying mechanisms governing the sensing performance of TFET biosensors, aiding in the optimization of device design and operation. A non-local tunneling model is incorporated with other essential models to accurately match the simulated and modeled data. An experimental validation of the model is provided, demonstrating its efficacy in accurately predicting the drain current response to DNA biomolecule detection. The sensitivity attained from the analytical model is compared and contrasted with ongoing research work in this area.
Keywords: biosensor, double-gate TFET, DNA detection, drain current modeling, sensitivity
Procedia PDF Downloads 58
1019 HEXAFLY-INT Project: Design of a High Speed Flight Experiment
Authors: S. Di Benedetto, M. P. Di Donato, A. Rispoli, S. Cardone, J. Riehmer, J. Steelant, L. Vecchione
Abstract:
Thanks to coordinated funding by the European Space Agency (ESA) and the European Commission (EC) within the 7th Framework Programme, the High-Speed Experimental Fly Vehicles – International (HEXAFLY-INT) project is aimed at the flight validation of hypersonic technologies enabling future trans-atmospheric flights. The project, which currently involves partners from Europe, the Russian Federation, and Australia operating under ESA/ESTEC coordination, will achieve the goal of designing, manufacturing, assembling, and flight testing an unpowered high-speed vehicle in a glider configuration by 2018. The main technical challenges of the project are specifically related to the design of the vehicle's gliding configuration and to the complexity of integrating breakthrough technologies with standard aeronautical technologies, e.g., the high-temperature protection system and the airframe cold structures. Also, the sonic boom impact, which is one of the environmental challenges of high-speed flight, will be assessed. This paper provides a comprehensive and detailed update on all the project activities carried out to date on both the vehicle and the mission design.
Keywords: design, flight testing, HEXAFLY-INT, hypersonics
Procedia PDF Downloads 469
1018 Estimation of Solar Radiation Power Using Reference Evaluation of Solar Transmittance, 2 Bands Model: Case Study of Semarang, Central Java, Indonesia
Authors: Benedictus Asriparusa
Abstract:
Solar radiation is a green, renewable energy source which has the potential to address the energy needs of the period. Knowing how to estimate the strength of solar radiation may be one element of integrated, sustainable energy development. Unfortunately, for a fairly extensive area of Indonesia the availability of solar radiation data is still very low. Therefore, we need a method to estimate the strength of solar radiation accurately. In this study, the author used the Reference Evaluation of Solar Transmittance, 2 Bands (REST2) model. Validation of the REST2 model has been performed in Spain, India, Colorado, Saudi Arabia, and several other areas, but it is not widely used in Indonesia. The Indonesian study area is represented by the region of Semarang, Central Java. Solar radiation values estimated using the REST2 model were then verified against field data and give an average RMSE value of 6.53%. Based on this value, it can be concluded that the REST2 model can be used to estimate the value of solar radiation under clear-sky conditions in parts of Indonesia.
Keywords: estimation, solar radiation power, REST 2, solar transmittance
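The quoted percentage is typically a relative root-mean-square error of the estimated against the measured irradiance; the usual definition, which may differ in detail from the one used in the study, is

\[ \mathrm{RMSE}\,(\%) = \frac{100}{\bar{G}_{\mathrm{meas}}} \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(G_{\mathrm{est},i} - G_{\mathrm{meas},i}\bigr)^{2}}, \]

where G denotes solar radiation and N the number of validation samples.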
Procedia PDF Downloads 428
1017 Self-denigration in Doctoral Defense Sessions: Scale Development and Validation
Authors: Alireza Jalilifar, Nadia Mayahi
Abstract:
The dissertation defense, as a complicated, conflict-prone context, entails the adoption of elegant interactional strategies, one of which is self-denigration. This study aimed to develop and validate a self-denigration model that fits the context of doctoral defense sessions in applied linguistics. Two focus group discussions provided the basis for developing this conceptual model, which assumed 10 functions for self-denigration, namely good manners, modesty, affability, altruism, assertiveness, diffidence, coercive self-deprecation, evasion, diplomacy, and flamboyance. These functions were used to design a 40-item questionnaire on the attitudes of applied linguists concerning self-denigration in defense sessions. The confirmatory factor analysis of the questionnaire indicated the predictive ability of the measurement model. The findings of this study suggest that self-denigration in doctoral defense sessions is the social representation of the participants' values, ideas, and practices, adopted as a negotiation strategy and a conflict management policy for the purpose of establishing harmony and maintaining resilience. This study has implications for doctoral students and academics and illuminates further research on self-denigration in other contexts.
Keywords: academic discourse, politeness, self-denigration, grounded theory, dissertation defense
Procedia PDF Downloads 139