Search results for: software component and interfaces
7073 Application of FT-NIR Spectroscopy and Electronic Nose in On-line Monitoring of Dough Proofing
Authors: Madhuresh Dwivedi, Navneet Singh Deora, Aastha Deswal, H. N. Mishra
Abstract:
FT-NIR spectroscopy and an electronic nose were used to study the kinetics of dough proofing. Spectroscopy was conducted with an optic probe in the diffuse reflectance mode. The dough leavening was carried out at different temperatures (25 and 35°C) and constant RH (80%). Spectra were collected in the range of wavenumbers from 12,000 to 4,000 cm-1 directly on the samples, every 5 min during proofing, up to 2 hours. The NIR spectra were corrected for scatter effects and transformed by second-order derivatization. Principal component analysis (PCA) was applied to the leavening process and the process kinetics were calculated. PCA was performed on the data set and loadings were calculated. For leavening, four absorption zones (8,950-8,850, 7,200-6,800, 5,250-5,150 and 4,700-4,250 cm-1) were involved in describing the process. Simultaneously, the electronic nose was used to follow the development of odour compounds during fermentation. The electronic nose was able to differentiate the samples on the basis of the aroma generated at different times during fermentation. To rapidly differentiate samples based on odour, principal component analysis was performed and successfully demonstrated in this study. The results suggest that the electronic nose and FT-NIR spectroscopy can be utilized for on-line quality control of the fermentation process during the leavening of bread dough.
Keywords: FT-NIR, dough, e-nose, proofing, principal component analysis
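As an editorial illustration of the scatter-correction-plus-PCA pipeline this abstract describes, here is a minimal numpy-only sketch; the SNV scatter correction and SVD-based PCA are assumptions, since the abstract does not name its exact algorithms:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate scatter correction: center and scale each spectrum."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

def pca_scores(spectra, n_components=2):
    """Project spectra onto their first principal components via SVD."""
    X = spectra - spectra.mean(axis=0)               # mean-center across samples
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]  # one score vector per sample
    loadings = Vt[:n_components]                     # one loading vector per component
    return scores, loadings
```

The loading vectors would then be inspected to find which wavenumber zones drive the leavening trend, as the abstract reports for its four absorption zones.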
Procedia PDF Downloads 392
7072 Effect of Fiddler Crab Burrows on Bacterial Communities of Mangrove Sediments
Authors: Mohammad Mokhtari, Gires Usup, Zaidi Che Cob
Abstract:
Bacterial communities, as mediators of biogeochemical processes, are a main component of mangrove ecosystems. Crab burrows, by increasing oxic-anoxic interfaces and facilitating flux between sediment and tidal water, affect the biogeochemical properties of sediments. The effect of fiddler crab burrows on the density and diversity of bacteria was investigated to elucidate the effect of burrows on bacterial distribution. Samples were collected from the burrow walls of three species of fiddler crabs: Uca paradussumieri, Uca rosea, and Uca forcipata. Sediment properties, including grain size, temperature, redox potential, pH, chlorophyll, water content, and organic content, were measured from the burrow walls to assess the correlation between environmental variables and bacterial communities. Bacteria were enumerated with epifluorescence microscopy after staining with SYBR Green. Bacterial DNA was extracted from the sediment samples, and the community profiles were determined with Terminal Restriction Fragment Length Polymorphism (T-RFLP). High endemism was observed among the bacterial communities: of the 152 observed OTUs, 22 were found only in crab burrows. The highest bacterial density and diversity were recorded in burrow walls. The results of ANOSIM indicated a significant difference between the bacterial communities from the three species of fiddler crab burrows. Only 3% of the explained bacterial variability in the constrained ordination model of CCA was attributed to depth, while much of the variability was attributed to coarse sand, pH, and chlorophyll content. Our findings suggest that crab burrows, by affecting sediment properties such as redox potential, pH, water, and chlorophyll content, induce significant effects on the bacterial communities.
Keywords: bioturbation, canonical correspondence analysis, fiddler crab, microbial ecology
Procedia PDF Downloads 157
7071 The Effect of Voice Recognition Dictation Software on Writing Quality in Third Grade Students: An Action Research Study
Authors: Timothy J. Grebec
Abstract:
This study investigated whether using voice dictation software (i.e., Google Voice Typing) has an impact on student writing quality. The research took place in a third-grade general education classroom in a suburban school setting. Because the study involved minors, all data was encrypted and de-identified before analysis. The students completed a series of writings prior to the beginning of the intervention to determine their attitudes toward and skill level in writing. During the intervention phase, the students were introduced to the voice dictation software, given an opportunity to practice using it, and then assigned writing prompts to be completed using the software. The prompts written by nineteen student participants and surveys of student opinions on writing established a baseline for the study. The data showed that using the dictation software resulted in a 34% increase in response quality (measured against the Pennsylvania System of School Assessment [PSSA] writing guidelines). Of particular interest was the increase in students' proficiency in demonstrating mastery of English language conventions and elaborating on the content. Although this type of research is relatively new, it has the potential to reshape the strategies educators have at their disposal when instructing students in written language.
Keywords: educational technology, accommodations, students with disabilities, writing instruction, 21st century education
Procedia PDF Downloads 75
7070 A Survey on Genetic Algorithm for Intrusion Detection System
Authors: Prikhil Agrawal, N. Priyanka
Abstract:
With millions of users added to the Internet day by day, it is essential to maintain highly reliable and secure data communication between corporations. Although there are various traditional security techniques such as antivirus software, password protection, data encryption, biometrics, and firewalls, network security remains a main issue in leading companies. Intrusion detection systems (IDSs) have therefore become an essential security component, as they can detect various network attacks and respond quickly to such occurrences. IDSs are used to detect unauthorized access to a computer system. This paper surveys intrusion detection techniques based on the genetic algorithm (GA) approach. The intrusion detection problem has become a challenging task due to the variety of computer networks and their vulnerabilities. The damage caused to organizations by malicious intrusions can thus be mitigated, and even deterred, by using this powerful tool.
Keywords: genetic algorithm (GA), intrusion detection system (IDS), dataset, network security
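A hedged sketch of how a GA can evolve detection rules of the kind this survey covers; the toy feature encoding, the fitness weights, and the evolutionary parameters are all illustrative assumptions, not taken from any surveyed system:

```python
import random

# Toy connection records: (duration_bucket, protocol, flag) -> label (1 = attack)
RECORDS = [
    ((0, 1, 1), 1), ((0, 1, 0), 0), ((1, 0, 1), 1),
    ((1, 1, 1), 1), ((0, 0, 0), 0), ((1, 0, 0), 0),
]

def matches(rule, features):
    """A rule is a tuple with one entry per feature; None acts as a wildcard."""
    return all(r is None or r == f for r, f in zip(rule, features))

def fitness(rule):
    """Reward detected attacks, penalise false alarms twice as heavily."""
    hits = sum(1 for f, y in RECORDS if matches(rule, f) and y == 1)
    false_alarms = sum(1 for f, y in RECORDS if matches(rule, f) and y == 0)
    return hits - 2 * false_alarms

def mutate(rule):
    """Flip one gene to a random value (or a wildcard)."""
    i = random.randrange(len(rule))
    r = list(rule)
    r[i] = random.choice([None, 0, 1])
    return tuple(r)

def evolve(generations=200, pop_size=20, seed=1):
    """Elitist GA: keep the fitter half, refill with mutated copies."""
    random.seed(seed)
    pop = [tuple(random.choice([None, 0, 1]) for _ in range(3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(random.choice(parents)) for _ in parents]
        pop = parents + children
    return max(pop, key=fitness)
```

A real GA-based IDS would replace the toy records with connection features from a dataset such as KDD Cup 99 and would typically add crossover alongside mutation.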
Procedia PDF Downloads 299
7069 The Effect of Adhesion on the Frictional Hysteresis Loops at a Rough Interface
Authors: M. Bazrafshan, M. B. de Rooij, D. J. Schipper
Abstract:
Frictional hysteresis is the phenomenon in which mechanical contacts are subject to small (compared to the contact area) oscillating tangential displacements. In the presence of adhesion at the interface, the contact repulsive force increases, leading to a higher static friction force and pre-sliding displacement. This paper proposes a boundary element model (BEM) for the adhesive frictional hysteresis contact at the interface of two contacting bodies of arbitrary geometries. In this model, adhesion is represented by means of a Dugdale approximation of the total work of adhesion at local areas with a very small gap between the two bodies. The frictional contact is divided into sticking and slipping regions in order to take into account the transition from stick to slip (the pre-sliding regime). In the pre-sliding regime, the stick and slip regions are defined based on the local values of shear stress and normal pressure. In the studied cases, a fixed normal force is applied to the interface and the friction force is varied so as to initiate gross sliding alternately in each direction. In the first case, the problem is solved at the smooth interface between a ball and a flat for different values of the work of adhesion. It is shown that as the work of adhesion increases, both the static friction and the pre-sliding distance increase due to the increase in the contact repulsive force. In the second case, the rough interfaces of a glass ball against a silicon wafer and against a DLC (diamond-like carbon) coating are considered. The work of adhesion is assumed to be identical for both interfaces. As adhesion depends on the interface roughness, the corresponding contact repulsive force differs between these interfaces. For the smoother interface, a larger contact repulsive force and, consequently, a larger static friction force and pre-sliding distance are observed.
Keywords: boundary element model, frictional hysteresis, adhesion, roughness, pre-sliding
Procedia PDF Downloads 168
7068 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator
Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira
Abstract:
True boiling point (TBP) distillation is one of the most common experimental techniques for the determination of petroleum properties. The TBP curve provides information about the performance of the petroleum in terms of its cuts. The experiment takes a few days to perform. Simulation techniques can determine the properties faster, using software that calculates the distillation curve from limited information about the crude oil. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were entered into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (software versus ASTM) and 0.2%-5.1% (software versus Spaltrohr).
Keywords: distillation curve, petroleum distillation, simulation, true boiling point curve
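The reported error ranges amount to point-wise percent deviations between the experimental and extrapolated curves; a trivial sketch of that comparison (the pairing of experimental and predicted temperatures at matched cut points is an assumption):

```python
def percent_error(experimental, predicted):
    """Point-wise absolute percent deviation between two TBP curves (temperatures in K)."""
    return [abs(p - e) / e * 100.0 for e, p in zip(experimental, predicted)]
```

Each list would hold boiling temperatures at the same distilled-volume fractions, so the maximum of the returned list corresponds to the worst-case figure quoted in the abstract.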
Procedia PDF Downloads 442
7067 Sensitivity of Credit Default Swaps Premium to Global Risk Factor: Evidence from Emerging Markets
Authors: Oguzhan Cepni, Doruk Kucuksarac, M. Hasan Yilmaz
Abstract:
The risk premiums of emerging markets move together, depending on the momentum of and shifts in global risk appetite. However, the magnitudes of these changes in the risk premium can vary across emerging market economies. In this paper, we focus on how the global risk factor affects the credit default swap (CDS) premiums of emerging markets, using principal component analysis (PCA) and rolling regressions. The PCA results indicate that the first common component accounts for almost 76% of the common variation in the CDS premiums of emerging markets. Additionally, the explanatory power of the first factor remains high over the sample period. However, the sensitivity to the global risk factor tends to change over time and across countries. In this regard, fixed effects panel regressions are employed to identify the macroeconomic factors driving the heterogeneity across emerging markets. Two main macroeconomic variables affect the sensitivity: government debt to GDP and international reserves to GDP. Countries with lower government debt and higher reserves tend to be less subject to variations in global risk appetite.
Keywords: emerging markets, principal component analysis, credit default swaps, sovereign risk
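A rough numpy-only sketch of the two-step analysis described above: extract the first principal component of the CDS panel as the global risk factor, then estimate each country's time-varying sensitivity with rolling OLS. The windowing details are assumptions, as the abstract does not state them:

```python
import numpy as np

def first_pc(cds):
    """First principal component (the common factor) of a T x N panel of CDS premiums."""
    X = cds - cds.mean(axis=0)                       # demean each country series
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    factor = U[:, 0] * S[0]                          # factor time series
    explained = S[0] ** 2 / (S ** 2).sum()           # share of variance explained
    return factor, explained

def rolling_beta(y, factor, window):
    """OLS slope of one country's premium on the common factor over a moving window."""
    betas = []
    for t in range(window, len(y) + 1):
        f, yy = factor[t - window:t], y[t - window:t]
        betas.append(np.polyfit(f, yy, 1)[0])        # slope of the fitted line
    return np.array(betas)
```

In the paper's setting, `explained` is the ~76% figure, and the dispersion of the rolling betas across countries is what the fixed-effects panel regressions then try to explain.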
Procedia PDF Downloads 381
7066 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary B. Wills, Andy M. Gravell
Abstract:
Communicating and managing customers’ requirements plays a vital role in the software development process. While this is difficult to do locally, it is even more difficult to communicate requirements across distributed boundaries and to convey them to multiple distributed customers. This paper discusses the communication of multiple distributed customers’ requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). We then evaluate that model by presenting the findings of a case study conducted with a company running customisation projects for 18 distributed customers, and compare the outputs of the real process with the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenge of communicating requirements across distributed organisational boundaries, as well as the delay in decision making and in the overall customisation process time.
Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 430
7065 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases
Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García
Abstract:
This paper presents an approach to reducing some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them, and formalises the model as JSON documents. This formal model is stored in a document-oriented NoSQL database, namely MongoDB, chosen for its flexibility and efficiency. In addition, the paper underlines the contributions of the approach and outlines applications and benefits for future work in the field of automatic code generation using model-driven engineering tools.
Keywords: conceptual modelling, JSON, NoSQL databases, requirements engineering, software development
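A minimal sketch of formalising one requirement as a JSON document of the kind such an approach would store in MongoDB; the field names and the required-field set are assumptions for illustration, not the paper's actual schema:

```python
import json

# Illustrative schema: every requirement document must carry these fields.
REQUIRED_FIELDS = {"id", "title", "type", "priority", "description"}

def make_requirement(req_id, title, req_type, priority, description, relations=None):
    """Formalise one software requirement as a JSON document string."""
    doc = {
        "id": req_id,
        "title": title,
        "type": req_type,              # e.g. "functional" / "non-functional"
        "priority": priority,
        "description": description,
        "relations": relations or [],  # links to other requirement ids
    }
    missing = REQUIRED_FIELDS - doc.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return json.dumps(doc)
```

The resulting document (parsed back to a dict) could then be inserted into a MongoDB collection with pymongo's `insert_one`; the document-oriented store imposes no fixed schema, which is the flexibility the paper appeals to.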
Procedia PDF Downloads 379
7064 Measurement and Analysis of Radiation Doses to Radiosensitive Organs from CT Examination of the Cervical Spine Using Radiochromic Films and Monte Carlo Simulation Based Software
Authors: Khaled Soliman, Abdullah Alrushoud, Abdulrahman Alkhalifah, Raed Albathi, Salman Altymiat
Abstract:
The radiation dose received by patients undergoing computed tomography (CT) examination of the cervical spine was evaluated using Gafchromic XR-QA2 films and the CT-Expo software (ver. 2.3), in order to document our clinical dose values and to compare our results with benchmarks reported in the current literature. Radiochromic films have recently been used as a practical dosimetry tool that provides dose profile information not available with the standard ionisation chamber routinely used in CT dosimetry. We developed an in-house program that uses the films to calculate the entrance dose-length product (EDLP, in mGy.cm) and to relate the EDLP to various organ doses calculated using the CT-Expo software. We also calculated a conversion factor (in mSv/mGy.cm) relating the EDLP to the effective dose (ED) from the examination using CT-Expo. Variability among different types of CT scanners and dose modulation methods is reported for at least three major CT brands available at our medical institution. This work describes the dosimetry method and reports the results. The method can be used as an in-vivo dosimetry method, but this work only reports results obtained from studies of an adult female anthropomorphic phantom.
Keywords: CT dosimetry, gafchromic films, XR-QA2, CT-Expo software
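The EDLP-to-effective-dose relation described above amounts to an integration of the film dose profile along the scan length followed by a multiplication by the conversion factor; a sketch with hypothetical numbers (the sampling step and factor value are illustrative, not the paper's):

```python
def edlp(dose_profile_mgy, step_cm):
    """Entrance dose-length product (mGy.cm): integrate the film dose profile over scan length."""
    return sum(dose_profile_mgy) * step_cm

def effective_dose(edlp_val, k):
    """Effective dose (mSv), given a conversion factor k in mSv/(mGy.cm)."""
    return edlp_val * k
```

A dose profile read off the radiochromic film at fixed spatial steps feeds `edlp`; `k` is the scanner- and protocol-specific factor the authors derive with CT-Expo.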
Procedia PDF Downloads 471
7063 An Architectural Model of Multi-Agent Systems for Student Evaluation in Collaborative Game Software
Authors: Monica Hoeldtke Pietruchinski, Andrey Ricardo Pimentel
Abstract:
Teaching computer programming to beginners has been presented to the community as a far-from-trivial task. Several methodologies and research tools have been developed; however, the problem remains. This paper presents a multi-agent system architecture to be incorporated into educational collaborative game software for teaching programming, which monitors, evaluates, and encourages collaboration by the participants. A literature review covers the concepts of collaborative learning, multi-agent systems, collaborative games, and techniques for teaching programming using these concepts simultaneously.
Keywords: architecture of multi-agent systems, collaborative evaluation, collaboration assessment, gamifying educational software
Procedia PDF Downloads 464
7062 Body Mass Components in Young Soccer Players
Authors: Elizabeta Sivevska, Sunchica Petrovska, Vaska Antevska, Lidija Todorovska, Sanja Manchevska, Beti Dejanova, Ivanka Karagjozova, Jasmina Pluncevic Gligoroska
Abstract:
Introduction: Body composition plays an important role in the selection of young soccer players and is associated with successful performance. The most commonly used model of body composition divides the body into two compartments: the fat component and the fat-free mass (muscular and bone components). The aims of the study were to determine the body composition parameters of young male soccer players and to show the differences between age groups. Material and methods: A sample of 52 young male soccer players, aged 9 to 14 years, was divided into two age groups (group 1: 9 to 12 years; group 2: 12 to 14 years). Anthropometric measurements were taken according to the method of Matiegka: body weight, body height, circumferences (arm, forearm, thigh, and calf), diameters (elbow, knee, wrist, and ankle), and skinfold thicknesses (biceps, triceps, thigh, leg, chest, and abdomen). The measurements were used in Matiegka's equations. Results: Body mass components were analyzed as absolute values (in kilograms) and as percentages: the muscular component (MC kg and MC%), the bone component (BC kg and BC%), and the body fat (BF kg and BF%). The group up to 12 years showed the following mean values: MM = 21.5 kg; MM% = 46.3%; BC = 8.1 kg; BC% = 19.1%; BF = 6.3 kg; BF% = 15.7%. The group aged 12-14 years had the following mean values: MM = 25.6 kg; MM% = 48.2%; BC = 11.4 kg; BC% = 21.6%; BF = 8.5 kg; BF% = 14.7%. Conclusions: The young soccer players aged 12 to 14 years, who are in the pre-pubertal phase of growth and development, had a higher bone component (p<0.05) than the younger players. There was no significant difference in the muscular and fat components between the two groups.
Keywords: body composition, young soccer players, body fat, fat-free mass
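Matiegka's full anthropometric equations are not reproduced here; this sketch only shows the trivial conversion from absolute component masses (in kg) to the percentage values (MC%, BC%, BF%) reported in the abstract, with hypothetical inputs:

```python
def component_percentages(weight_kg, muscle_kg, bone_kg, fat_kg):
    """Express absolute body-mass components as percentages of total body weight."""
    pct = lambda x: round(100.0 * x / weight_kg, 1)
    return {"MC%": pct(muscle_kg), "BC%": pct(bone_kg), "BF%": pct(fat_kg)}
```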
Procedia PDF Downloads 458
7061 Evaluation of SDS (Software Defined Storage) Controller (CoprHD) for Various Storage Demands
Authors: Shreya Bokare, Sanjay Pawar, Shika Nema
Abstract:
Growth in cloud applications is generating a tremendous amount of data, placing a load on traditional storage management systems. Software-defined storage (SDS) is a new storage management concept that is becoming popular for handling this large amount of data. CoprHD is an open-source SDS controller available for experimentation and development in the storage industry. In this paper, the storage management techniques provided by CoprHD for managing heterogeneous storage platforms are tested and analyzed. Storage management parameters such as time to provision, storage capacity measurement, and heterogeneity are evaluated experimentally, along with a theoretical expression, to demonstrate the completeness of the CoprHD controller for storage management.
Keywords: software defined storage, SDS, CoprHD, open source, SMI-S simulator, clarion, Symmetrix
Procedia PDF Downloads 313
7060 Brain-Computer Interface System for Lower Extremity Rehabilitation of Chronic Stroke Patients
Authors: Marc Sebastián-Romagosa, Woosang Cho, Rupert Ortner, Christy Li, Christoph Guger
Abstract:
Neurorehabilitation based on brain-computer interfaces (BCIs) shows important rehabilitation effects for patients after stroke. Previous studies have shown improvements for patients who are in the chronic stage and/or have severe hemiparesis, who are particularly challenging for conventional rehabilitation techniques. For this publication, seven stroke patients in the chronic phase with hemiparesis in the lower extremity were recruited. All of them participated in 25 BCI sessions, about 3 times a week. The BCI system was based on motor imagery (MI) of the paretic ankle dorsiflexion and healthy wrist dorsiflexion, with functional electrical stimulation (FES) and avatar feedback. Assessments were conducted before, during, and after the rehabilitation training to assess changes in motor function. Our primary measures were the 10-meter walking test (10MWT), range of motion (ROM) of the ankle dorsiflexion, and Timed Up and Go (TUG). The results show a significant increase in gait speed in the primary measure, the 10MWT at fast velocity, of 0.18 m/s, IQR = [0.12 to 0.2], P = 0.016. The speed in the TUG also increased significantly, by 0.1 m/s, IQR = [0.09 to 0.11], P = 0.031. The active ROM increased by 4.65°, IQR = [1.67 to 7.4], after rehabilitation training, P = 0.029. These functional improvements persisted at least one month after the end of the therapy. These outcomes show the feasibility of this BCI approach for chronic stroke patients and further support the growing consensus that such tools might develop into a new paradigm of rehabilitation tools for stroke patients. However, the results are from only seven chronic stroke patients, so the authors believe this approach should be further validated in broader randomized controlled studies involving more patients. MI- and FES-based non-invasive BCIs are showing improvement in the gait rehabilitation of patients in the chronic stage after stroke. This could have an impact on the rehabilitation techniques used for these patients, especially when they are severely impaired and their mobility is limited.
Keywords: neuroscience, brain computer interfaces, rehabilitation, stroke
Procedia PDF Downloads 92
7059 Bug Localization on Single-Line Bugs of Apache Commons Math Library
Authors: Cherry Oo, Hnin Min Oo
Abstract:
Software bug localization is one of the most costly tasks in program repair. There is therefore strong demand for automated bug localization techniques that can guide programmers to the locations of bugs with minimal human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of the program traces to produce a ranking of the most probably buggy modules. Using the Apache Commons Math library project, we study the diagnostic accuracy of our spectrum-based bug localization metric. Our outcomes show that the higher performance of a specific similarity coefficient, used to inspect the program spectra, is mostly effective in localizing single-line bugs.
Keywords: software testing, bug localization, program spectra, bug
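The abstract does not name its similarity coefficient; assuming the widely used Ochiai coefficient, the spectrum-based ranking step it describes can be sketched as:

```python
import math

def ochiai(failed_cover, passed_cover, total_failed):
    """Ochiai suspiciousness of one program element from its program-spectrum counts:
    failed_cover / sqrt(total_failed * (failed_cover + passed_cover))."""
    if failed_cover == 0:
        return 0.0
    return failed_cover / math.sqrt(total_failed * (failed_cover + passed_cover))

def rank_elements(spectra, total_failed):
    """spectra: {element: (failing tests covering it, passing tests covering it)}.
    Returns elements ordered from most to least suspicious."""
    scored = {e: ochiai(ef, ep, total_failed) for e, (ef, ep) in spectra.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

An element covered by every failing test and no passing test scores 1.0 and tops the ranking, which is exactly the behaviour wanted for single-line bugs.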
Procedia PDF Downloads 143
7058 Case Study Analysis of 2017 European Railway Traffic Management Incident: The Application of System for Investigation of Railway Interfaces Methodology
Authors: Sanjeev Kumar Appicharla
Abstract:
This paper presents the results of the modelling and analysis of a European Railway Traffic Management System (ERTMS) safety-critical incident on the Cambrian Railway in the UK, using report RAIB 17/2019 as the primary input, in order to raise awareness of biases in the systems engineering process. The RAIB, the UK's independent accident investigator, published report RAIB 17/2019 giving the details of its investigation of the focal event in the form of the immediate cause, causal factors, underlying factors, and recommendations to prevent a repeat of the safety-critical incident on the Cambrian Line. The System for Investigation of Railway Interfaces (SIRI) is the methodology used to model and analyse the incident. SIRI uses the Swiss cheese model to model the incident and identifies latent failure conditions (potentially less-than-adequate conditions) by means of the management oversight and risk tree technique. The benefits of the SIRI methodology are threefold. First, it incorporates the "heuristics and biases" approach, advanced by the 2002 Nobel laureate in Economic Sciences, Prof. Daniel Kahneman, into the management oversight and risk tree technique to identify systematic errors. Civil engineering and programme management railway professionals are aware of the role "optimism bias" plays in programme cost overruns, and of bow-tie (fault and event tree) model-based safety risk modelling techniques; however, the role of systematic errors due to heuristics and biases is not yet appreciated. This overcomes the omission of human and organizational factors from accident analysis.
Second, the scope of the investigation includes all levels of the socio-technical system, including government, regulatory and railway safety bodies, duty holders, signalling firms, transport planners, and front-line staff, so that lessons are learned at the decision-making and implementation levels as well. Third, the author's past accident case studies are supplemented with evidence drawn from practitioners' and academic researchers' publications, in order to discuss the role of systems thinking in improving decision-making and risk management processes and practices in the IEC 15288 systems engineering standard and in industrial contexts such as GB railways and artificial intelligence (AI).
Keywords: accident analysis, AI algorithm internal audit, bounded rationality, Byzantine failures, heuristics and biases approach
Procedia PDF Downloads 190
7057 Creation of Computerized Benchmarks to Facilitate Preparedness for Biological Events
Abstract:
Introduction: Communicable diseases and pandemics pose a growing threat to the well-being of the global population. A vital component of protecting public health is the creation and sustenance of continuous preparedness for such hazards. A joint Israeli-German task force was deployed to develop an advanced tool for self-evaluation of emergency preparedness for various types of biological threats. Methods: Based on a comprehensive literature review and interviews with leading content experts, an evaluation tool was developed, based on quantitative and qualitative parameters and indicators. A modified Delphi process was used to achieve consensus among over 225 experts from Germany and Israel concerning the items to be included in the evaluation tool. The validity and applicability of the tool for medical institutions were examined in a series of simulation and field exercises. Results: Over 115 German and Israeli experts reviewed and examined the proposed parameters as part of the modified Delphi cycles. A consensus of over 75% of the experts was attained for 183 out of 188 items. The relative importance of each parameter was rated as part of the Delphi process, in order to define its impact on overall emergency preparedness. The parameters were integrated into computerized web-based software that calculates scores of emergency preparedness for biological events. Conclusions: The parameters developed in the joint German-Israeli project serve as benchmarks that delineate actions to be implemented in order to create and maintain ongoing preparedness for biological events. The computerized evaluation tool makes it possible to continuously monitor the level of readiness, so that strengths and gaps can be identified and corrected appropriately. Adoption of such a tool is recommended as an integral component of quality assurance of public health and safety.
Keywords: biological events, emergency preparedness, bioterrorism, natural biological events
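A sketch of how weighted parameters can be rolled into a single preparedness score of the kind the web-based tool computes; the scoring rule (weighted fraction of items met, scaled to 0-100) is an assumption, since the abstract does not disclose the tool's formula:

```python
def preparedness_score(items):
    """items: list of (met, weight) pairs, where met is 0/1 or a fraction in [0, 1]
    and weight is the relative importance rated in the Delphi process.
    Returns a 0-100 weighted readiness score."""
    total_weight = sum(w for _, w in items)
    achieved = sum(float(m) * w for m, w in items)
    return 100.0 * achieved / total_weight
```

Scoring each of the 183 consensus items this way and tracking the total over time is what would let an institution spot emerging gaps.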
Procedia PDF Downloads 425
7056 Performance Evaluation of Sand Casting Manufacturing Plant with WITNESS
Authors: Aniruddha Joshi
Abstract:
This paper discusses a simulation study of an automated sand casting production system. The first aim of this study is the development of an automated sand casting process model and the analysis of this model with the simulation software WITNESS. The production methodology aims to improve overall productivity through the elimination of wastes, which leads to improved quality. Integrating automation with simulation is beneficial for identifying obstacles to implementation and choosing appropriate options for implementing it successfully. Of the different simulation software packages available for this integration, WITNESS was used to create the model, which is based on a literature review. The input parameters are setup time, number of machines, and cycle time; the output parameters are the number of castings, average time, and percentage usage of machines. The obtained results are used for statistical analysis, which concludes with the optimal solution for maximum output.
Keywords: automated sand casting production system, simulation, WITNESS software, performance evaluation
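Before building the full WITNESS model, the input/output relationship described above can be sanity-checked with a deterministic back-of-envelope calculation; this sketch ignores breakdowns and queuing, and all numbers are hypothetical:

```python
def line_output(num_machines, setup_time_min, cycle_time_min, shift_minutes):
    """Deterministic estimate of castings produced per shift and machine utilisation (%).
    Assumes one setup per machine per shift and no downtime or blocking."""
    productive = max(0.0, shift_minutes - setup_time_min)
    per_machine = int(productive // cycle_time_min)   # whole castings per machine
    castings = num_machines * per_machine
    utilisation = 100.0 * (per_machine * cycle_time_min) / shift_minutes
    return castings, utilisation
```

A discrete-event simulator such as WITNESS refines this estimate by modelling variability, buffers, and machine interactions that the closed-form figure cannot capture.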
Procedia PDF Downloads 789
7055 Develop a Software to Hydraulic Redesign a Depropanizer Column to Minimize Energy Consumption
Authors: Mahdi Goharrokhi, Rasool Shiri, Eiraj Naser
Abstract:
A depropanizer column of a particular refinery was redesigned in this work: the minimum reflux ratio, minimum number of trays, feed tray location, and hydraulic characteristics of the tower were calculated and compared with the actual values of the existing tower. To review the design of the tower, fundamental equations were used to develop software whose results were compared with the results of two commercial software packages. In each case, the PR EOS was used. Based on the total energy consumption in the reboiler and condenser, the feed tray location was also determined using a case study definition for the tower.
Keywords: column, hydraulic design, pressure drop, energy consumption
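The minimum number of trays mentioned above is classically estimated with the Fenske equation at total reflux; a sketch with illustrative compositions (the abstract does not state which shortcut correlations its software uses, so Fenske is an assumption):

```python
import math

def fenske_min_stages(x_lk_dist, x_hk_dist, x_lk_bott, x_hk_bott, alpha_avg):
    """Fenske equation: minimum number of equilibrium stages at total reflux.
    x_lk/x_hk are light/heavy key mole fractions in distillate and bottoms;
    alpha_avg is the average relative volatility of the key components."""
    separation = (x_lk_dist / x_hk_dist) * (x_hk_bott / x_lk_bott)
    return math.log(separation) / math.log(alpha_avg)
```

For a depropanizer, the light key would be propane and the heavy key butane; the companion Underwood equation would supply the minimum reflux ratio the abstract also reports.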
Procedia PDF Downloads 424
7054 Software-Defined Radio Based Channel Measurement System of Wideband HF Communication System in Low-Latitude Region
Authors: P. H. Mukti, I. Kurniawati, F. Oktaviansyah, A. D. Adhitya, N. Rachmadani, R. Corputty, G. Hendrantoro, T. Fukusako
Abstract:
HF communication is an attractive field among researchers since it can reach long-distance areas at low cost. This long-distance communication is achieved by exploiting the ionosphere as a transmission medium for the HF radio wave. However, due to the dynamic nature of the ionosphere, the channel characteristics of HF communication have to be investigated in order to give better performance. Many techniques to characterize the HF channel are available in the literature. However, none of those techniques describes the HF channel characteristics in low-latitude regions, especially equatorial areas. Since the ionosphere around the equatorial region exhibits the equatorial spread F (ESF) phenomenon, characterizing the wideband HF channel in the low-latitude region is an important investigation. Meanwhile, the appearance of software-defined radio has attracted the interest of many researchers. Accordingly, in this paper an SDR-based channel measurement system is proposed for characterizing the HF channel in the low-latitude region.
Keywords: channel characteristic, HF communication system, LabVIEW, software-defined radio, universal software radio peripheral
Procedia PDF Downloads 489
7053 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider
Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón
Abstract:
The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. Better measurements could be achieved by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all the other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the detector control system (DCS). The DCS must be designed to operate this detector safely and correctly. Furthermore, the DCS must provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-conditions data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for taking data under the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters, such as the nominal voltages for each photomultiplier tube (PMT), their threshold levels for accepting or rejecting incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0, and they have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and as scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system integrated into the global control system of the ALICE experiment.
Keywords: AD0, ALICE, DCS, LHC
Procedia PDF Downloads 306
7052 Implications of Learning Resource Centre in a Web Environment
Authors: Darshana Lal, Sonu Rana
Abstract:
Learning Resource Centres (LRCs) acquire different kinds of documents, such as books, journals, theses, dissertations, standards, and databases, in print and electronic form. This article deals with the different types of sources available in an LRC. It also discusses the concept of the web as a tool and as a multimedia system, and the different interfaces available on the web. The reasons for establishing an LRC are highlighted along with its assignments. Different features of LRCs, such as self-learning and group learning, are described, together with supported activities such as reading, learning, and education. The use of the LRC by students and faculty is presented, and the article concludes with its benefits.
Keywords: internet, search engine, resource centre, opac, self-learning, group learning
Procedia PDF Downloads 379
7051 Statistical Analysis with Prediction Models of User Satisfaction in Software Project Factors
Authors: Katawut Kaewbanjong
Abstract:
We analyzed a large volume of data and identified software project factors significantly associated with user satisfaction. A statistical significance analysis (logistic regression) and a collinearity analysis determined the significant factors from a group of 71 pre-defined factors across 191 software projects in ISBSG Release 12. The eight prediction models used for testing the predictive potential of these factors were neural network, k-NN, Naïve Bayes, random forest, decision tree, gradient boosted tree, linear regression, and logistic regression. Fifteen pre-defined factors were truly significant in predicting user satisfaction, and they provided 82.71% prediction accuracy when used with the neural network model. These factors were: client-server, personnel changes, total defects delivered, project inactive time, industry sector, application type, development type, how methodology was acquired, development techniques, decision-making process, intended market, size estimate approach, size estimate method, cost recording method, and effort estimate method. These findings may benefit software development managers considerably.
Keywords: prediction model, statistical analysis, software project, user satisfaction factor
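As a minimal illustration of one of the eight model families compared above (not the authors' setup, and not the ISBSG data), a k-nearest-neighbour classifier over made-up project-factor vectors can be sketched in a few lines:

```python
# Minimal k-NN classifier sketch over invented project-factor vectors.
# The features and labels are illustrative only; they are not the
# ISBSG Release 12 data or the paper's 71 pre-defined factors.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Predict a label by majority vote among the k nearest neighbours."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Each row: (feature vector, user satisfied? 1/0)
train = [
    ((1.0, 0.2), 1), ((0.9, 0.1), 1), ((0.8, 0.3), 1),
    ((0.1, 0.9), 0), ((0.2, 0.8), 0), ((0.0, 1.0), 0),
]
print(knn_predict(train, (0.95, 0.15)))  # query near the satisfied cluster
```

In the study itself, each model's accuracy was estimated on held-out projects; this sketch shows only the prediction step.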
Procedia PDF Downloads 124
7050 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema
Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy
Abstract:
Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone: end users could program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data by using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group, and select) using the system and also in Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel than in ISPM, since they selected the cells directly with the mouse. By bringing natural language to end-user software engineering, ISPM helps overcome the present bottleneck of professional developers.
Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet
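The paper's machine-learning approach to schema inference is not detailed in the abstract. A much simpler heuristic sketch (an assumption for illustration, not the ISPM algorithm) treats the first all-text row that is followed by a row containing numbers as the table header:

```python
# Naive header-detection heuristic for a spreadsheet-like grid.
# A simplified stand-in for ISPM's machine-learning-based schema
# inference, which the abstract does not spell out.

def is_numeric(cell):
    try:
        float(cell)
        return True
    except (TypeError, ValueError):
        return False

def infer_header(grid):
    """Return (row_index, header) for the first all-text row whose
    following row contains at least one numeric cell."""
    for i, row in enumerate(grid[:-1]):
        if all(not is_numeric(c) for c in row) and any(is_numeric(c) for c in grid[i + 1]):
            return i, row
    return None

grid = [
    ["Quarterly report"],            # title line, not a header
    ["Region", "Sales", "Returns"],  # header row
    ["North", "120", "4"],
    ["South", "95", "7"],
]
print(infer_header(grid))
```

Once the header is known, a natural language query like "sort by Sales" can be resolved to the matching column index.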
Procedia PDF Downloads 313
7049 Impact Analysis Based on Change Requirement Traceability in Object Oriented Software Systems
Authors: Sunil Tumkur Dakshinamurthy, Mamootil Zachariah Kurian
Abstract:
Change requirement traceability in object-oriented software systems is one of the challenging areas of research. The trace links between different artifacts need to be automated or semi-automated across the software development life cycle (SDLC). The aim of this paper is to discuss and implement aspects of dynamically linking artifacts such as requirements, high-level design, code, and test cases through the Extensible Markup Language (XML) or by dynamically generating object-oriented (OO) metrics. Non-functional requirement (NFR) aspects such as stability, completeness, clarity, validity, feasibility, and precision are also discussed. We discuss this as a fifth taxonomy, which is a system vulnerability concern.
Keywords: artifacts, NFRs, OO metrics, SDLC, XML
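The abstract proposes linking artifacts through XML but does not publish a schema, so the element and attribute names below are illustrative assumptions. A trace link from one requirement to its design, code, and test artifacts might be represented and queried like this:

```python
# Illustrative traceability links encoded in XML and queried with the
# standard library. Element and attribute names are assumptions; the
# paper does not publish its schema.
import xml.etree.ElementTree as ET

TRACE_XML = """
<traceability>
  <requirement id="REQ-12">
    <design ref="HLD-3.1"/>
    <code ref="src/payment.py"/>
    <test ref="TC-47"/>
  </requirement>
</traceability>
"""

def artifacts_for(requirement_id, xml_text):
    """Collect every artifact reference linked to one requirement."""
    root = ET.fromstring(xml_text)
    req = root.find(f"requirement[@id='{requirement_id}']")
    if req is None:
        return {}
    return {child.tag: child.get("ref") for child in req}

print(artifacts_for("REQ-12", TRACE_XML))
```

With links in this form, impact analysis for a change request reduces to collecting every artifact reachable from the affected requirement.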
Procedia PDF Downloads 342
7048 The Chewing Gum Confectionary Development for Oral Hygiene with Nine Hour Oral Antibacterial Activity
Authors: Yogesh Bacchaw, Ashish Dabade
Abstract:
Oral health is a growing concern in society. Acid-producing microorganisms change the oral pH and create a favorable environment for microbial growth. This growth promotes not only dental decay but also bad breath. A component listed as Generally Recognized As Safe (GRAS) was incorporated into chewing gum as an antimicrobial agent. The chewing gum produced exhibited up to 9 hours of antimicrobial activity against oral microflora. The toxicity of the GRAS component per RACC value of the chewing gum was negligible compared to the actual toxicity level of the GRAS component. The antibacterial efficiency of the chewing gum was tested using total plate count (TPC) and colony forming units (CFU). Nine hours were required for the microflora to return to the TPC/CFU level measured before chewing gum consumption. This chewing gum not only provides a mouth-freshening effect but also helps reduce dental decay and bad breath and supports enamel whitening.
Keywords: colony forming unit (CFU), chewing gum, generally recognized as safe (GRAS), microbial growth, microorganisms, oral health, RACC, total plate count (TPC), antimicrobial agent, enamel whitening, oral pH
Procedia PDF Downloads 316
7047 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners plan testing and inspection resources in the early phases of software development. However, a major challenge faced during the training of any classification model is the imbalanced nature of software quality data. A dataset with very few instances of the minority outcome category leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas a majority may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. To empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics
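The specific resampling methods and MetaCost settings are not listed in the abstract. As a generic illustration of one family of techniques it mentions, the sketch below randomly oversamples the minority (change-prone) class until the class counts balance; the data are invented:

```python
# Random oversampling sketch: duplicate minority-class samples until the
# class counts balance. The data are invented; the study's actual
# resampling methods and Android change data are not reproduced here.
import random
from collections import Counter

def oversample(samples, seed=0):
    """samples: list of (features, label). Returns a balanced copy."""
    rng = random.Random(seed)
    counts = Counter(label for _, label in samples)
    majority = max(counts.values())
    balanced = list(samples)
    for label, n in counts.items():
        pool = [s for s in samples if s[1] == label]
        balanced.extend(rng.choice(pool) for _ in range(majority - n))
    return balanced

data = [((i,), "non-change-prone") for i in range(8)] + [((99,), "change-prone")]
balanced = oversample(data)
print(Counter(label for _, label in balanced))
```

Oversampling is only one option; undersampling the majority class or cost-sensitive learners like MetaCost attack the same problem from the other side.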
Procedia PDF Downloads 418
7046 High Efficiency Double-Band Printed Rectenna Model for Energy Harvesting
Authors: Rakelane A. Mendes, Sandro T. M. Goncalves, Raphaella L. R. Silva
Abstract:
The concepts of energy harvesting and wireless energy transfer have been widely discussed in recent times. There are several ways to create autonomous systems for collecting ambient energy, such as solar, vibratory, thermal, electromagnetic, and radiofrequency (RF) sources. In the RF case it is possible to collect up to 100 μW/cm². To collect and/or transfer energy in RF systems, a device called a rectenna is used, defined as the junction of an antenna and a rectifier circuit. The rectenna presented in this work is resonant at 1.8 GHz and 2.45 GHz. The 1.8 GHz band is part of the GSM/LTE band. GSM (Global System for Mobile Communication) is a mobile-telephony frequency band, also called second-generation (2G) mobile networking; it standardized mobile telephony worldwide and was originally developed for voice traffic. LTE (Long Term Evolution), or fourth generation (4G), emerged to meet the demand for wireless access to services such as Internet access, online games, VoIP, and video conferencing. The 2.45 GHz frequency is part of the ISM (Industrial, Scientific and Medical) band, which is internationally reserved for industrial, scientific, and medical use with no need for licensing; its only restrictions relate to maximum power transfer and bandwidth, which must be kept within certain limits (in Brazil the band is 2.4-2.4835 GHz). The rectenna presented in this work was designed for efficiency above 50% at an input power of -15 dBm. For wireless energy-capture systems the signal power is very low and varies greatly, which is why this ultra-low input power was chosen. The rectenna was built on the low-cost, flame-retardant FR4 substrate. The selected antenna is a microstrip antenna consisting of a meandered dipole, optimized using the CST Studio software. This antenna has high efficiency, high gain, and high directivity. Gain describes how efficiently an antenna captures the signals transmitted by another antenna and/or station. Directivity describes how well an antenna captures energy in a given direction. The rectifier circuit has a series topology and was optimized using Keysight's ADS software. The rectifier circuit is the most complex part of the rectenna, since it includes the diode, a non-linear component. The chosen diode is the Schottky diode SMS 7630, which presents a low barrier voltage (135-240 mV) and a wider band compared to other diode types, attributes that make it well suited to this application. The rectifier circuit also uses an inductor and a capacitor, which form its input and output filters. The inductor reduces the effect of dispersion on the efficiency of the rectifier circuit. The capacitor eliminates the AC component of the rectified signal and smooths the output, reducing ripple.
Keywords: dipole antenna, double-band, high efficiency, rectenna
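The power budget above can be checked with a short conversion: an input of -15 dBm corresponds to about 31.6 µW, so 50% conversion efficiency would leave roughly 15.8 µW of DC power. This is a back-of-the-envelope check, not the authors' CST/ADS simulation:

```python
# Back-of-the-envelope check of the rectenna power budget:
# dBm -> watts conversion and the DC power implied by 50% efficiency.

def dbm_to_watts(p_dbm):
    """Convert a power level in dBm to watts (0 dBm = 1 mW)."""
    return 1e-3 * 10 ** (p_dbm / 10)

p_in = dbm_to_watts(-15)   # RF input power chosen in the paper
p_dc = 0.5 * p_in          # DC output at the stated 50% efficiency
print(f"input {p_in * 1e6:.1f} uW -> DC {p_dc * 1e6:.1f} uW")
```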
Procedia PDF Downloads 125
7045 Back to Basics: Redefining Quality Measurement for Hybrid Software Development Organizations
Authors: Satya Pradhan, Venky Nanniyur
Abstract:
As the software industry transitions from a license-based model to a subscription-based Software-as-a-Service (SaaS) model, many software development groups are using a hybrid development model that incorporates Agile and Waterfall methodologies in different parts of the organization. The traditional metrics used for measuring software quality in the Waterfall or Agile paradigms do not apply to this hybrid methodology. In addition, to respond to higher quality demands from customers and to gain a competitive advantage in the market, many companies are starting to prioritize quality as a strategic differentiator. As a result, quality metrics are included in decision-making activities all the way up to the executive level, including board-of-directors reviews. This paper presents the key challenges associated with measuring software quality in organizations using the hybrid development model. We introduce a framework called Prevention-Inspection-Evaluation-Removal (PIER) to provide a comprehensive metric definition for hybrid organizations. The framework includes quality measurements, quality enforcement, and quality decision points at different organizational levels and project milestones. The metrics framework defined in this paper is being used for all Cisco Systems products deployed on customer premises. We present several field metrics for one product portfolio (enterprise networking) to show the effectiveness of the proposed measurement system. As the results show, this metrics framework has significantly improved in-process defect management as well as field quality.
Keywords: quality management system, quality metrics framework, quality metrics, agile, waterfall, hybrid development system
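The PIER framework's own formulas are not given in the abstract. As a generic illustration of the kind of in-process metric such a system aggregates, defect removal efficiency (a standard industry measure, not necessarily part of PIER) compares defects caught before release with the total found:

```python
# Defect removal efficiency (DRE): a common in-process quality metric.
# A generic illustration only, not the PIER framework's definition.

def defect_removal_efficiency(found_pre_release, found_in_field):
    """Fraction of all known defects removed before the product shipped."""
    total = found_pre_release + found_in_field
    return found_pre_release / total if total else 1.0

# Hypothetical counts for one release of a product line.
print(f"DRE = {defect_removal_efficiency(190, 10):.0%}")
```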
Procedia PDF Downloads 176
7044 Digital Preservation: A Need of Tomorrow
Authors: Gaurav Kumar
Abstract:
Digital libraries have been established all over the world to create, maintain, and preserve digital materials. This paper presents the importance and objectives of digital preservation, notes that preservation requires hardware and software technology to interpret digital documents, and discusses various aspects of digital preservation.
Keywords: preservation, digital preservation, conservation, archive, repository, document, information technology, hardware, software, organization, machine readable format
Procedia PDF Downloads 590