Search results for: GPS-X 5.1 software
3958 Smart in Performance: More to Practical Life than Hardware and Software
Authors: Faten Hatem
Abstract:
This paper promotes the importance of focusing on the spatial aspects and affective factors that shape smart urbanism. Doing so helps city governance, spatial planning, and policymaking to focus on what Smart does and what it can achieve for cities in terms of performance, rather than on using the notion for prestige amid a worldwide trend towards becoming a smart city. By illustrating how this style of practice compromises the social aspects and related elements of space-making, the paper uses an interdisciplinary comparative approach to clarify the impact of this compromise on overall smart city performance. In response, the paper argues for a new meaning of urban progress that moves beyond improving the city's basic services to enhancing the actual human experience, which is essential for the development of authentic smart cities. The topic is presented under five overlooked areas: the relation between smart cities' potential and the efficiency paradox, the social aspect, connectedness with nature, the human factor, and untapped resources. These themes are not meant to be read in silos; rather, they are presented to collectively examine smart cities in performance, arguing that there is more to the practical life of smart cities than software and hardware inventions. The study is based on a case study approach, presenting Milton Keynes as a living example to learn from, while engaging various methods of data collection, including multi-disciplinary semi-structured interviews, field observations, and data mining.
Keywords: smart design, the human in the city, human needs and urban planning, sustainability, smart cities, smart
Procedia PDF Downloads 103
3957 Evaluating Emission Reduction Due to a Proposed Light Rail Service: A Micro-Level Analysis
Authors: Saeid Eshghi, Neeraj Saxena, Abdulmajeed Alsultan
Abstract:
Carbon dioxide (CO2), alongside other gases emitted into the atmosphere, causes a greenhouse effect, resulting in an increase in the average temperature of the planet. Transportation vehicles are among the main contributors to CO2 emissions. Stationary vehicles with running engines produce more emissions than moving ones, so intersections with traffic lights, which force vehicles to stand still for a period of time, produce more CO2 pollution than other parts of the road. This paper analyzes the CO2 produced by the traffic flow at the Anzac Parade Road - Barker Street intersection in Sydney, Australia, before and after the implementation of light rail transit (LRT). The data were gathered during the construction phase of the LRT by counting the vehicles on each approach of the intersection for 15 minutes during the evening rush hour of 1 week (6-7 pm, July 04-31, 2018); the counts were then multiplied by 4 to estimate hourly flows. The traffic was analyzed with the microscopic simulation software VISSIM in three stages: before implementation of the light rail, during the construction phase, and after implementation. Finally, the traffic results were input into another software package, EnViVer, to calculate the amount of CO2 emitted in one hour. The results showed that after the implementation of the light rail, CO2 will drop by a minimum of 13%. This finding provides evidence that light rail is a sustainable mode of transport.
Keywords: carbon dioxide, emission modeling, light rail, microscopic model, traffic flow
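The scaling step described above (15-minute counts expanded to hourly flows) and the before/after comparison can be sketched as follows. The emission factors, idle durations, and counts below are invented for illustration only; they are not the study's VISSIM/EnViVer parameters:

```python
def hourly_flow(count_15min):
    """Scale a 15-minute vehicle count to an hourly flow, as in the study."""
    return count_15min * 4

def co2_grams(flow_veh_h, idle_s_per_veh, ef_idle_g_s=0.8, ef_cruise_g_veh=150.0):
    """Crude per-hour CO2 estimate: a cruising share per vehicle plus a
    signal-idling share (all factors are assumed, not measured values)."""
    return flow_veh_h * (ef_cruise_g_veh + idle_s_per_veh * ef_idle_g_s)

# Hypothetical scenario: the light rail shortens average idle time at the signal.
before = co2_grams(hourly_flow(450), idle_s_per_veh=45)
after = co2_grams(hourly_flow(450), idle_s_per_veh=20)
reduction_pct = 100.0 * (before - after) / before
```

The real study derives idle times from microsimulation rather than assuming them; this sketch only shows the arithmetic shape of the comparison.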
Procedia PDF Downloads 143
3956 Importance of Developing a Decision Support System for Diagnosis of Glaucoma
Authors: Murat Durucu
Abstract:
Glaucoma is a condition that leads to irreversible blindness; early diagnosis and appropriate intervention can help patients retain their vision for longer. This study addresses the importance of developing a decision support system for glaucoma diagnosis. Glaucoma occurs when pressure builds up around the eyes, damaging the optic nerve and causing deterioration of vision. The disease progresses through different levels of severity, up to blindness. Diagnosis at an early stage allows a chance for therapies that slow the progression of the disease. In recent years, imaging technologies such as Heidelberg Retinal Tomography (HRT), Stereoscopic Disc Photography (SDP), and Optical Coherence Tomography (OCT) have been used for the diagnosis of glaucoma. Owing to its better accuracy and faster imaging, OCT has become the most common method used by experts. Despite the precision and speed of OCT and HRT imaging, difficulties and mistakes still occur in the diagnosis of glaucoma, especially in the early stages, and it is difficult for doctors to reach objective results during diagnosis and staging. It therefore seems very important to develop an objective decision support system that diagnoses and grades glaucoma for patients. By using OCT images and pattern recognition systems, it is possible to develop a support system that helps doctors make their decisions on glaucoma. Thus, in this study, we develop an evaluation and support system for doctors' use. Pattern-recognition-based computer software would help doctors make an objective evaluation of their patients. After the development and evaluation processes, the system is planned to serve doctors in different hospitals.
Keywords: decision support system, glaucoma, image processing, pattern recognition
Procedia PDF Downloads 302
3955 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can relatively quickly compare a number of the most commonly adopted probability distributions and parameter estimation methods through a Windows interface. The new version of FLIKE incorporates the multiple Grubbs-Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia that compares two outlier identification tests (the original Grubbs-Beck test and the multiple Grubbs-Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs-Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs-Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs-Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
Keywords: floods, FLIKE, probability distributions, flood frequency, outlier
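The original Grubbs-Beck screen for potentially influential low flows can be sketched as below: flows are log-transformed, and values falling below a threshold built from the log mean, log standard deviation, and a sample-size-dependent critical value are flagged. The critical value k_n is normally taken from a table; here it is supplied as an assumed constant, and the flow series is invented, so this is an illustration of the idea rather than FLIKE's implementation:

```python
import math
import statistics

def grubbs_beck_low_outliers(flows, k_n):
    """Original Grubbs-Beck style low-outlier screen on log10 flows.
    k_n is the one-sided critical value for the sample size (a table
    lookup in practice; an assumed constant here). Returns the flagged
    flows and the low-outlier threshold in original units."""
    logs = [math.log10(q) for q in flows]
    mean, sd = statistics.mean(logs), statistics.stdev(logs)
    threshold = 10 ** (mean - k_n * sd)   # flows below this are flagged
    return [q for q in flows if q < threshold], threshold

# Invented annual-maximum series with one suspiciously small value:
flows = [5.0, 120.0, 150.0, 180.0, 210.0, 260.0, 300.0, 340.0]
low, thr = grubbs_beck_low_outliers(flows, k_n=2.0)
```

The multiple Grubbs-Beck test extends this by re-applying the screen recursively so that several clustered low flows can be detected, which a single pass can miss.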
Procedia PDF Downloads 450
3954 Insulin Receptor Substrate-1 (IRS1) and Transcription Factor 7-Like 2 (TCF7L2) Gene Polymorphisms Associated with Type 2 Diabetes Mellitus in Eritreans
Authors: Mengistu G. Woldu, Hani Y. Zaki, Areeg Faggad, Badreldin E. Abdalla
Abstract:
Background: Type 2 diabetes mellitus (T2DM) is a complex, degenerative, and multi-factorial disease responsible for huge mortality and morbidity worldwide. Even though a relatively significant number of studies have been conducted on the genetics of this disease in the developed world, there is a huge information gap in the sub-Saharan Africa region in general and in Eritrea in particular. Objective: The principal aim of this study was to investigate the association of common variants of the Insulin Receptor Substrate 1 (IRS1) and Transcription Factor 7-Like 2 (TCF7L2) genes with T2DM in the Eritrean population. Method: In this cross-sectional case-control study, 200 T2DM patients and 112 non-diabetic subjects participated, and genotyping of the IRS1 (rs13431179, rs16822615, rs16822644, rs1801123) and TCF7L2 (rs7092484) tag SNPs was carried out using the PCR-RFLP method. Haplotype analyses were carried out using Plink version 1.07 and Haploview 4.2 software. Linkage disequilibrium (LD) and Hardy-Weinberg equilibrium (HWE) analyses were performed using the Plink software. All descriptive statistical analyses were carried out using SPSS (version 20) software. Throughout the analysis, a p-value ≤0.05 was considered statistically significant. Result: A significant association was found between the rs13431179 SNP of the IRS1 gene and T2DM under the recessive model of inheritance (OR=9.00, 95%CI=1.17-69.07, p=0.035), and a marginally significant association was found in the genotypic model (OR=7.50, 95%CI=0.94-60.06, p=0.058). The rs7092484 SNP of the TCF7L2 gene also showed a markedly significant association with T2DM in the recessive (OR=3.61, 95%CI=1.70-7.67, p=0.001) and allelic (OR=1.80, 95%CI=1.23-2.62, p=0.002) models. Moreover, eight haplotypes of the IRS1 gene were found to have significant associations with T2DM (p=0.013 to 0.049).
Assessments of the interactions of the rs13431179 and rs7092484 genotypes with various parameters demonstrated that high-density lipoprotein (HDL), low-density lipoprotein (LDL), waist circumference (WC), and systolic blood pressure (SBP) give the best models for predicting T2DM onset. Furthermore, genotypes of the rs7092484 SNP showed significant associations with various atherogenic indexes (atherogenic index of plasma, LDL/HDL, and CHOL/HDL), and Eritreans carrying the GG or GA genotypes were predicted to be more susceptible to the onset of cardiovascular diseases. Conclusions: The results of this study suggest that IRS1 (rs13431179) and TCF7L2 (rs7092484) gene polymorphisms are associated with increased risk of T2DM in Eritreans.
Keywords: IRS1, SNP, TCF7L2, type 2 diabetes
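The odds ratios and confidence intervals reported above come from standard 2x2 contingency-table arithmetic; a minimal sketch of that calculation (Woolf's logit method for the interval, with invented counts rather than the study's genotype data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a = cases with the risk genotype,    b = controls with it,
    c = cases without the risk genotype, d = controls without it."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR), Woolf's method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts for illustration only:
or_, lo, hi = odds_ratio_ci(30, 10, 170, 102)
```

An interval that excludes 1.0 corresponds to the p < 0.05 associations the abstract reports.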
Procedia PDF Downloads 225
3953 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools
Abstract:
Internet use, intelligent communication tools, and social media have become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in the digital environment, has become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data on the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. In addition, because the process depends on the examiner's experience, results may vary between cases, and relevant evidence may be overlooked. In this study, a hash-based block matching and digital evidence evaluation method is proposed, which aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
Keywords: block matching, digital evidence, hash list, evaluation of digital evidence
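A minimal sketch of the hash-based block matching idea: split the evidence image into fixed-size blocks, hash each block, and look the hashes up in a list of known crime-related content. The block size and hash algorithm here are assumptions for illustration, not the paper's choices:

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for illustration

def block_hashes(data, block_size=BLOCK_SIZE):
    """Split an evidence image buffer into fixed blocks and hash each one."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

def match_blocks(data, known_hashes):
    """Return indices of blocks whose hash appears in a known-content list."""
    known = set(known_hashes)
    return [i for i, h in enumerate(block_hashes(data)) if h in known]

# Toy evidence: two benign blocks followed by one block of "known" content.
evidence = b"A" * 8192 + b"B" * 4096
known_bad = block_hashes(b"B" * 4096)
hits = match_blocks(evidence, known_bad)
```

Because the hash list can be precomputed, lookup is a set-membership test per block, which is what makes the approach fast compared with content-by-content inspection.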
Procedia PDF Downloads 255
3952 A Mathematical Model for Studying Landing Dynamics of a Typical Lunar Soft Lander
Authors: Johns Paul, Santhosh J. Nalluveettil, P. Purushothaman, M. Premdas
Abstract:
Lunar landing is one of the most critical phases of a lunar mission. The lander is provided with a soft landing system to prevent structural damage to the lunar module by absorbing the landing shock and to assure stability during landing. Presently available software is not capable of simulating rigid body dynamics coupled with contact simulation and elastic/plastic deformation analysis. Hence, a separate mathematical model has been developed for studying the dynamics of a typical lunar soft lander. Parameters used in the analysis include lunar surface slope, coefficient of friction, initial touchdown velocity (vertical and horizontal), mass and moment of inertia of the lander, crushing force due to the energy-absorbing material in the legs, number of legs, and geometry of the lander. The mathematical model is capable of simulating plastic and elastic deformation of the honeycomb, frictional force between the landing leg and lunar soil, surface contact, lunar gravitational force, rigid body dynamics, and the linkage dynamics of the inverted tripod landing gear. The nonlinear differential equations generated for studying the dynamics of the lunar lander are solved by numerical methods, using a MATLAB program as the computing tool. The position of each kinematic joint is defined by mathematical equations for the generation of the equations of motion. All hinged locations are defined by position vectors with respect to body-fixed coordinates. The vehicle's rigid body rotations and motions about the body coordinates are due only to the external forces and moments arising from the footpad reaction force due to impact, the footpad frictional force, and the weight of the vehicle. All these forces are mathematically simulated for the generation of the equations of motion. The validation of the mathematical model is done in two phases. The first phase is the validation of the plastic deformation of the crushable elements by employing the conservation of energy principle.
The second phase is the validation of the rigid body dynamics of the model by simulating a lander model in ADAMS software after replacing the crushable elements with elastic spring elements. Simulation of plastic deformation along with rigid body dynamics and contact force cannot be modeled in ADAMS. Hence, the plastic element of the primary strut is replaced with a spring element, and the analysis is carried out in ADAMS. The same analysis is also carried out using the mathematical model, where the simulation of honeycomb crushing is replaced by elastic spring deformation, and the results are compared with the ADAMS analysis. The rotational motion of the linkages and the six-degree-of-freedom motion of the lunar lander about its CG are thus validated against ADAMS with the crushing elements replaced by spring elements. The model is also validated against the drop test results of a four-legged lunar lander. This paper presents the details of the mathematical model generated and its validation.
Keywords: honeycomb, landing leg tripod, lunar lander, primary link, secondary link
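A drastically reduced, one-degree-of-freedom version of the touchdown dynamics illustrates the energy-absorption idea: a constant honeycomb crush force decelerates the vertical velocity under lunar gravity, and the work done by the crush force can be checked against the conservation-of-energy principle used in the paper's first validation phase. The paper's model is far richer (multi-leg, 6-DOF, friction, linkages); all numbers below are assumed for illustration:

```python
G_MOON = 1.62  # lunar gravity, m/s^2 (assumed value for the sketch)

def simulate_touchdown(mass=300.0, v0=2.0, crush_force=4000.0, dt=1e-5):
    """1-DOF vertical touchdown: constant crush force acts while the
    honeycomb compresses. Explicit Euler integration until the vertical
    velocity reaches zero. Returns (peak deceleration in Earth g,
    crush stroke in metres)."""
    v, stroke = v0, 0.0
    while v > 0.0:
        a = G_MOON - crush_force / mass   # net downward acceleration
        v += a * dt
        if v > 0.0:
            stroke += v * dt
    return (crush_force / mass) / 9.81, stroke

peak_g, stroke = simulate_touchdown()
# Energy check: crush work should equal kinetic energy plus potential drop.
crush_work = 4000.0 * stroke
energy_in = 0.5 * 300.0 * 2.0 ** 2 + 300.0 * G_MOON * stroke
```

The same balance (crush work = absorbed kinetic plus potential energy) is the check the paper applies to its plastic-deformation phase.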
Procedia PDF Downloads 351
3951 Study of the Persian Gulf’s and Oman Sea’s Numerical Tidal Currents
Authors: Fatemeh Sadat Sharifi
Abstract:
In this research, a barotropic model was employed for tidal studies of the Persian Gulf and the Oman Sea, where the only forcing applied was the tidal force. To do so, a finite-difference, free-surface model, the Regional Ocean Modeling System (ROMS), was run over the Persian Gulf and Oman Sea. To analyze the flow patterns of the region, results from a limited-area configuration of the Finite Volume Community Ocean Model (FVCOM) were also used. These two water bodies were selected because both are among the most critical in terms of economy, biology, fisheries, shipping, navigation, and petroleum extraction. The modeled results were validated against OSU Tidal Prediction Software (OTPS) tides and observation data, and tidal elevations and current speeds were then interpreted through tidal analysis. Preliminary results show good accuracy in tidal heights compared with the observations and OTPS data and indicate that tidal currents are strongest in the Strait of Hormuz and in the narrow, shallow region between the Iranian coast and the islands. Furthermore, the tidal analysis clarifies that the M2 constituent has the largest amplitude. Finally, the Persian Gulf tidal currents divide into two branches: the first branch runs south towards Qatar and turns via the United Arab Emirates towards the Strait of Hormuz; the second branch, in the north and west, extends to the head of the Gulf, where it turns counterclockwise.
Keywords: numerical model, barotropic tide, tidal currents, OSU tidal prediction software, OTPS
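The tidal analysis behind statements like "the M2 constituent has the largest amplitude" reduces, for a single constituent, to a least-squares fit of a cosine and sine at the M2 frequency. A minimal sketch on synthetic data (not the model output), solving the 2x2 normal equations directly:

```python
import math

M2_PERIOD_H = 12.4206012  # hours, principal lunar semidiurnal constituent

def fit_m2(times_h, eta):
    """Least-squares fit eta(t) ~ A cos(wt) + B sin(wt) at the M2 frequency.
    Returns (amplitude, phase) such that eta ~ amp * cos(wt + phase)."""
    w = 2 * math.pi / M2_PERIOD_H
    c = [math.cos(w * t) for t in times_h]
    s = [math.sin(w * t) for t in times_h]
    scc = sum(ci * ci for ci in c)
    sss = sum(si * si for si in s)
    scs = sum(ci * si for ci, si in zip(c, s))
    sce = sum(ci * e for ci, e in zip(c, eta))
    sse = sum(si * e for si, e in zip(s, eta))
    det = scc * sss - scs * scs
    A = (sce * sss - sse * scs) / det
    B = (sse * scc - sce * scs) / det
    return math.hypot(A, B), math.atan2(-B, A)

# Synthetic elevation record: 0.8 m amplitude, -0.3 rad phase, sampled 30-minutely.
times = [0.5 * k for k in range(100)]
eta = [0.8 * math.cos(2 * math.pi / M2_PERIOD_H * t - 0.3) for t in times]
amp, phase = fit_m2(times, eta)
```

Full harmonic analysis fits many constituents simultaneously (and applies nodal corrections); this single-constituent fit only shows the core computation.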
Procedia PDF Downloads 131
3950 Hydrodynamic Modeling of the Hydraulic Threshold El Haouareb
Authors: Sebai Amal, Massuel Sylvain
Abstract:
Groundwater is the key element in the development of most semi-arid areas, where water resources are increasingly scarce due to irregular precipitation on the one hand and increasing demand on the other. This is the case for the Merguellil watershed of central Tunisia, the object of the present study, which implements a hydrodynamic model of underground flows to understand how the Kairouan plain aquifer is recharged by the boundary aquifers through the hydraulic threshold of El Haouareb. The construction of a conceptual 3D geological model with the HydroGeoBuilder software defined the geometry of the aquifers in the study area, using data acquired from the analysis of geological sections of boreholes and piezometers that cross the formations partially or fully. Overall analysis of the piezometric records of the different piezometers located at the dam indicates that the influence of the dam is felt especially in the carbonate aquifer, confirming that the dynamics of this aquifer are highly correlated with the dam's dynamics. Groundwater maps at high and low water levels of the dam show a flow that moves towards the El Haouareb threshold, discharging the waters of Ain El Beidha towards the plain of Kairouan. Steady-state hydrodynamic modeling with the FEFLOW 5.2 software simulates the hydraulic threshold at the El Haouareb dam satisfactorily. However, a sensitivity study of the different parameters shows equivalence problems and a need to calibrate the permeability of the limestones. This work could be improved by refining the steady-state timing and amending the representation of the limestones in the model.
Keywords: hydrodynamic modeling, lithological modeling, hydraulics, semi-arid, Merguellil, central Tunisia
Procedia PDF Downloads 764
3949 Analysis of Iran-Turkey Relations Based on Environmental Geopolitics
Authors: Farid Abbasi
Abstract:
Geographical spaces have different relations with each other, and neighboring geographical spaces in particular have more relations than other spaces due to their proximity. Various parameters affect the relationships between these spaces, among them environmental parameters, which have become important in recent decades and affect the political relations of the actors in neighboring spaces. For the Islamic Republic of Iran and the Republic of Turkey, two actors in the region, political relations appear to have been affected to some extent by environmental issues. On this basis, the present study examines and analyzes the political relations between the two countries from an environmental and geopolitical perspective. The method of this research is descriptive-analytical. Data analysis is based on library and field information (a questionnaire) in the form of content analysis and statistics, using the MICMAC software and Scenario Wizard. The results of the studies and the analysis of theories show that 35 indicators directly and indirectly affect Iran-Turkey relations from an environmental and geopolitical perspective, grouped into five dimensions (water resources, soil resources, vegetation, climate, and living species). Using the MICMAC method, 9 factors were extracted as key factors affecting Iran-Turkey relations, and in the scenario analysis, 10100 possible situations were generated by the Scenario Wizard software. Nine strong scenarios are presented: three with favorable or very favorable situations, three with moderate situations, and three with critical or catastrophic situations for Iran-Turkey relations from the environmental aspect.
Keywords: geopolitics, relations, Iran, Turkey, environment
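The first step of a MICMAC-style structural analysis ranks variables by summing a cross-impact matrix: the row sum of a variable gives its direct influence, and the column sum gives its dependence. A toy sketch with invented scores (the study's 35-indicator matrix is not reproduced here):

```python
def micmac_scores(matrix):
    """Direct influence and dependence from a MICMAC-style cross-impact
    matrix: entry matrix[i][j] scores how strongly variable i drives
    variable j (diagonal is zero by convention)."""
    n = len(matrix)
    influence = [sum(matrix[i]) for i in range(n)]                  # row sums
    dependence = [sum(matrix[i][j] for i in range(n)) for j in range(n)]  # column sums
    return influence, dependence

# Invented 3-variable example (e.g. water, soil, climate):
m = [[0, 3, 1],
     [2, 0, 0],
     [1, 1, 0]]
inf, dep = micmac_scores(m)
```

Variables with high influence and low dependence are the "key factors" the method extracts; the indirect classification additionally raises the matrix to successive powers.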
Procedia PDF Downloads 150
3948 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations
Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar
Abstract:
Fast simulations are critical in reducing time to market for CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip. Researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce the cache misses, we use software optimization techniques such as removal of unused functions, loop interchange, and replacing the post-increment operator with the pre-increment operator for non-primitive data types; these techniques reduced the cache misses by 18.52%, 5.34%, and 3.91%, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0. The OpenMP programming model and SIMD are used for parallelizing and vectorizing the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
Keywords: cache behaviour, network-on-chip, performance profiling, vectorization
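Booksim2.0 itself is C++, but the loop-interchange idea generalizes: make the inner loop walk along contiguous rows of a row-major structure rather than striding down columns, so consecutive accesses touch adjacent memory. A language-neutral sketch of the access-order transformation (the cache effect itself is, of course, measured in the compiled simulator, not in Python):

```python
N = 256
# Row-major 2D structure: each inner list is one contiguous row.
grid = [[(i * N + j) % 7 for j in range(N)] for i in range(N)]

def column_major_sum(g):
    """Strided access: inner loop walks down a column (cache-unfriendly
    for a row-major layout)."""
    total = 0
    for j in range(len(g[0])):
        for i in range(len(g)):
            total += g[i][j]
    return total

def row_major_sum(g):
    """Interchanged loops: inner loop walks along a row, so consecutive
    iterations access adjacent elements."""
    total = 0
    for row in g:
        for v in row:
            total += v
    return total
```

The interchange preserves the result exactly; only the traversal order (and hence the cache behaviour in a compiled language) changes.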
Procedia PDF Downloads 198
3947 Theoretical Modal Analysis of Freely and Simply Supported RC Slabs
Authors: M. S. Ahmed, F. A. Mohammad
Abstract:
This paper focuses on the dynamic behavior of reinforced concrete (RC) slabs; theoretical modal analysis was performed using two different types of boundary conditions. Modal analysis is one of the most important dynamic analyses: the analysis is modal when no external force acts on the structure. Using this method, the effects of freely and simply supported boundary conditions on the frequencies and mode shapes of RC square slabs are studied. ANSYS software was employed to build the finite element model and determine the natural frequencies and mode shapes of the slabs. The results obtained through numerical (finite element) analysis were then compared with an exact solution. The main goal of the study is to predict how the boundary conditions change the behavior of the slab structures prior to performing experimental modal analysis. Based on the results, it is concluded that the simply supported boundary condition clearly increases the natural frequencies and changes the mode shapes compared with the freely supported condition. This means that such support conditions have a direct influence on the dynamic behavior of the slabs. It is therefore suggested to use free-free boundary conditions in experimental modal analysis to precisely reflect the properties of the structure, since free-free conditions avoid the influence of poorly defined supports.
Keywords: natural frequencies, mode shapes, modal analysis, ANSYS software, RC slabs
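For the simply supported case, the exact solution referred to above is the classical Kirchhoff thin-plate formula, f_mn = (π/2)((m/a)² + (n/b)²)√(D/ρh) with D = Eh³/12(1-ν²). A sketch with assumed concrete-like properties (not the paper's slab data):

```python
import math

def plate_frequency_hz(m, n, a, b, E, h, rho, nu=0.2):
    """Natural frequency (Hz) of mode (m, n) of a simply supported thin
    rectangular plate, a x b metres, from classical Kirchhoff theory."""
    D = E * h ** 3 / (12.0 * (1.0 - nu ** 2))       # flexural rigidity, N*m
    omega = math.pi ** 2 * ((m / a) ** 2 + (n / b) ** 2) * math.sqrt(D / (rho * h))
    return omega / (2.0 * math.pi)

# Assumed values: E = 30 GPa, 150 mm thick, 2400 kg/m^3, 4 m square slab.
f11 = plate_frequency_hz(1, 1, a=4.0, b=4.0, E=30e9, h=0.15, rho=2400)
```

For a square plate the (2,2) mode falls at exactly four times the fundamental, which is a convenient sanity check on a finite element model.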
Procedia PDF Downloads 457
3946 Development of Star Image Simulator for Star Tracker Algorithm Validation
Authors: Zoubida Mahi
Abstract:
A successful satellite mission in space requires a reliable attitude and orbit control system to command, control, and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate of these sensors, and it is able to offer high-accuracy attitude control without the need for prior attitude information. There are three main approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all of the processing is done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can simulate real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool that is used to test and validate star sensor algorithms; the developed tool allows the simulation of stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise, and multiplicative noise, and several scenarios that occur in space, such as the presence of the moon, optical system problems, illumination, and false objects. We also present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
Keywords: star tracker, star simulation, star detection, centroid, noise, scenario
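The baseline against which new centroiding methods are usually compared is the intensity-weighted center of mass of a star patch. A minimal sketch of that classic method (this is the standard baseline, not the paper's new algorithm, and the toy patch is invented):

```python
def star_centroid(patch, background=0.0):
    """Intensity-weighted centroid (center of mass) of a small star image
    patch, with optional constant background subtraction. Returns (x, y)
    in pixel coordinates of the patch."""
    total = cx = cy = 0.0
    for y, row in enumerate(patch):
        for x, v in enumerate(row):
            w = max(v - background, 0.0)   # clamp negative residuals
            total += w
            cx += w * x
            cy += w * y
    return cx / total, cy / total

# Toy 4x4 patch with a star centered on pixel (1, 1):
patch = [
    [0, 1, 0, 0],
    [1, 8, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
x0, y0 = star_centroid(patch)
```

Because the weighting spreads over several pixels, the method reaches sub-pixel accuracy on defocused star spots, which is what makes star trackers so precise.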
Procedia PDF Downloads 96
3945 Simulation and Controller Tunning in a Photo-Bioreactor Applying by Taguchi Method
Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi
Abstract:
This study involves numerical simulation of a vertical plate-type photo-bioreactor to investigate the performance of the microalga Spirulina, together with control and optimization of the parameters of a digital controller by the Taguchi method, using MATLAB and Qualitek-4 software. Control is challenging because, in addition to parameters such as temperature, dissolved carbon dioxide, and biomass, the biological processes involve physical parameters such as light intensity and physiological conditions such as photosynthetic efficiency and light inhibition. Microalgal photo-bioreactors are not only efficient systems for the commercial production of aquaculture feed and food supplements, but also a possible platform for the production of active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal, and for the removal of heavy metals from wastewater. A digital controller is designed to control the light of the bioreactor, and the microalgae growth rate and the carbon dioxide concentration inside the bioreactor are investigated. The optimal values of the controller parameters, obtained from the S/N and ANOVA analyses in the Qualitek-4 software, were compared with those from the reaction curve, Cohen-Coon, and Ziegler-Nichols methods. Based on the sum of squared errors obtained for each of the control methods mentioned, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor; compared to the other methods listed, it showed higher stability and a shorter settling time.
Keywords: photo-bioreactor, control and optimization, light intensity, Taguchi method
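The S/N ratios at the heart of the Taguchi analysis are simple to compute; a sketch of the two common forms (which form Qualitek-4 applied to each response here is left open as an assumption):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi S/N ratio for a smaller-the-better response (e.g. tracking
    error): -10 log10 of the mean squared value."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a larger-the-better response (e.g. growth
    rate): -10 log10 of the mean of 1/value^2."""
    return -10.0 * math.log10(sum(1.0 / (v * v) for v in values) / len(values))
```

In a Taguchi tuning study, the controller setting maximizing the S/N ratio of each response is selected, and ANOVA apportions how much each factor contributes to that ratio.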
Procedia PDF Downloads 394
3944 Modal Analysis of Functionally Graded Materials Plates Using Finite Element Method
Authors: S. J. Shahidzadeh Tabatabaei, A. M. Fattahi
Abstract:
Modal analysis of an FGM plate composed of an Al2O3 ceramic phase and a 304 stainless steel metal phase was performed using ABAQUS software, with the assumption that the material behavior is elastic and that the mechanical properties (Young's modulus and density) vary in the thickness direction of the plate. To this end, a sub-program was written in the FORTRAN programming language and linked with ABAQUS. For the modal analysis, a finite element analysis was carried out similar to the models of other researchers, and the accuracy of the results was evaluated by comparison. The comparison of natural frequencies and mode shapes confirmed the compatibility of the results, the optimal performance of the FORTRAN sub-program, and the high accuracy of the finite element model used in this research. After validation of the results, the effect of the material (the n parameter) on the natural frequency was evaluated. To this end, finite element analyses were carried out for different values of n in the simply supported configuration. Regarding the n parameter, which captures the effect of material composition on the natural frequency, it was observed that the natural frequency decreased as n increased: by increasing n, the share of the ceramic phase in the FGM plate decreases and the share of the steel phase increases, which reduces the stiffness of the FGM plate and thereby the natural frequency. This is because the Young's modulus of the Al2O3 ceramic is 380 GPa, while that of SUS304 steel is 207 GPa.
Keywords: FGM plates, modal analysis, natural frequency, finite element method
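Through-thickness grading of this kind is commonly modeled with a power-law volume fraction; a sketch using the moduli quoted above (380 GPa for Al2O3, 207 GPa for SUS304), with the power-law form itself an assumption of this illustration rather than the paper's stated rule:

```python
def fgm_youngs_modulus(z, h, n, e_ceramic=380e9, e_metal=207e9):
    """Power-law FGM Young's modulus through the thickness: z runs from
    -h/2 (pure metal face) to +h/2 (pure ceramic face); n is the
    volume-fraction exponent."""
    vc = (z / h + 0.5) ** n          # ceramic volume fraction at height z
    return (e_ceramic - e_metal) * vc + e_metal

h = 0.02  # assumed plate thickness, m
e_mid_n1 = fgm_youngs_modulus(0.0, h, 1)   # 50/50 mix at mid-plane
e_mid_n5 = fgm_youngs_modulus(0.0, h, 5)   # larger n -> less ceramic
```

This reproduces the trend the abstract describes: raising n shrinks the ceramic share, lowering the effective stiffness (and hence the natural frequency).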
Procedia PDF Downloads 391
3943 Molecular Comparison of HEV Isolates from Sewage & Humans at Western India
Authors: Nidhi S. Chandra, Veena Agrawal, Debprasad Chattopadhyay
Abstract:
Background: Hepatitis E virus (HEV) is a major cause of acute viral hepatitis in developing countries. It spreads feco-orally, mainly through contamination of drinking water by sewage. There is limited data on the genotypic comparison of HEV isolates from sewage water and humans. The aim of this study was to identify the genotype and conduct a phylogenetic analysis of HEV isolates from sewage water and humans. Materials and Methods: 14 sewage water samples and 60 serum samples from acute sporadic hepatitis E cases (negative for hepatitis A, B, and C) were tested for HEV RNA by nested reverse-transcription polymerase chain reaction (RT-nPCR), using primers designed within the RdRp (RNA-dependent RNA polymerase) region of open reading frame 1 (ORF-1). Sequencing was done on an ABI Prism 310. The sequences (343 nucleotides) were compared with each other and aligned with previously reported HEV sequences obtained from GenBank, using the Clustal W software. A phylogenetic tree was constructed using PHYLIP version 3.67. Results: HEV RNA was detected in 49/60 (81.67%) serum and 5/14 (35.71%) sewage samples. The sequences obtained from 17 serum and 2 sewage specimens belonged to genotype I, with 85% similarity and clustering with previously reported human HEV sequences from India. HEV isolates from humans and sewage in northwest India are thus genetically closely related to each other. Conclusion: These findings suggest that sewage acts as a reservoir of HEV. It is therefore important to take measures for proper waste disposal and treatment of drinking water to prevent outbreaks and epidemics due to HEV.
Keywords: hepatitis E virus, nested polymerase chain reaction, open reading frame-1, nucleotides
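After alignment (Clustal W in this study), the core sequence comparison reduces to percent identity over the aligned length. A minimal sketch of that step, on toy sequences rather than the study's 343-nucleotide HEV fragments:

```python
def percent_identity(seq_a, seq_b):
    """Percent identity of two pre-aligned, equal-length sequences;
    positions where either sequence carries a gap ('-') never count
    as matches, but they stay in the denominator."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(1 for x, y in zip(seq_a, seq_b)
                  if x == y and x != '-')
    return 100.0 * matches / len(seq_a)

identity = percent_identity("ACGTACGT", "ACGTACGA")
```

Distance matrices built from pairwise values like this are what tree-building programs such as PHYLIP turn into a phylogeny.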
Procedia PDF Downloads 377
3942 Effects of Foliar Application of Glycine Betaine under Nickel Toxicity of Oat (Avena Sativa L.)
Authors: Khizar Hayat Bhatti, Fiza Javed, Misbah Zafar
Abstract:
Oat (Avena sativa L.) is a major cereal belonging to the family Poaceae. It is a very important source of carbohydrates, starch, minerals, vitamins and proteins that are beneficial for general health. When plants grow in soils contaminated with heavy metals, their growth declines. Glycine betaine application may improve plant growth, survival and resistance to the metabolic disturbances caused by such stresses. Heavy metals like nickel have accumulated in the soil over long periods because of industrial waste and sewage. The experiment was intended to alleviate the detrimental effects of heavy-metal nickel stress on two oat varieties, 'Sgd-2011' and 'Hay', using glycine betaine. Nickel was applied through the soil, while glycine betaine (GB) was applied as a foliar spray: 10 days after the nickel treatment, an exogenous spray of glycine betaine was applied to the intact plant leaves. Data analysis was carried out using a Completely Randomized Design (CRD) with three replications. Mini-Tab 19 software was used to compare the treatment means, and Microsoft Excel was used to generate the bar graphs. Significantly accelerated plant growth was recorded when Ni-exposed plants were treated with GB. Based on the data, the 3 mM GB dose caused significant recovery from Ni stress. The overall results also demonstrated that the Sgd-2011 variety of oat had the best outcomes for all parameters.
Keywords: CRD, foliar spray method, glycine betaine, heavy metals, nickel, ROS
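The CRD analysis described above boils down to a one-way ANOVA across treatments. A hand-rolled sketch of the F statistic is below (Python instead of Mini-Tab, and the three-replicate shoot-length values are invented purely to illustrate the arithmetic, not data from the study):

```python
# One-way ANOVA F statistic for a completely randomized design (CRD).
def anova_f(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_b = len(groups) - 1
    df_w = len(all_vals) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

# Invented toy data: three replications per treatment, as in the study design.
control = [12.1, 11.8, 12.4]
ni_stress = [8.2, 7.9, 8.5]
ni_plus_gb = [10.6, 10.2, 10.9]
print(round(anova_f([control, ni_stress, ni_plus_gb]), 1))
```

A large F against the tabulated F(2, 6) critical value is what would justify the "significant recovery" claim; the actual significance testing in the study was done in Mini-Tab 19.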
Procedia PDF Downloads 8
3941 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
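The signal-handling idea behind such a tool can be sketched in a few lines. The snippet below (Python on POSIX; the real DAQ Debugger is C++/Qt inside the iFDAQ processes, so this is only an illustration of the mechanism, not its code) installs a handler that captures a report instead of letting the process die silently:

```python
# Minimal signal-based debug reporter: on a designated signal, snapshot the
# stack into a report list rather than terminating. SIGUSR1 stands in for a
# "problem" signal here; a real tool would also hook fault signals.
import os
import signal
import traceback

reports = []

def report_handler(signum, frame):
    """Record which signal arrived and where the process was at the time."""
    stack = "".join(traceback.format_stack(frame))
    reports.append("signal %d received\n%s" % (signum, stack))

signal.signal(signal.SIGUSR1, report_handler)  # install; no runtime overhead
os.kill(os.getpid(), signal.SIGUSR1)           # simulate a problem event
print(len(reports))
```

The handler costs nothing until a signal actually arrives, which mirrors the abstract's claim of "no impact on the process performance".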
Procedia PDF Downloads 284
3940 Linkage Disequilibrium and Haplotype Blocks Study from Two High-Density Panels and a Combined Panel in Nelore Beef Cattle
Authors: Priscila A. Bernardes, Marcos E. Buzanskas, Luciana C. A. Regitano, Ricardo V. Ventura, Danisio P. Munari
Abstract:
Genotype imputation has been used to reduce genomic selection costs. In order to increase haplotype detection accuracy in methods that consider linkage disequilibrium, another approach could be used, such as combining genotype data from different panels. Therefore, this study aimed to evaluate the linkage disequilibrium and haplotype blocks in two high-density panels before and after imputation to a combined panel in Nelore beef cattle. A total of 814 animals were genotyped with the Illumina BovineHD BeadChip (IHD), of which 93 animals (23 bulls and 70 progenies) were also genotyped with the Affymetrix Axiom Genome-Wide BOS 1 Array Plate (AHD). After quality control, 809 IHD animals (509,107 SNPs) and 93 AHD animals (427,875 SNPs) remained for analysis. The combined genotype panel (CP) was constructed by merging both panels after quality control, resulting in 880,336 SNPs. Imputation analysis was conducted using the software FImpute v.2.2b. The reference (CP) and target (IHD) populations consisted of 23 bulls and 786 animals, respectively. The linkage disequilibrium and haplotype block studies were carried out for IHD, AHD, and the imputed CP. Two linkage disequilibrium measures were considered: the correlation coefficient between alleles at two loci (r²) and |D'|. Both measures were calculated using the software PLINK. The haplotype blocks were estimated using the software Haploview. The r² measure showed a different decay compared with |D'|, whereas AHD and IHD had almost the same decay. For r², even with possible overestimation due to the small sample size for AHD (93 animals), IHD presented higher values than AHD at shorter distances, but as distance increased, both panels presented similar values. The r² measure is influenced by the minor allele frequencies of the pair of SNPs, which can cause the difference observed when comparing the r² decay and the |D'| decay.
As a sum of the combinations between the Illumina and Affymetrix panels, the CP presented a decay equivalent to the mean of these combinations. The numbers of haplotype blocks detected for IHD, AHD, and CP were 84,529, 63,967, and 140,336, respectively. The IHD blocks had a mean length of 137.70 ± 219.05 kb, the AHD blocks a mean of 102.10 ± 155.47 kb, and the CP blocks a mean of 107.10 ± 169.14 kb. The majority of the haplotype blocks in these three panels were composed of fewer than 10 SNPs, with only 3,882 (IHD), 193 (AHD) and 8,462 (CP) haplotype blocks composed of 10 SNPs or more. There was an increase in the number of chromosomes covered with long haplotypes when CP was used, as well as an increase in haplotype coverage for the short chromosomes (23-29), which can contribute to studies that explore haplotype blocks. In general, using the CP could be an alternative to increase density and the number of haplotype blocks, increasing the probability of obtaining a marker close to a quantitative trait locus of interest.
Keywords: Bos taurus indicus, decay, genotype imputation, single nucleotide polymorphism
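For one SNP pair, the two LD measures named above come straight from haplotype and allele frequencies. A minimal sketch of the textbook definitions is below (Python with invented frequencies; the study computed these in PLINK across whole panels):

```python
# r^2 and |D'| for a single pair of biallelic loci.
def ld_measures(p_ab, p_a, p_b):
    """p_ab: freq of haplotype A-B; p_a, p_b: allele frequencies of A and B."""
    d = p_ab - p_a * p_b                                   # disequilibrium D
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))       # squared correlation
    # D' normalizes D by its maximum attainable magnitude given the frequencies.
    if d > 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    return r2, abs(d) / d_max

# Invented toy frequencies for illustration only.
r2, d_prime = ld_measures(p_ab=0.40, p_a=0.50, p_b=0.60)
print(round(r2, 3), round(d_prime, 3))
```

Because r² carries the allele-frequency terms in its denominator while |D'| is normalized by the attainable maximum, the two measures can decay differently over distance, as the abstract observes.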
Procedia PDF Downloads 280
3939 A Programming Assessment Software Artefact Enhanced with the Help of Learners
Authors: Romeo A. Botes, Imelda Smit
Abstract:
The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback. At the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment for programming subjects lacks meaningful real-life testing. At the same time, feedback lacks promptness, consistency, comprehensiveness and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact that is used to assist the lecturer in the assessment of, and feedback on, practical programming activities in a senior database programming class. The artefact was developed over three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles regarding the implementation of this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner, allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.
Keywords: programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method
Procedia PDF Downloads 296
3938 Reconstruction Spectral Reflectance Cube Based on Artificial Neural Network for Multispectral Imaging System
Authors: Iwan Cony Setiadi, Aulia M. T. Nasution
Abstract:
The multispectral imaging (MSI) technique has been used for skin analysis, especially for distant mapping of in-vivo skin chromophores by analyzing spectral data at each reflected image pixel. For ergonomic purposes, our multispectral imaging system is decomposed into two parts: a light source compartment based on LEDs with 11 different wavelengths and a monochromatic 8-bit CCD camera with a C-mount objective lens. Software with a MATLAB GUI to control the system was also developed. Our system provides 11 monoband images and is coupled with software reconstructing hyperspectral cubes from these multispectral images. In this paper, we propose a new method to build a hyperspectral reflectance cube based on an artificial neural network algorithm. After preliminary corrections, a neural network is trained using the 32 natural colors from the X-Rite ColorChecker Passport; the learning procedure involves reference acquisition by a spectrophotometer. This neural network is then used to retrieve a megapixel multispectral cube between 380 and 880 nm with a 5 nm resolution from a low-spectral-resolution multispectral acquisition. As hyperspectral cubes contain a spectrum for each pixel, comparison should be done between the theoretical values from the spectrophotometer and the reconstructed spectrum. To evaluate the performance of the reconstruction, we used the Goodness of Fit Coefficient (GFC) and the Root Mean Squared Error (RMSE). To validate the reconstruction, the set of 8 colour patches reconstructed by our MSI system was compared with the one recorded by the spectrophotometer. The average GFC was 0.9990 (standard deviation = 0.0010) and the average RMSE was 0.2167 (standard deviation = 0.064).
Keywords: multispectral imaging, reflectance cube, spectral reconstruction, artificial neural network
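The two evaluation metrics named above are straightforward to compute per pixel. A minimal sketch follows (Python instead of MATLAB, with invented five-band toy spectra rather than the study's 5 nm cubes); GFC is the cosine similarity between measured and reconstructed spectra, and RMSE the pointwise error:

```python
import math

def gfc(measured, reconstructed):
    """Goodness of Fit Coefficient: cosine of the angle between spectra."""
    dot = sum(m * r for m, r in zip(measured, reconstructed))
    return dot / (math.sqrt(sum(m * m for m in measured)) *
                  math.sqrt(sum(r * r for r in reconstructed)))

def rmse(measured, reconstructed):
    """Root mean squared error over the spectral samples."""
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reconstructed))
                     / len(measured))

# Invented toy reflectance spectra (5 bands) for illustration.
measured = [0.12, 0.35, 0.58, 0.71, 0.64]
reconstructed = [0.13, 0.34, 0.57, 0.72, 0.62]
print(round(gfc(measured, reconstructed), 4), round(rmse(measured, reconstructed), 4))
```

A GFC near 1 with a small RMSE is exactly the pattern the study reports (average GFC 0.9990).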
Procedia PDF Downloads 322
3937 The Role of Information Technology in Supply Chain Management
Authors: V. Jagadeesh, K. Venkata Subbaiah, P. Govinda Rao
Abstract:
This paper explains the significance of information technology tools and software packages in supply chain management (SCM) for managing the entire supply chain. With the aid of such tools, material, financial and information flows can be managed effectively and efficiently, delivering the right quantity of goods, at the right quality and at the right time, using the right methods and technology. Information technology plays a vital role in streamlining sales forecasting and demand planning, inventory control and transportation in supply networks, and finally production planning and scheduling. It achieves these objectives by streamlining business processes and integrating within the enterprise and its extended enterprise. SCM starts with the customer and involves a sequence of activities from customer, retailer, distributor, manufacturer and supplier within the supply chain framework. It is the process of integrating demand planning, supply network planning, and production planning and control. Forecasting indicates the direction for planning raw materials in order to meet the production planning requirements. Inventory control and transportation planning allocate the optimal or economic order quantity, using the shortest possible routes to deliver the goods to the customer. Production planning and control utilize the optimal resource mix in order to meet the capacity requirements. All of the above operations can be achieved by using appropriate information technology tools and software packages for supply chain management.
Keywords: supply chain management, information technology, business process, extended enterprise
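The "economic order quantity" mentioned above has a closed form worth stating. A minimal sketch with invented demand and cost figures (the abstract gives no numbers) is:

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Classic Wilson EOQ: order size minimizing ordering + holding cost,
    Q* = sqrt(2 * D * S / H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

# Illustrative figures: 12,000 units/year demand, $50 per order, $3/unit/year holding.
print(round(eoq(annual_demand=12000, order_cost=50, holding_cost_per_unit=3), 1))
```

In practice this calculation sits inside the inventory-control modules of the SCM software packages the paper discusses, alongside route optimization for transportation planning.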
Procedia PDF Downloads 377
3936 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study
Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb
Abstract:
The purpose of this study is to reduce the radiation dose of chest CT examinations by adding Tube Current Modulation (TCM) to a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the Dose Length Product (DLP) by a conversion factor. Results were compared with those obtained using the CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a size conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with Signal-to-Noise Ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. With CARE Dose 4D included, ED was lowered by 48.35% and 51.51% according to DLP and CT-Expo, respectively. In addition, ED ranged between 7.01 mSv and 6.6 mSv for the standard protocol, and between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose without TCM was higher, at 16.25 mGy, and was lower by 48.8% with TCM. The calculated SNR values were significantly different (p=0.03<0.05); the highest was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study demonstrates the potential of the TCM technique for SSDE and ED reduction while preserving image quality, with a high diagnostic reference level for thoracic CT examinations.
Keywords: anthropomorphic phantom, computed tomography, CT-expo, radiation dose
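The two dose conversions above are simple multiplications. A sketch follows (Python; the chest coefficient k = 0.014 mSv/(mGy·cm) is a commonly cited value and the size factor 1.3 is illustrative — neither is stated in the abstract, so they are assumptions, not the authors' exact figures):

```python
# ED = DLP * k, where k is a region-specific conversion coefficient.
K_CHEST = 0.014  # mSv per mGy*cm; commonly cited chest value (assumed here)

def effective_dose(dlp_mgy_cm, k=K_CHEST):
    return dlp_mgy_cm * k

# SSDE = CTDIvol * f, where f depends on the patient's effective diameter.
def ssde(ctdi_vol_mgy, size_factor):
    return ctdi_vol_mgy * size_factor

print(round(effective_dose(500.0), 2))  # e.g. DLP of 500 mGy*cm
print(round(ssde(10.0, 1.3), 1))        # e.g. CTDIvol 10 mGy, assumed f = 1.3
```

A DLP of 500 mGy·cm under this coefficient yields 7.0 mSv, on the order of the 7.01 mSv reported for the standard protocol.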
Procedia PDF Downloads 221
3935 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method
Authors: Angel G. De Leon Hernandez
Abstract:
A structure is defined as a physical system or, in certain cases, an arrangement of connected elements, capable of bearing certain loads. Structures appear in every part of daily life, e.g., in the design of buildings, vehicles and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic and maintainable system, considering the constraints imposed in every case. With the advances in technology over the last decades, the capability of solving engineering problems has increased enormously. Nowadays computers play a critical role in structural analysis; unfortunately, for university students the vast majority of these software packages are inaccessible due to the high complexity and cost they represent, even when the software manufacturers offer student versions. This is exactly the reason why the idea of developing a more accessible and easy-to-use computing tool arose. This program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid all the tedious calculations. The program can also be useful for graduated engineers in the field of structural design and analysis. A graphical user interface is included to make the program even simpler to operate and to clarify the information requested and the results obtained. The present document includes the theoretical basis on which the program relies to solve the structural analysis, the logical path followed in developing the program, the theoretical results, a discussion of the results, and their validation.
Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming
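The element-level step of the stiffness matrix method can be sketched compactly. The snippet below (Python rather than the paper's MATLAB, and only the single-element part; the full program would assemble these matrices, apply supports and solve for displacements) builds the 4×4 global stiffness matrix of one 2D truss bar:

```python
import math

def bar_stiffness(E, A, x1, y1, x2, y2):
    """Global-axes stiffness matrix of a 2-node, 2D truss element:
    K = (E*A/L) * [[ B, -B], [-B, B]] with B = [[c^2, c*s], [c*s, s^2]]."""
    L = math.hypot(x2 - x1, y2 - y1)
    c = (x2 - x1) / L                    # direction cosines of the bar axis
    s = (y2 - y1) / L
    k = E * A / L
    block = [[c * c, c * s], [c * s, s * s]]
    top = [row + [-v for v in row] for row in block]
    bottom = [[-v for v in row] + row for row in block]
    return [[k * v for v in row] for row in top + bottom]

# Example: steel bar (E = 200 GPa, A = 1 cm^2) at 45 degrees, length sqrt(2) m.
K = bar_stiffness(E=200e9, A=1e-4, x1=0, y1=0, x2=1, y2=1)
print(round(K[0][0] / 1e6, 2))  # leading stiffness term, in MN/m
```

Summing such matrices into the global system at the shared node indices, then striking out the supported degrees of freedom, gives the linear system the program solves for the nodal displacements.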
Procedia PDF Downloads 122
3934 Spatial Variation of Nitrogen, Phosphorus and Potassium Contents of Tomato (Solanum lycopersicum L.) Plants Grown in Greenhouses (Springs) in Elmali-Antalya Region
Authors: Namik Kemal Sonmez, Sahriye Sonmez, Hasan Rasit Turkkan, Hatice Tuba Selcuk
Abstract:
In this study, the spatial variation of the plant and soil nutrient contents of tomato plants grown in greenhouses was investigated in the Elmali region of Antalya. For this purpose, a total of 19 sampling points was determined. The coordinates of each sampling point were recorded using a hand-held GPS device and transferred onto satellite data in GIS. Soil samples were collected from two depths, 0-20 and 20-40 cm, and leaf samples were taken from different tomato greenhouses. The soil and plant samples were analyzed for N, P and K. Attribute tables were then created from the analysis results using GIS. The data were analyzed, and the semivariogram models and parameters (nugget, sill and range) of the variables were determined using GIS software. Kriged maps of the variables were created from the nugget, sill and range values with the geostatistical extension of the ArcGIS software. The kriged maps of the N, P and K contents of the plant and soil samples showed patchy or relatively smooth distributions across the study areas. As a result, the N content of the plants was sufficient in approximately 66% of the tomato production area. The P and K contents were found to be sufficient in 70% and 80% of the area, respectively. On the other hand, soil total K contents were generally adequate, and available N and P contents were found to be highly sufficient at both depths (0-20 and 20-40 cm) in 90% of the area.
Keywords: Elmali, nutrients, springs greenhouses, spatial variation, tomato
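The semivariogram underlying the nugget/sill/range fit above has a simple empirical estimator. A sketch on a toy 1-D transect follows (Python instead of the ArcGIS geostatistical extension; positions and soil-N values are invented for illustration):

```python
# Empirical semivariogram: gamma(h) = mean of squared differences / 2 over
# all sample pairs whose separation is within tol of the lag h.
def semivariogram(positions, values, lag, tol=0.5):
    pairs = [(vi - vj) ** 2
             for i, (pi, vi) in enumerate(zip(positions, values))
             for pj, vj in zip(positions[i + 1:], values[i + 1:])
             if abs(abs(pi - pj) - lag) <= tol]
    return sum(pairs) / (2 * len(pairs))

# Invented toy transect: sampling positions (km) and leaf N contents (%).
pos = [0, 1, 2, 3, 4]
n_content = [2.1, 2.3, 2.8, 3.0, 3.4]
print(round(semivariogram(pos, n_content, lag=1), 3))
```

Computing gamma at several lags and fitting a model curve (spherical, exponential, etc.) yields the nugget, sill and range values that parameterize the kriged maps.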
Procedia PDF Downloads 243
3933 Navigating Construction Project Outcomes: Synergy Through the Evolution of Digital Innovation and Strategic Management
Authors: Derrick Mirindi, Frederic Mirindi, Oluwakemi Oshineye
Abstract:
The ongoing high rate of construction project failures worldwide is often blamed on the difficulties of managing stakeholders. This highlights the crucial role of strategic management (SM) in achieving project success. This study investigates how integrating digital tools into the SM framework can effectively address stakeholder-related challenges. The work specifically focuses on the impact of evolving digital tools, such as Project Management Software (PMS) (e.g., Basecamp and Wrike), Building Information Modeling (BIM) (e.g., Tekla BIMsight and Autodesk Navisworks), Virtual and Augmented Reality (VR/AR) (e.g., Microsoft HoloLens), drones and remote monitoring, and social media and web-based platforms, in improving stakeholder engagement and project outcomes. Drawing on existing literature and examples of failed projects, the study highlights how evolving digital tools serve as facilitators within the strategic management process. These tools offer benefits such as real-time data access, enhanced visualization, and more efficient workflows to mitigate stakeholder challenges in construction projects. The findings indicate that integrating digital tools with SM principles effectively addresses stakeholder challenges, resulting in improved project outcomes and stakeholder satisfaction. The research advocates a combined approach that embraces both strategic management and digital innovation to navigate the complex stakeholder landscape in construction projects.
Keywords: strategic management, digital tools, virtual and augmented reality, stakeholder management, building information modeling, project management software
Procedia PDF Downloads 83
3932 The Impact of Regulatory Changes on the Development of Mobile Medical Apps
Abstract:
Mobile applications are used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e. a mobile phone or a tablet, into a medical device by means of a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g. BMI calculation or accessing reference material; however, increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increases, so too does the potential risk associated with using such an application. Examples include applications that provide the ability to inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosages or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, the organization faces significant penalties, such as receiving an FDA warning letter to cease the prohibited activity, fines, and the possibility of a criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, these regulations have the potential to stifle further improvements. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements while still fostering innovation.
Keywords: agile, applications, FDA, medical, mobile, regulations, software engineering, standards
Procedia PDF Downloads 360
3931 Modeling Palm Oil Quality During the Ripening Process of Fresh Fruits
Authors: Afshin Keshvadi, Johari Endan, Haniff Harun, Desa Ahmad, Farah Saleena
Abstract:
Experiments were conducted to develop a model for analyzing the ripening process of oil palm fresh fruits in relation to the oil yield and oil quality of the palm oil produced. The research was carried out on 8-year-old Tenera (Dura × Pisifera) palms planted in 2003 at the Malaysian Palm Oil Board Research Station. Fresh fruit bunches were harvested from designated palms from January to May 2010. The bunches were divided into three regions (top, middle and bottom), and fruits from the outer and inner layers were randomly sampled for analysis at 8, 12, 16 and 20 weeks after anthesis to establish relationships between maturity and oil development in the mesocarp and kernel. Computations on data related to ripening time, oil content and oil quality were performed using several computer software programs (MSTAT-C, SAS and Microsoft Excel). Nine nonlinear mathematical models were fitted to the collected data using MATLAB software. The results showed that the mean mesocarp oil percentage increased from 1.24% at 8 weeks after anthesis to 29.6% at 20 weeks after anthesis. Fruits from the top part of the bunch had the highest mesocarp oil content, 10.09%. The lowest kernel oil percentage, 0.03%, was recorded at 12 weeks after anthesis. Palmitic acid and oleic acid comprised more than 73% of total mesocarp fatty acids at 8 weeks after anthesis, increasing to more than 80% at fruit maturity at 20 weeks. The logistic model, with the highest R² and the lowest root mean square error, was found to be the best-fitting model.
Keywords: oil palm, oil yield, ripening process, anthesis, fatty acids, modeling
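The winning logistic model has the familiar S-shaped form y(t) = K / (1 + exp(-r·(t - t0))). A sketch of evaluating it against the abstract's two reported oil percentages follows (Python instead of MATLAB; K, r and t0 are invented illustrative parameters, not the authors' fitted values):

```python
import math

def logistic(t, K, r, t0):
    """Logistic growth curve: asymptote K, rate r, inflection at t0."""
    return K / (1 + math.exp(-r * (t - t0)))

# Observed mesocarp oil percentages from the abstract, at 8 and 20 weeks.
weeks = [8, 20]
oil_pct = [1.24, 29.6]

# Illustrative (assumed) parameters; the paper selected its own fit by R^2/RMSE.
K, r, t0 = 30.5, 0.55, 14.0
pred = [logistic(t, K, r, t0) for t in weeks]
rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(oil_pct, pred)) / len(weeks))
print([round(p, 2) for p in pred], round(rmse, 2))
```

Comparing such an RMSE (and the corresponding R²) across the nine candidate models is the selection criterion the study applied.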
Procedia PDF Downloads 313
3930 Development of Methods for Plastic Injection Mold Weight Reduction
Authors: Bita Mohajernia, R. J. Urbanic
Abstract:
Mold making techniques have focused on meeting customers' functional and process requirements; however, today, molds are increasing in size and sophistication and are difficult to manufacture, transport, and set up due to their size and mass. Present mold weight-saving techniques focus on pockets to reduce the mass of the mold, but the overall size remains large, which introduces costs related to stock material purchase, processing time for process planning, machining and validation, and excess waste material. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use finite element analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project will reduce manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods that provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing the weight, using the OptiStruct software. The optimized CAD model is compared with the original geometry of the mold from the NX software. The optimization results show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.
Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction
Procedia PDF Downloads 290
3929 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms
Authors: Francisco M. Silva
Abstract:
Technology is evolving, creating an impact on our everyday lives and on the telehealth industry. Telehealth encapsulates the provision of healthcare services and information via a technological approach. There are several benefits to using web-based methods to provide healthcare help. Nonetheless, few health and psychological help approaches combine this method with wearable sensors. This paper aims to create an online platform for users to receive self-care help and information using wearable sensors. In addition, researchers developing a similar project obtain a solid foundation as a reference. This study provides descriptions and analyses of the software and hardware architecture. It exhibits and explains a dynamic and efficient heart rate algorithm that continuously calculates the desired sensor values, and presents diagrams that illustrate the website deployment process and the web server's means of handling the sensor data. The goal is to create a working project using Arduino-compatible hardware. Heart rate sensors send their data values to an online platform. A microcontroller board uses an algorithm to calculate the sensor heart rate values and outputs them to a web server. The platform visualizes the sensor data, summarizes it in a report, and creates alerts for the user. Results showed a solid project structure and communication between the hardware and software. The web server displays the conveyed heart rate sensor data on the online platform, presenting observations and evaluations.
Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare
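One common way to "continuously calculate" BPM from a beat sensor is to convert each inter-beat interval to an instantaneous rate and smooth it. The sketch below (Python for illustration; the platform's actual on-board algorithm is not published, and the interval values are simulated) uses an exponential moving average:

```python
# Continuous BPM estimation from inter-beat intervals (milliseconds),
# smoothed with an exponential moving average to damp sensor jitter.
def smoothed_bpm(intervals_ms, alpha=0.3):
    """Return the smoothed BPM reading after each detected beat."""
    bpm = None
    out = []
    for ms in intervals_ms:
        raw = 60000.0 / ms                      # instantaneous beats per minute
        bpm = raw if bpm is None else alpha * raw + (1 - alpha) * bpm
        out.append(round(bpm, 1))
    return out

# Simulated sensor intervals: ~60 BPM drifting slightly faster, then back.
print(smoothed_bpm([1000, 950, 900, 1000]))
```

On a microcontroller the same update runs per beat in constant memory, which is what makes a continuous, efficient calculation feasible before the value is pushed to the web server.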
Procedia PDF Downloads 126