Search results for: intelligent methods
15088 Genomic Prediction Reliability Using Haplotypes Defined by Different Methods
Authors: Sohyoung Won, Heebal Kim, Dajeong Lim
Abstract:
Genomic prediction is an effective way to measure the breeding ability of livestock based on genomic estimated breeding values, statistically predicted from genotype data using best linear unbiased prediction (BLUP). Using haplotypes, clusters of linked single nucleotide polymorphisms (SNPs), as markers instead of individual SNPs can improve the reliability of genomic prediction, since the probability of a quantitative trait locus being in strong linkage disequilibrium (LD) with the markers is higher. To use haplotypes in genomic prediction efficiently, optimal ways to define haplotypes need to be found. In this study, 770K SNP chip data were collected from a Hanwoo (Korean cattle) population consisting of 2,506 cattle. Haplotypes were first defined in three different ways using the 770K SNP chip data: 1) by the length of haplotypes (bp), 2) by the number of SNPs, and 3) by k-medoids clustering based on LD. To compare the methods in parallel, haplotypes defined by all methods were set to comparable sizes; in each method, haplotypes defined to contain an average of 5, 10, 20 or 50 SNPs were tested. A modified GBLUP method using haplotype alleles as predictor variables was implemented to test the prediction reliability of each haplotype set. The conventional genomic BLUP (GBLUP) method, which uses individual SNPs, was also tested to benchmark the performance of the haplotype sets. Carcass weight was used as the test phenotype. As a result, haplotypes defined by all three methods showed increased reliability compared to conventional GBLUP, with few differences in reliability between the haplotype-defining methods. The reliability of genomic prediction was highest when the average number of SNPs per haplotype was 20 in all three methods, implying that haplotypes of around 20 SNPs can be optimal markers for genomic prediction.
When the number of alleles generated by each haplotype-defining method was compared, clustering by LD generated the fewest alleles. Using haplotype alleles for genomic prediction showed better performance, suggesting improved accuracy in genomic selection. The number of predictor variables decreased when the LD-based method was used, while all three haplotype-defining methods showed similar performance. This suggests that defining haplotypes based on LD can reduce computational cost and allow efficient prediction. Finding optimal ways to define haplotypes and using the haplotype alleles as markers can provide improved performance and efficiency in genomic prediction.
Keywords: best linear unbiased predictor, genomic prediction, haplotype, linkage disequilibrium
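The first two haplotype-defining strategies in the abstract (a fixed physical length in bp and a fixed SNP count) can be sketched as simple windowing over ordered SNP positions. This is a minimal illustration, not the authors' pipeline; the SNP positions and block sizes below are invented, and the LD-based k-medoids variant is omitted.

```python
def blocks_by_snp_count(positions, snps_per_block):
    """Group consecutive SNP indices into blocks with a fixed SNP count."""
    idx = list(range(len(positions)))
    return [idx[i:i + snps_per_block] for i in range(0, len(idx), snps_per_block)]

def blocks_by_length(positions, block_bp):
    """Group consecutive SNPs into blocks spanning at most block_bp base pairs."""
    blocks, current, start = [], [], positions[0]
    for i, pos in enumerate(positions):
        if pos - start >= block_bp and current:
            blocks.append(current)
            current, start = [], pos
        current.append(i)
    if current:
        blocks.append(current)
    return blocks

# Invented base-pair positions of 10 ordered SNPs on one chromosome.
positions = [100, 900, 1800, 2600, 3500, 4100, 5200, 6900, 7400, 8800]
by_count = blocks_by_snp_count(positions, 5)  # two blocks of 5 SNPs each
by_bp = blocks_by_length(positions, 3000)     # blocks spanning about 3 kb
```

Each block's distinct allele sequences would then replace individual SNP genotypes as the predictor variables in the modified GBLUP model.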
Procedia PDF Downloads 141
15087 Dynamic Construction Site Layout Using Ant Colony Optimization
Authors: Yassir AbdelRazig
Abstract:
Evolutionary optimization methods such as genetic algorithms have been used extensively for the construction site layout problem. More recently, ant colony optimization algorithms, evolutionary methods based on the foraging behavior of ants, have been successfully applied to benchmark combinatorial optimization problems. In the construction industry, site layout is a very important planning problem: the objective is to position temporary facilities both geographically and at the correct time such that the construction work can be performed satisfactorily, with minimal costs and an improved safety and working environment. This paper proposes a formulation of the site layout problem as a sequencing problem suitable for solution by an ant colony optimization algorithm, and presents an ant colony optimization model for construction site layout. A simple case study of a highway project illustrates the application of the model.
Keywords: ant colony, construction site layout, optimization, genetic algorithms
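The sequencing formulation can be illustrated with a toy ant colony optimizer that assigns temporary facilities to candidate locations so as to minimize flow-weighted travel distance. This is a minimal sketch, not the paper's model; the flow and distance matrices and all parameters are invented.

```python
import random
random.seed(42)

# Invented quadratic-assignment data: 4 facilities, 4 candidate locations.
flow = [[0, 3, 1, 2], [3, 0, 2, 1], [1, 2, 0, 4], [2, 1, 4, 0]]
dist = [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]
n = 4

def layout_cost(assign):
    """assign[f] = location of facility f; flow-weighted travel distance."""
    return sum(flow[a][b] * dist[assign[a]][assign[b]]
               for a in range(n) for b in range(n))

def ant_colony(iters=60, ants=10, rho=0.1, q=1.0):
    tau = [[1.0] * n for _ in range(n)]  # pheromone: facility -> location
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            free, assign = list(range(n)), []
            for f in range(n):  # build one layout, biased by pheromone
                loc = random.choices(free, weights=[tau[f][l] for l in free])[0]
                assign.append(loc)
                free.remove(loc)
            c = layout_cost(assign)
            if c < best_cost:
                best, best_cost = assign, c
        # evaporate, then reinforce the best-so-far layout
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for f, loc in enumerate(best):
            tau[f][loc] += q / best_cost
    return best, best_cost

best_layout, best_cost = ant_colony()
```

A real site-layout model would add heuristic visibility terms and constraints (e.g., minimum clearances, time-phased availability) on top of this skeleton.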
Procedia PDF Downloads 383
15086 Effect of Dehydration Methods on the Proximate Composition, Mineral Content and Functional Properties of Starch Flour Extracted from Maize
Authors: Olakunle M. Makanjuola, Adebola Ajayi
Abstract:
The effect of the dehydration method on the proximate, functional and mineral properties of corn starch was evaluated. The study was carried out on corn starch produced using three different drying methods, namely sun, oven, and cabinet drying. The corn starch was obtained by cleaning, steeping, milling, sieving, dewatering and drying, and was evaluated for proximate composition, functional properties, and mineral content. Moisture, crude protein, crude fat, ash, and carbohydrate were in the ranges of 9.35 to 12.16, 6.5 to 10.78, 1.08 to 2.5, 4.0 to 5.2, and 69.58 to 75.8%, respectively. Bulk density ranged between 0.610 and 0.718 g/dm3, and water and oil absorption capacities ranged between 116.5 to 117.25 and 113.8 to 117.25 ml/g, respectively. Swelling power varied from 1.401 to 1.544 g/g. The results indicate that the cabinet method gave the best results in terms of the quality attributes.
Keywords: starch flour, maize, dehydration, cabinet dryer
Procedia PDF Downloads 238
15085 A Comparison of Sequential Quadratic Programming, Genetic Algorithm, Simulated Annealing, Particle Swarm Optimization for the Design and Optimization of a Beam Column
Authors: Nima Khosravi
Abstract:
This paper describes an integrated optimization technique with concurrent use of sequential quadratic programming (SQP), genetic algorithm, simulated annealing, and particle swarm optimization for the design and optimization of a beam column. In this research, these four types of optimization methods are compared. All the methods meet the required constraints, and the lowest value of the objective function is achieved by SQP, which was also the fastest optimizer to produce the results. SQP is a gradient-based optimizer, hence its results are usually the same after every run; the only thing which affects the results is the initial conditions given. The initial conditions varied widely between the test runs; hence, the value converged to different points. The rest of the methods are heuristics, which provide different values for different runs even if every parameter is kept constant.
Keywords: beam column, genetic algorithm, particle swarm optimization, sequential quadratic programming, simulated annealing
Procedia PDF Downloads 386
15084 Performance Comparison of AODV and Soft AODV Routing Protocol
Authors: Abhishek, Seema Devi, Jyoti Ohri
Abstract:
A mobile ad hoc network (MANET) is a system of wireless mobile nodes that can self-organize freely and dynamically into an arbitrary and temporary network topology. Unlike a wired network, a wireless network interface has a limited transmission range. Routing is the task of forwarding data packets from a source to a given destination. The Ad-hoc On-demand Distance Vector (AODV) routing protocol creates a path to a destination only when it is required. This paper describes the implementation of the AODV routing protocol using the MATLAB-based TrueTime simulator. In MANETs, node movements are not fixed but random in nature; hence, intelligent techniques, i.e. fuzzy logic and ANFIS, are used to optimize the transmission range. In this paper, we compared the transmission range of AODV, fuzzy AODV and ANFIS AODV. For soft computing AODV, we have taken transmitted power and received threshold as inputs and transmission range as output. ANFIS gives better results compared to fuzzy AODV.
Keywords: ANFIS, AODV, fuzzy, MANET, reactive routing protocol, routing protocol, truetime
Procedia PDF Downloads 498
15083 On Block Vandermonde Matrix Constructed from Matrix Polynomial Solvents
Authors: Malika Yaici, Kamel Hariche
Abstract:
In control engineering, systems described by matrix fractions are studied through the properties of block roots, also called solvents. These solvents are usually handled in block Vandermonde matrix form. Inverses and determinants of Vandermonde matrices and block Vandermonde matrices are used in solving problems of numerical analysis in many domains but require costly computations. Even though Vandermonde matrices are well known, and methods to compute their inverse and determinant are numerous and generally based on interpolation techniques, methods to compute the inverse and determinant of a block Vandermonde matrix have not been well studied. In this paper, some properties of these matrices and iterative algorithms to compute the determinant and the inverse of a block Vandermonde matrix are given. These methods are derived from partitioned matrix inversion and determinant computation methods. Due to the great size of these matrices, parallelization may be a solution to reduce the computational cost, so a parallelization of these algorithms is proposed and validated by a comparison using algorithmic complexity.
Keywords: block Vandermonde matrix, solvents, matrix polynomial, matrix inverse, matrix determinant, parallelization
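The construction of a block Vandermonde matrix from solvents can be sketched as follows, with a determinant computed by ordinary Gaussian elimination as a reference baseline (not the paper's partitioned iterative algorithm); the solvent matrices below are invented examples.

```python
def mat_pow(m, k):
    """k-th power of a square matrix (lists of lists), k >= 0."""
    n = len(m)
    r = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(k):
        r = [[sum(r[i][t] * m[t][j] for t in range(n)) for j in range(n)]
             for i in range(n)]
    return r

def block_vandermonde(solvents):
    """Block (i, j) of the result is solvents[j] raised to the power i."""
    L, n = len(solvents), len(solvents[0])
    V = [[0.0] * (L * n) for _ in range(L * n)]
    for i in range(L):
        for j in range(L):
            P = mat_pow(solvents[j], i)
            for a in range(n):
                for b in range(n):
                    V[i * n + a][j * n + b] = P[a][b]
    return V

def det(m):
    """Determinant by Gaussian elimination with partial pivoting."""
    a = [row[:] for row in m]
    n, d = len(a), 1.0
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(a[r][c]))
        if abs(a[p][c]) < 1e-12:
            return 0.0
        if p != c:
            a[c], a[p] = a[p], a[c]
            d = -d
        d *= a[c][c]
        for r in range(c + 1, n):
            f = a[r][c] / a[c][c]
            for k in range(c, n):
                a[r][k] -= f * a[c][k]
    return d
```

With scalar (1x1) "solvents" the construction collapses to the classical Vandermonde matrix, which is a convenient sanity check: for solvents 2 and 5 the determinant is 5 - 2 = 3.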
Procedia PDF Downloads 240
15082 Between Efficacy and Danger: Narratives of Female University Students about Emergency Contraception Methods
Authors: Anthony Idowu Ajayi, Ezebunwa Ethelbert Nwokocha, Wilson Akpan, Oladele Vincent Adeniyi
Abstract:
Studies on emergency contraception (EC) mostly utilise quantitative methods and focus on medically approved drugs for the prevention of unwanted pregnancies. This methodological bias necessarily obscures insider perspectives on sexual behaviour, particularly on why specific methods are utilized by women who seek to prevent unplanned pregnancies. In order to privilege this perspective, with a view to further enriching the discourse and policy on the prevention and management of unplanned pregnancies, this paper brings together the findings from several focus groups and in-depth interviews conducted amongst unmarried female undergraduate students in two Nigerian universities. The study found that while the research participants had good knowledge of the consequences of unprotected sexual intercourse - with abstinence and condoms widely used - participants’ willingness to rely only on medically sound measures to prevent unwanted pregnancies was not always mediated by such knowledge. Some of the methods favored by participants appeared to be those commonly associated with people of low socio-economic status in the society where the study was conducted. Medically unsafe concoctions, some outright dangerous, were widely believed to be efficacious in preventing unwanted pregnancy. Furthermore, respondents’ narratives about their sexual behaviour revealed that inadequate sex education, socio-economic pressures, and misconceptions about the efficacy of “crude” emergency contraception methods were all interrelated. The paper therefore suggests that these different facets of the unplanned pregnancy problem should be the focus of intervention.
Keywords: unplanned pregnancy, unsafe abortion, emergency contraception, concoctions
Procedia PDF Downloads 424
15081 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor
Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric
Abstract:
Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in reasonably short time, which requires first of all a powerful dynamic car detector model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach aims to compute HOG by block features from foreground blobs, using a configurable search window and pathway, in order to overcome the shortcoming of the HOG descriptor in terms of computing time and improve its performance in dynamic applications. Indeed, we show in this paper that the HOG-by-block descriptor combined with motion parameters is a very suitable car detector which reaches, in record time, a satisfactory recognition rate in dynamic outdoor scenes and surpasses several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
Keywords: car-detector, HOG, motion, computing time
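The HOG-by-block idea, computing one orientation histogram per block rather than a dense descriptor over the whole frame, can be illustrated on a grayscale image stored as a list of rows. This is a minimal sketch with invented parameters, not the authors' optimized implementation (no cell subdivision, block normalization, or blob extraction).

```python
import math

def hog_block(img, x0, y0, w, h, bins=9):
    """Unnormalized gradient-orientation histogram over one image block.
    img: grayscale image as a list of rows; central differences, unsigned
    orientations folded into [0, 180) degrees, magnitude-weighted votes."""
    hist = [0.0] * bins
    for y in range(max(1, y0), min(len(img) - 1, y0 + h)):
        for x in range(max(1, x0), min(len(img[0]) - 1, x0 + w)):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / 180.0 * bins) % bins] += mag
    return hist

# A vertical intensity edge: all gradient energy lands in the 0-degree bin.
edge = [[0.0, 0.0, 10.0, 10.0] for _ in range(4)]
hist = hog_block(edge, 0, 0, 4, 4)
```

Restricting this computation to foreground blobs from motion segmentation is what cuts the computing time relative to scanning the full frame.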
Procedia PDF Downloads 323
15080 Sleep Apnea Hypopnea Syndrome Diagnosis Using Advanced ANN Techniques
Authors: Sachin Singh, Thomas Penzel, Dinesh Nandan
Abstract:
Accurate diagnosis of Sleep Apnea Hypopnea Syndrome is a difficult problem for human experts because of variability among persons and unwanted noise. This paper proposes the diagnosis of Sleep Apnea Hypopnea Syndrome (SAHS) using airflow, ECG, pulse and SaO2 signals. The features of each type of signal are extracted using statistical methods and ANN learning methods, and the extracted features are used to approximate the patient's Apnea Hypopnea Index (AHI) from sample signals in the model. Advanced signal processing is also applied to the snore sound signal to locate snore events, and the SaO2 signal is used to confirm whether a detected snore event is true or noise. Finally, the Apnea Hypopnea Index (AHI) is calculated from the true snore events detected. Experimental results show that the sensitivity can reach up to 96% and the specificity up to 96% for AHI greater than or equal to 5.
Keywords: neural network, AHI, statistical methods, autoregressive models
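The final step, turning confirmed events into an AHI, is a simple rate computation. A minimal sketch follows, with the common clinical severity cut-offs added for context (the abstract itself only uses the AHI >= 5 decision threshold):

```python
def apnea_hypopnea_index(event_starts_sec, total_sleep_hours):
    """AHI = number of confirmed apnea/hypopnea events per hour of sleep."""
    return len(event_starts_sec) / total_sleep_hours

def classify_ahi(ahi):
    """Common clinical severity bands; AHI >= 5 is the decision threshold
    used in the abstract."""
    if ahi < 5:
        return "normal"
    if ahi < 15:
        return "mild"
    if ahi < 30:
        return "moderate"
    return "severe"

# 42 confirmed events (invented timestamps) over 6 hours of sleep -> AHI 7.
ahi = apnea_hypopnea_index(list(range(42)), 6.0)
```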
Procedia PDF Downloads 119
15079 Machine Vision System for Measuring the Quality of Bulk Sun-dried Organic Raisins
Authors: Navab Karimi, Tohid Alizadeh
Abstract:
An intelligent vision-based system was designed to measure the quality and purity of raisins. A machine vision setup was utilized to capture images of bulk raisins in mixtures of 5-50% pure-impure berries. The textural features of the bulk raisins were extracted using grey-level histograms, the co-occurrence matrix, and local binary patterns (a total of 108 features). A genetic algorithm and neural network regression were used for selecting and ranking the best features (21 features). The GLCM feature set was found to have the highest accuracy (92.4%) among the sets. Subsequently, multiple feature combinations from the previous stage were fed into a second regression (linear regression) to increase accuracy, wherein a combination of 16 features was found to be optimal. Finally, a Support Vector Machine (SVM) classifier was used to differentiate the mixtures, producing the best efficiency and accuracy of 96.2% and 97.35%, respectively.
Keywords: sun-dried organic raisin, genetic algorithm, feature extraction, ANN regression, linear regression, support vector machine, South Azerbaijan
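Two common GLCM-derived textural features, contrast and homogeneity, can be sketched in a few lines. This is an illustrative re-implementation with an invented 4-level image, not the authors' 108-feature pipeline.

```python
def glcm(img, dx=1, dy=0, levels=4):
    """Normalized grey-level co-occurrence matrix for offset (dx, dy).
    img: 2D list of integer grey levels in [0, levels)."""
    P = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(len(img)):
        for x in range(len(img[0])):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < len(img) and 0 <= x2 < len(img[0]):
                P[img[y][x]][img[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in P]

def glcm_contrast(P):
    """Large when neighboring pixels differ strongly."""
    n = len(P)
    return sum(P[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def glcm_homogeneity(P):
    """Close to 1 for smooth, uniform texture."""
    n = len(P)
    return sum(P[i][j] / (1 + abs(i - j)) for i in range(n) for j in range(n))

# A perfectly uniform patch: zero contrast, homogeneity of 1.
P = glcm([[1, 1], [1, 1]], dx=1, dy=0, levels=4)
```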
Procedia PDF Downloads 73
15078 Producing TPU/Propolis Nanofibrous Membrane as Wound Dressing
Authors: Yasin Akgül, Yusuf Polat, Emine Canbay, Ali Kılıç
Abstract:
Wound dressings have strategic and economic importance considering the increase of chronic wounds in the world. In this study, TPU nanofibrous membranes containing propolis were produced as wound dressings by two different methods. In the first, a TPU solution and propolis extract were mixed and the solution was electrospun. In the second, the TPU/propolis blend was centrifugally spun. The properties of the nanofibrous membranes obtained by these methods were compared. In the experiments, both systems were optimized to produce nanofibers with nearly the same average fiber diameter.
Keywords: nanofiber, wound dressing, electrospinning, centrifugal spinning
Procedia PDF Downloads 455
15077 Hybrid Subspace Approach for Time Delay Estimation in MIMO Systems
Authors: Mojtaba Saeedinezhad, Sarah Yousefi
Abstract:
In this paper, we present a hybrid subspace approach for Time Delay Estimation (TDE) in multivariable systems. While several methods have been proposed for time delay estimation in SISO systems, delay estimation in MIMO systems has always been a big challenge. In these systems, the existing TDE methods have significant limitations because most procedures are based only on system response estimation or correlation analysis. We introduce a new hybrid method for TDE in MIMO systems based on subspace identification and the explicit output error method, and compare its performance with previously introduced procedures in the presence of different noise levels and in a statistical manner. The best method is then selected with a multi-objective decision-making technique. It is shown that the performance of the new approach is much better than that of the existing methods, even in low signal-to-noise conditions.
Keywords: system identification, time delay estimation, ARX, OE, merit ratio, multi variable decision making
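For contrast with the paper's subspace/output-error approach, the classical correlation-analysis baseline it improves upon can be sketched as a cross-correlation maximizer over candidate lags. This is a SISO illustration with invented signals, not the paper's MIMO method.

```python
def estimate_delay(x, y):
    """Estimate the lag (in samples) at which y best matches a delayed x,
    by maximizing the cross-correlation over non-negative lags."""
    best_lag, best_corr = 0, float("-inf")
    for lag in range(len(y)):
        c = sum(x[i] * y[i + lag] for i in range(len(x) - lag))
        if c > best_corr:
            best_lag, best_corr = lag, c
    return best_lag

# Invented input, and the same signal delayed by 3 samples.
clean = [0.0, 0.0, 1.0, 2.0, 3.0, 0.0, 0.0, 0.0]
delayed = [0.0] * 3 + clean[:-3]
lag = estimate_delay(clean, delayed)
```

The MIMO difficulty the paper addresses is precisely that such per-channel correlation peaks become ambiguous when several inputs drive several coupled outputs.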
Procedia PDF Downloads 346
15076 Automatic Diagnosis of Electrical Equipment Using Infrared Thermography
Authors: Y. Laib Dit Leksir, S. Bouhouche
Abstract:
Analysis and processing of databases resulting from infrared thermal measurements made on electrical installations require the development of new tools in order to obtain correct information additional to the visual inspections. Consequently, methods based on the capture of infrared digital images show great potential and are employed increasingly in various fields. However, there is an enormous need for effective techniques to analyse these databases in order to extract relevant information on the state of the equipment. Our goal consists in introducing recent modeling techniques based on new methods of image and signal processing to develop mathematical models in this field. The aim of this work is to capture the anomalies existing in electrical equipment during an inspection of some machines using an A40 FLIR camera. We then use binarisation techniques to select the region of interest, and compare the thermal images obtained with each method to choose the best one.
Keywords: infrared thermography, defect detection, troubleshooting, electrical equipment
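The binarisation step for selecting the region of interest amounts to thresholding the temperature map and taking the bounding box of the hot pixels. A minimal sketch with an invented 3x3 thermal frame and threshold:

```python
def binarize(img, threshold):
    """Threshold a thermal image (2D list of temperatures, deg C)."""
    return [[1 if v >= threshold else 0 for v in row] for row in img]

def roi_bbox(mask):
    """Bounding box (ymin, xmin, ymax, xmax) of hot pixels, or None."""
    pts = [(y, x) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    ys = [p[0] for p in pts]
    xs = [p[1] for p in pts]
    return min(ys), min(xs), max(ys), max(xs)

# Invented frame: a hot spot (a loose connection, say) in the lower right.
frame = [[20.0, 20.0, 20.0],
         [20.0, 80.0, 85.0],
         [20.0, 20.0, 90.0]]
bbox = roi_bbox(binarize(frame, 60.0))
```

In practice the threshold would be chosen adaptively (e.g., Otsu's method) rather than fixed, since ambient temperature varies between inspections.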
Procedia PDF Downloads 476
15075 Advanced Particle Characterisation of Suspended Sediment in the Danube River Using Automated Imaging and Laser Diffraction
Authors: Flóra Pomázi, Sándor Baranya, Zoltán Szalai
Abstract:
A harmonized monitoring of suspended sediment transport along such a large river as the world’s most international river, the Danube River, is a rather challenging task. The traditional monitoring method in Hungary is obsolete, but indirect measurement devices and techniques like optical backscatter sensors (OBS), laser diffraction or acoustic backscatter sensors (ABS) could provide a fast and efficient alternative to direct methods. However, these methods are strongly sensitive to the particle characteristics (i.e. particle shape, particle size and mineral composition). The current method does not provide sufficient information about particle size distribution, mineral analysis is rarely done, and the shape of the suspended sediment particles has not been examined yet. The aims of the study are (1) to determine the particle characteristics of suspended sediment in the Danube River using advanced particle characterisation methods, namely laser diffraction and automated imaging, and (2) to perform a sensitivity analysis of the indirect methods in order to determine the impact of suspended particle characteristics. The particle size distribution is determined by laser diffraction. The particle shape and mineral composition analysis is done by the Morphologi G3ID image analyser. The investigated indirect measurement devices are the LISST-Portable|XR, the LISST-ABS (Sequoia Inc.) and the Rio Grande 1200 kHz ADCP (Teledyne Marine).
The major findings of this study are (1) the statistical shape of the suspended sediment particles - this is the first research in this context, (2) the updated particle size distribution, which can be compared to historical information so that morphological changes can be tracked, (3) the actual mineral composition of the suspended sediment in the Danube River, and (4) the increased reliability of the tested indirect methods, based on the results of the sensitivity analysis and the previous findings.
Keywords: advanced particle characterisation, automated imaging, indirect methods, laser diffraction, mineral composition, suspended sediment
Procedia PDF Downloads 146
15074 Digital Twin Platform for BDS-3 Satellite Navigation Using Digital Twin Intelligent Visualization Technology
Authors: Rundong Li, Peng Wu, Junfeng Zhang, Zhipeng Ren, Chen Yang, Jiahui Gan, Lu Feng, Haibo Tong, Xuemei Xiao, Yuying Chen
Abstract:
Research on Beidou-3 satellite navigation is on the rise, but in practice satellite data can be insecure, research and development can be inefficient, and there is no ability to deal with failures in advance. Digital twin technology has obvious advantages in simulating the life-cycle models of aerospace satellite navigation products. In order to meet the increasing demand, this paper builds a Beidou-3 satellite navigation digital twin platform (BDSDTP). The basic establishment of the BDSDTP was completed by establishing a digital twin counterpart, a Beidou-3 comprehensive digital twin design, a predictive maintenance (PdM) mathematical model, and a visual interaction design. Finally, this paper provides an application case of the platform, which provides a reference for the application of the BDSDTP in various fields of navigation and offers evident help in extending the full life cycle of Beidou-3 satellite navigation.
Keywords: BDS-3, digital twin, visualization, PdM
Procedia PDF Downloads 142
15073 Comparative Analysis of Glycated Hemoglobin (HbA1c) Between HPLC and Immunoturbidimetry Method in Type II Diabetes Mellitus Patients
Authors: Intanri Kurniati, Raja Iqbal Mulya Harahap, Agustyas Tjiptaningrum, Reni Zuraida
Abstract:
Background: Diabetes mellitus is still increasing and has become a health and social burden in the world. It is known that glycation of various proteins is increased in diabetic patients compared with non-diabetic subjects. Some of these glycated proteins are suggested to be involved in the development and progression of chronic diabetic complications. Among these glycated proteins, glycated hemoglobin (HbA1c) is commonly used as the gold-standard index of glycemic control in the clinical setting. HbA1c testing has several methods, and the most commonly used is immunoturbidimetry. This research aimed to compare HbA1c levels between the immunoturbidimetry and HPLC methods in T2DM patients. Methods: This research involved 77 patients from Abd Muluk Hospital Bandar Lampung; the patients were asked for consent, then underwent phlebotomy to be tested for HbA1c; the samples were then examined for HbA1c with the Turbidimetric Inhibition Immunoassay (TINIA) and High-Performance Liquid Chromatography (HPLC) methods. Result: Mean ± SD of the samples with the TINIA method was 9.2 ± 1.2, while the HbA1c level with the HPLC method was 9.6 ± 1.2. The t-test showed no significant difference between the groups of subjects (p > 0.05). It was proposed that the two methods have high agreement, and both are eligible for use with patients. Discussion: There was no significant difference among research subjects, indicating that the high conformity of the two methods makes them suitable for monitoring patients clinically. Conclusion: HbA1c levels are increased in patients with T2DM measured with the HPLC and/or Turbidimetric Inhibition Immunoassay (TINIA) methods, and there were no significant differences between those methods.
Keywords: diabetes mellitus, glycated albumin, HbA1c, HPLC, immunoturbidimetry
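The comparison of two methods on the same subjects rests on a paired t statistic. A minimal sketch with invented HbA1c values (not the study's data):

```python
import math

def paired_t(a, b):
    """Paired t statistic for two measurement methods on the same subjects.
    Compare |t| against the t distribution with len(a) - 1 degrees of
    freedom to decide significance."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)

# Invented paired HbA1c readings (%) from the two methods.
hplc = [9.0, 9.5, 10.0, 8.5]
tinia = [9.1, 9.4, 10.1, 8.6]
t_stat = paired_t(hplc, tinia)
```

A small |t| (here 1.0, well below typical critical values) is what "no significant difference between the methods" means operationally; for method agreement a Bland-Altman analysis would be the usual complement.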
Procedia PDF Downloads 99
15072 Crossing Multi-Source Climate Data to Estimate the Effects of Climate Change on Evapotranspiration Data: Application to the French Central Region
Authors: Bensaid A., Mostephaoui T., Nedjai R.
Abstract:
Climatic factors are the subject of considerable research, both methodologically and instrumentally. Under the effect of climate change, approaching climate parameters with precision remains one of the main objectives of the scientific community, from the perspective of assessing climate change and its repercussions on humans and the environment. However, many regions of the world suffer from a severe lack of reliable instruments, and the use of empirical methods becomes the only way to assess certain parameters that can act as climate indicators. Several scientific methods are used for the evaluation of evapotranspiration, either directly at the climatic stations or by empirical methods. All these methods provide a point estimate and in no case capture the spatial variation of this parameter. We therefore propose in this paper the use of three sources of information (the network of weather stations of Meteo France, world databases, and MODIS satellite images) to evaluate spatial evapotranspiration (ETP) using the Turc method. This first step will reflect the degree of relevance of the indirect (satellite) methods and their generalization to sites without stations. Representing the spatial variation of this parameter using a geographical information system (GIS) accounts for the heterogeneity of the behaviour of this parameter. This heterogeneity is due to the influence of site morphological factors and will make it possible to appreciate the role of certain topographic and hydrological parameters. A phase of predicting the medium- and long-term evolution of evapotranspiration under the effect of climate change, by applying the Intergovernmental Panel on Climate Change (IPCC) scenarios, gives a realistic overview of the contribution of aquatic systems at the scale of the region.
Keywords: climate change, ETP, MODIS, IPCC scenarios
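A common monthly form of the Turc formula is ETP = 0.40 · t/(t+15) · (Rg + 50), with t the mean air temperature (°C) and Rg the global radiation (cal/cm²/day), plus a correction when relative humidity falls below 50%. The coefficient and correction below are the textbook values as we recall them; the paper may use a different variant, so treat this as an illustrative sketch.

```python
def turc_monthly_etp(t_mean_c, rg_cal_cm2_day, rh_percent=60.0):
    """Monthly potential evapotranspiration (mm/month), Turc formula.
    t_mean_c: mean monthly air temperature (deg C);
    rg_cal_cm2_day: mean global radiation (cal/cm2/day);
    rh_percent: mean relative humidity; the correction applies below 50 %."""
    etp = 0.40 * (t_mean_c / (t_mean_c + 15.0)) * (rg_cal_cm2_day + 50.0)
    if rh_percent < 50.0:
        etp *= 1.0 + (50.0 - rh_percent) / 70.0
    return etp

# Invented station values: 15 deg C mean temperature, 300 cal/cm2/day.
etp_humid = turc_monthly_etp(15.0, 300.0)            # humid month
etp_dry = turc_monthly_etp(15.0, 300.0, rh_percent=40.0)  # dry month
```

Applying this cell by cell, with t from interpolated station data and Rg from the satellite product, is what turns the point formula into the spatial ETP surface described above.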
Procedia PDF Downloads 100
15071 Big Data-Driven Smart Policing: Big Data-Based Patrol Car Dispatching in Abu Dhabi, UAE
Authors: Oualid Walid Ben Ali
Abstract:
Big Data has become one of the buzzwords today. The recent explosion of digital data has led organizations, either private or public, into a new era of more efficient decision making. At some point, businesses decided to use that concept to learn what makes their clients tick, with phrases like ‘sales funnel’ analysis, ‘actionable insights’, and ‘positive business impact’. So, it stands to reason that Big Data was viewed through green (read: money) colored lenses. Somewhere along the line, however, someone realized that collecting and processing data doesn’t have to be for business purposes only, but can also be used to assist law enforcement, improve policing, or improve road safety. This paper presents briefly how Big Data has been used in the field of policing in order to improve the decision making process in the daily operation of the police. As an example, we present a big-data-driven system which is used to accurately dispatch patrol cars in a geographic environment. The system is also used to allocate, in real time, the nearest patrol car to the location of an incident. This system has been implemented and applied in the Emirate of Abu Dhabi in the UAE.
Keywords: big data, big data analytics, patrol car allocation, dispatching, GIS, intelligent, Abu Dhabi, police, UAE
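Nearest-car allocation reduces to a minimum-distance query over the free patrol cars. With raw coordinates this can be sketched with the haversine great-circle distance; the coordinates below are invented, and a production dispatcher would use road-network travel time rather than straight-line distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_car(cars, incident):
    """cars: {car_id: (lat, lon)}; returns the id of the closest free car."""
    return min(cars, key=lambda c: haversine_km(*cars[c], *incident))

# Invented positions of two free patrol cars and one incident.
cars = {"P1": (24.45, 54.38), "P2": (24.50, 54.60)}
chosen = nearest_car(cars, (24.46, 54.37))
```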
Procedia PDF Downloads 490
15070 Gamification Using Stochastic Processes: Engage Children to Have Healthy Habits
Authors: Andre M. Carvalho, Pedro Sebastiao
Abstract:
This article is based on a dissertation that analyzes and models, using intelligent algorithms based on stochastic processes, a gamification application applied to marketing. Gamification is used in our daily lives to engage us to perform certain actions in order to achieve goals and gain rewards. This strategy is an increasingly adopted way to encourage and retain customers through game elements. The application of gamification aims to encourage children between 6 and 10 years of age to have healthy habits, and the purpose is to serve as a model for use in marketing. This application was developed in Unity; we implemented intelligent algorithms based on stochastic processes, web services to respond to all requests of the application, a back-office website to manage the application, and the database. A behavioral analysis of the use of game elements and stochastic processes in children’s motivation was done. The application of algorithms based on stochastic processes to game elements is very important to promote cooperation and ensure fair and friendly competition between users, which consequently stimulates the users’ interest and their involvement in the application and organization.
Keywords: engage, games, gamification, randomness, stochastic processes
Procedia PDF Downloads 331
15069 Analytical Solving of Nonlinear Differential Equations in the Nonlinear Phenomena for Viscous Fluids
Authors: Arash Jafari, Mehdi Taghaddosi, Azin Parvin
Abstract:
In this paper, our purpose is to enhance the ability to solve a nonlinear differential equation describing the motion of an incompressible fluid flowing down an inclined plane without thermal effects, with a simple and innovative approach which we have named the new method. Comparisons are made among the numerical, new method, and HPM solutions, and the results reveal that this method is very effective and simple and can be applied to other nonlinear problems. It is noteworthy that there are some valuable advantages in this way of solving differential equations, and most sets of differential equations can be solved in this manner, whereas with the other methods they do not have acceptable solutions up to now. A summary of the advantages of this method in comparison to the other approaches is as follows: 1) Differential equations are directly solvable by this method. 2) We can solve the equation(s) without any nondimensionalization procedure. 3) It is not necessary to convert variables into new ones. According to the afore-mentioned assertions, which will be proved in this case study, the process of solving nonlinear equation(s) will be very easy and convenient in comparison to the other methods.
Keywords: viscous fluid, incompressible fluid flow, inclined plane, nonlinear phenomena
Procedia PDF Downloads 283
15068 Comparison of Safety Factor Evaluation Methods for Buckling of High Strength Steel Welded Box Section Columns
Authors: Balazs Somodi, Balazs Kovesdi
Abstract:
In the research praxis of civil engineering, the statistical evaluation of experimental and numerical investigations is an essential task in order to compare the experimental and numerical resistances of a specific structural problem with the resistances proposed by the standards. However, in the standards and in the international literature there are several different safety factor evaluation methods that can be used to check the necessary safety level (e.g.: 5% quantile level, 2.3% quantile level, 1‰ quantile level, γM partial safety factor, γM* partial safety factor, β reliability index). Moreover, in the international literature, different calculation methods can be found even for the same safety factor. In the present study, the flexural buckling resistance of high strength steel (HSS) welded closed sections is analyzed. The authors investigated the flexural buckling resistances of the analyzed columns by laboratory experiments. The safety levels of the obtained experimental resistances are calculated based on several safety approaches and compared with EN 1990. The results of the different safety approaches are compared and evaluated. Based on the evaluation, tendencies are identified and the differences between the statistical evaluation methods are explained.
Keywords: flexural buckling, high strength steel, partial safety factor, statistical evaluation
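The 5% and 2.3% quantile checks mentioned above can be sketched as lower fractiles of a normal fit to the experimental-to-predicted resistance ratios. The ratios below are invented, and EN 1990 Annex D prescribes a more detailed procedure (with sample-size-dependent fractile factors), so this is only an illustrative simplification.

```python
import math

def characteristic_fractile(ratios, k=1.645):
    """Lower fractile of a normal fit to resistance ratios
    (experimental / predicted). k = 1.645 gives the 5 % quantile,
    k = 2.0 roughly the 2.3 % quantile; known-sigma case assumed."""
    n = len(ratios)
    mean = sum(ratios) / n
    std = math.sqrt(sum((r - mean) ** 2 for r in ratios) / (n - 1))
    return mean - k * std

# Invented test-to-prediction ratios for a set of buckling experiments.
ratios = [1.0, 1.1, 0.9, 1.0]
q5 = characteristic_fractile(ratios)          # 5 % quantile
q23 = characteristic_fractile(ratios, k=2.0)  # ~2.3 % quantile
```

A fractile well above the design value implied by the standard's γM would indicate the standard is conservative for these columns; the different safety approaches in the study differ mainly in which fractile or index plays this role.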
Procedia PDF Downloads 160
15067 Competitive Intelligence within the Maritime Security Intelligence
Authors: Dicky R. Munaf, Ayu Bulan Tisna
Abstract:
Competitive intelligence (business intelligence) is the process of observing the external environment, often conducted by organizations to obtain relevant information that will be used to shape organizational policy. Security intelligence, by contrast, relates to the function of officers whose duty is to protect the country and its people from criminal actions that might harm national and individual security. The intelligence dimension of maritime security is therefore associated with all intelligence activities, including the subjects and objects connected to maritime issues. The concept of business intelligence from the maritime security perspective is the effort to protect maritime security using the analysis of economic movements as the basis of strategic planning. Clearly, weak maritime security imposes high operational costs on all economic activities that use the sea as their medium, and thus reduces the competitiveness of a country compared to countries that are able to maintain maritime law enforcement and secure their marine territory. Business intelligence within security intelligence is therefore important as the first step in identifying opponent strategies, present or future. On this basis, scenarios of the potential impact of illegal maritime activities, as well as strategies for preventing opponent maneuvers, can be developed.
Keywords: competitive intelligence, maritime security intelligence, intelligent systems, information technology
Procedia PDF Downloads 500
15066 Self-Awareness on Social Work Courses: A Study of Students Perceptions of Teaching Methods in an English University
Authors: Deborah Amas
Abstract:
Global accreditation standards require higher education institutions to ensure social work students develop self-awareness by reflecting on their personal values and critically evaluating how these influence their thinking for professional practice. The knowledge base indicates there are benefits and vulnerabilities for students when they self-reflect, and more needs to be understood about the learning environments that nurture self-awareness. The connection between teaching methods and self-awareness is of interest in this paper, which reports findings from an online survey of students on BA and MA qualifying social work programs in an English university (n=120). Students were asked about the importance of self-awareness and their experiences of teaching methods for self-reflection. Generally, students thought that self-awareness is of high importance in their education. Students also shared stories that illuminated deeper feelings about the potential risks associated with self-disclosure. The findings indicate that students appreciate safe opportunities for self-reflection but can be wary of associated assessments or of feeling judged. The research supports arguments to qualitatively improve the facilitation of self-awareness through the curriculum.
Keywords: reflection, self-awareness, self-reflection, social work education
Procedia PDF Downloads 300
15065 Low Cost Technique for Measuring Luminance in Biological Systems
Abstract:
In this work, the relationship between the melanin content in a tissue and the subsequent absorption of light through that tissue was determined using a digital camera. This technique proved to be simple, cost-effective, efficient, and reliable. Tissue phantom samples were created using milk and soy sauce to simulate the optical properties of melanin in human tissue; increasing the concentration of soy sauce in the milk corresponded to an increase in an individual's melanin content. Two methods were employed to measure the light transmitted through the sample. The first was direct measurement of the transmitted intensity using a conventional lux meter. The second method involved correctly calibrating an ordinary digital camera and using image analysis software to calculate the intensity transmitted through the phantom. The results from these methods were then graphically compared to the theoretical relationship between the intensity of transmitted light and the concentration of absorbers in the sample. Conclusions were then drawn about the effectiveness and efficiency of these low cost methods.
Keywords: tissue phantoms, scattering coefficient, albedo, low-cost method
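The theoretical relationship referred to above is commonly modelled by the Beer-Lambert law, I = I₀·exp(−μc). A minimal sketch of the comparison step, using synthetic calibration data in place of the paper's measured pixel intensities (the concentrations, I₀, and μ values here are illustrative):

```python
import math

# Beer-Lambert attenuation: I = I0 * exp(-mu * c), where c is the absorber
# concentration (soy sauce as a melanin stand-in) and mu a lumped coefficient.
I0_true, mu_true = 250.0, 3.0
concentrations = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
intensities = [I0_true * math.exp(-mu_true * c) for c in concentrations]

# Linearize (ln I = ln I0 - mu * c) and fit by ordinary least squares,
# as one would fit mean pixel intensities extracted from the camera images.
xs, ys = concentrations, [math.log(i) for i in intensities]
n = len(xs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
mu_fit = -slope
I0_fit = math.exp(y_mean - slope * x_mean)
```

Agreement between the fitted curve and the lux-meter readings is what validates the calibrated-camera method.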
Procedia PDF Downloads 271
15064 Methods for Early Detection of Invasive Plant Species: A Case Study of Hueston Woods State Nature Preserve
Authors: Suzanne Zazycki, Bamidele Osamika, Heather Craska, Kaelyn Conaway, Reena Murphy, Stephanie Spence
Abstract:
Invasive Plant Species (IPS) are an important consideration in the effective preservation and conservation of natural lands. IPS are non-native plants that can aggressively encroach upon native species and pose a significant threat to the ecology, public health, and social welfare of a community. The presence of IPS in U.S. nature preserves has caused economic costs estimated to exceed $26 billion a year. While different methods have been identified to control IPS, few methods have been recognized for the early detection of IPS. This study examined methods for the early detection of IPS in Hueston Woods State Nature Preserve. A mixed methods research design was adopted in this four-phased study. The first phase entailed data gathering and described the characteristics and qualities of IPS and the importance of early detection (ED). The second phase explored ED methods; Geographic Information Systems (GIS) and citizen science were identified as ED methods for IPS. The third phase involved the creation of hotspot maps to identify likely areas of IPS growth, while the fourth phase involved testing and evaluating mobile applications that can support the efforts of citizen scientists in IPS detection. Literature reviews were conducted on IPS and ED methods, and four regional experts from the Ohio Department of Natural Resources (ODNR) and Miami University were interviewed. A questionnaire was used to gather information about ED methods used across the state. The findings revealed that geospatial methods, including Unmanned Aerial Vehicles (UAVs), Multispectral Satellites (MSS), and the Normalized Difference Vegetation Index (NDVI), are not feasible for the early detection of IPS, as they require GIS expertise, are still emerging technologies, and are not suitable for every habitat.
Therefore, other ED method options were explored, including predicting areas where IPS will grow, which can be done by monitoring areas that resemble the species' native habitat. Through the literature review and interviews, IPS were found to grow in frequently disturbed areas such as along trails, shorelines, and streambanks. The research team called these areas "hotspots" and created maps of them specifically for Hueston Woods State Nature Preserve to support and narrow the efforts of citizen scientists and staff in the ED of IPS. The results further showed that utilizing citizen scientists in the ED of IPS is feasible, especially through single-day events or passive monitoring challenges. The study concluded that hotspot maps directing the efforts of citizen scientists are effective for the early detection of IPS. Several recommendations were made, among which are the creation of hotspot maps to narrow ED efforts as citizen scientists continue to work in the preserves and the use of citizen science volunteers to identify and record emerging IPS.
Keywords: early detection, hueston woods state nature preserve, invasive plant species, hotspots
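The hotspot-mapping idea, buffering disturbed corridors such as trails, can be sketched in a few lines. The trail coordinates, buffer width, and grid resolution below are entirely hypothetical; a real implementation would use GIS layers of the preserve rather than a toy grid:

```python
import math

# Hypothetical trail vertices (x, y, in meters) inside the preserve; the
# study identifies disturbed corridors such as trails as likely hotspots.
trail = [(0, 0), (50, 20), (100, 60), (150, 60)]
BUFFER_M = 25.0   # assumed hotspot buffer width around the trail
CELL_M = 10.0     # grid resolution

def dist_point_segment(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

# Flag every grid cell whose centre lies within the buffer of any trail segment.
hotspots = set()
for i in range(16):             # 160 m x 80 m study window
    for j in range(8):
        centre = (i * CELL_M + CELL_M / 2, j * CELL_M + CELL_M / 2)
        if any(dist_point_segment(centre, trail[k], trail[k + 1]) <= BUFFER_M
               for k in range(len(trail) - 1)):
            hotspots.add((i, j))
```

The flagged cells are the areas where citizen-science survey effort would be concentrated.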
Procedia PDF Downloads 103
15063 Transforming Personal Healthcare through Patient Engagement: An In-Depth Analysis of Tools and Methods for the Digital Age
Authors: Emily Hickmann, Peggy Richter, Maren Kaehlig, Hannes Schlieter
Abstract:
Patient engagement is a cornerstone of high-quality care and essential for patients with chronic diseases to achieve improved health outcomes. Through digital transformation, the possibilities for engaging patients in their personal healthcare have multiplied. However, the exploitation of this potential is still lagging. To support the translation of patient engagement theory into practice, this paper's objective is to give a state-of-the-art overview of patient engagement tools and methods. A systematic literature review was conducted. Overall, 56 tools and methods were extracted and synthesized according to the four attributes of patient engagement, i.e., personalization, access, commitment, and therapeutic alliance. The results are discussed in terms of their potential to be implemented in digital health solutions under consideration of the "computers are social actors" (CASA) paradigm. It is concluded that digital health can catalyze patient engagement in practice, and a broad future research agenda is formulated.
Keywords: chronic diseases, digitalization, patient-centeredness, patient empowerment, patient engagement
Procedia PDF Downloads 117
15062 AI Tutor: A Computer Science Domain Knowledge Graph-Based QA System on JADE platform
Authors: Yingqi Cui, Changran Huang, Raymond Lee
Abstract:
In this paper, we propose an AI Tutor that uses ontology and natural language processing techniques to generate a computer science domain knowledge graph and answer users' questions based on that graph. We define eight types of relations to extract relationships between entities from computer science domain text. The AI Tutor is separated into two agents, a learning agent and a Question-Answer (QA) agent, and is developed on the JADE multi-agent system platform. The learning agent is responsible for reading text to extract information and generate a corresponding knowledge graph using defined patterns. The QA agent can understand users' questions and answer them based on the knowledge graph generated by the learning agent.
Keywords: artificial intelligence, natural language processing, knowledge graph, intelligent agents, QA system
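The learning-agent/QA-agent split can be illustrated with a toy triple store and a pattern-matched lookup. This is only a sketch: the abstract does not enumerate the eight relation types, so the relations, entities, and question pattern below are illustrative stand-ins, and the real system runs as JADE agents rather than plain functions.

```python
# Toy knowledge graph as (subject, relation, object) triples, standing in
# for the output of the learning agent's pattern-based extraction.
triples = [
    ("quicksort", "is_a", "sorting algorithm"),
    ("quicksort", "has_complexity", "O(n log n) average"),
    ("binary search tree", "is_a", "data structure"),
    ("node", "part_of", "binary search tree"),
]

def answer(entity, relation):
    """Look up all objects linked to `entity` by `relation`."""
    return [o for s, r, o in triples if s == entity and r == relation]

# A minimal QA-agent step: map one question pattern to a graph lookup.
def qa(question):
    q = question.lower().rstrip("?")
    if q.startswith("what is "):
        entity = q[len("what is "):]
        facts = answer(entity, "is_a")
        return f"{entity} is a {facts[0]}" if facts else "unknown"
    return "unknown"
```

In the full system the question-understanding step would use NLP rather than a fixed prefix match, but the graph lookup it bottoms out in has this shape.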
Procedia PDF Downloads 187
15061 Lessons Learned from Interlaboratory Noise Modelling in Scope of Environmental Impact Assessments in Slovenia
Abstract:
Noise assessment methods are regularly used within Environmental Impact Assessments for planned projects to predict the expected noise emissions of these projects. Different noise assessment methods can be used. In recent years, we have had the opportunity to collaborate in noise assessment procedures in which the noise assessments of different laboratories were performed simultaneously, and we identified some significant differences in the results between laboratories in Slovenia. We estimate that, although good georeferenced input data for setting up acoustic models exist in Slovenia, there is no clear consensus on methods for the predictive noise modelling of planned projects. We analyzed the input data, methods, and results of predictive noise modelling for two planned industrial projects, each carried out independently by two laboratories. We also analyzed the data, methods, and results of two interlaboratory collaborative noise models for two existing noise sources (a railway and a motorway). In the cases of predictive noise modelling, the acoustic models were validated by noise measurements of surrounding existing noise sources, but of varying durations; the acoustic characteristics of existing buildings were also not described identically, and the planned noise sources were described and digitized differently. Differences in noise assessment results between laboratories ranged up to 10 dBA, which considerably exceeds the acceptable uncertainty of 3 to 6 dBA. Contrary to predictive noise modelling, in the collaborative noise modelling of the two existing noise sources, the possibility of performing validation noise measurements of the existing sources greatly increased the comparability of the modelling results. In both cases of collaborative noise modelling, for the existing motorway and railway, the modelling results of the different laboratories were comparable.
Differences in noise modelling results between laboratories were below 5 dBA, which was the acceptable uncertainty set by the interlaboratory noise modelling organizer. The lessons learned from the study were: 1) Predictive noise calculation using the formulae of the international standard SIST ISO 9613-2:1997 is not an appropriate method to predict the noise emissions of planned projects, since, due to the complexity of the procedure, the formulae are not applied strictly; 2) Noise measurements are an important tool for minimizing the noise assessment errors of planned projects and, in the case of predictive noise modelling, should be performed at least for the validation of the acoustic model; 3) National guidelines should be issued on the appropriate data, methods, noise source digitalization, validation of the acoustic model, etc., in order to unify predictive noise models and their results within Environmental Impact Assessments for planned projects.
Keywords: environmental noise assessment, predictive noise modelling, spatial planning, noise measurements, national guidelines
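The ISO 9613-2 calculation chain begins with the geometric divergence term, A_div = 20·lg(d/d₀) + 11 dB (d₀ = 1 m). A minimal sketch of that one term and of the interlaboratory-spread check, with illustrative numbers (the real standard adds ground effect, air absorption, and screening terms, all omitted here):

```python
import math

def a_div(d, d0=1.0):
    """Geometric divergence attenuation (dB) per ISO 9613-2: 20*lg(d/d0) + 11."""
    return 20 * math.log10(d / d0) + 11

def combine_levels(levels_db):
    """Energetic sum of several sound pressure levels in dB."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

# Example: a point source with sound power level Lw = 100 dB heard at 200 m;
# only the divergence term of the standard's attenuation chain is applied.
Lw = 100.0
Lp = Lw - a_div(200.0)

# Interlaboratory comparison: flag results whose spread exceeds the
# acceptable uncertainty (3-6 dBA per the text; 5 dBA used here).
lab_results = [52.1, 54.8, 49.9]
spread = max(lab_results) - min(lab_results)
acceptable = spread <= 5.0
```

It is precisely the terms omitted above, and the discretion involved in applying them, that make strict use of the standard difficult in practice.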
Procedia PDF Downloads 234
15060 French Language Teaching in Nigeria and Future with Technology
Authors: Chidiebere Samuel Ijeoma
Abstract:
The impact and importance of technology in all domains of existence cannot be overemphasized. It is like a double-edged sword, capable of being both constructive and destructive. The paper therefore seeks to evaluate the impact of technology so far on the teaching and learning of the French language in Nigeria. According to the study, the traditional methods of teaching French as a foreign language, recognized as our cultural methods of knowledge transfer, are fast being replaced by digitalization in teaching; the research portrays this trend and suggests the best way forward. In the Nigerian primary education system, the use of some local and cultural instructional materials (teaching aids) is now almost history, which the paper laments. Consequently, the study asks: where are the chalks and blackboards? Where are the 'handworks' (local brooms) submitted by school children as part of their continuous assessment? Finally, the research is in no way against the application of technology in the Nigerian French language teaching system but tries to draw a distinction between technological methods of teaching French as a foreign language and the original Nigerian system of teaching the language before the arrival of technology.
Keywords: French language teaching, future, impact, importance of technology
Procedia PDF Downloads 356
15059 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission
Authors: Tingwei Shu, Dong Zhou, Chengjun Guo
Abstract:
Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission in situations of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has further matured and is gradually being applied in fields such as the Internet of Things, unmanned aerial vehicle cluster communication, and remote sensing. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitter, we need to extract the semantic information of the remote sensing images, but there are some problems: a traditional semantic communication system based on a Convolutional Neural Network (CNN) cannot take into account both the global and local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN, to extract the image semantic features. In this paper, we first perform pre-processing operations on the remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose each image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the preprocessed image. We adopt the improved Vision-Transformer structure as the semantic coder to extract and transmit the semantic information of the remote sensing images.
The Vision-Transformer structure can better handle the huge data volume and extract better image semantic features, adopting a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and with image coding methods such as BPG and JPEG to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
Keywords: semantic communication, transformer, wavelet transform, data processing
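The wavelet pre-processing step rests on a sub-band split like the one below. This is only a minimal sketch: a one-level 1-D Haar analysis/synthesis pair standing in for the paper's 2-D transform, with the interpolation of the sub-bands omitted; the sample row of pixel values is illustrative.

```python
# Minimal one-level 1-D Haar split into low-frequency (averages) and
# high-frequency (differences) bands, and its exact inverse.
def haar_decompose(signal):
    """Split a signal of even length into low- and high-frequency bands."""
    low = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    high = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return low, high

def haar_reconstruct(low, high):
    """Invert the split: each pair is recovered as (l + h, l - h)."""
    out = []
    for l, h in zip(low, high):
        out += [l + h, l - h]
    return out

row = [52.0, 54.0, 60.0, 58.0, 61.0, 65.0, 70.0, 68.0]
low, high = haar_decompose(row)
restored = haar_reconstruct(low, high)
```

In the paper's pipeline the two bands would be interpolated (bilinearly and bicubically, respectively) before the inverse transform, yielding the resolution-enhanced image fed to the Vision-Transformer encoder.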
Procedia PDF Downloads 78