Search results for: software assessment approach
20616 Enhanced Face Recognition with Daisy Descriptors Using 1BT Based Registration
Authors: Sevil Igit, Merve Meric, Sarp Erturk
Abstract:
In this paper, a novel One-Bit Transform (1BT) based pre-registration approach is proposed to improve Daisy descriptor based face recognition. The 1BT based pre-registration procedure is fast and has low computational complexity. It is shown that face recognition accuracy improves with the proposed approach, which enables highly accurate face recognition using the Daisy descriptor with simple matching and thereby keeps the overall complexity low.
Keywords: face recognition, Daisy descriptor, One-Bit Transform, image registration
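The abstract gives no implementation details, but the core 1BT idea it relies on — binarize each image against a locally filtered version, then register by minimizing the XOR count between bit planes, which is what makes the method cheap — can be sketched as follows. The k×k box-filter threshold and the exhaustive integer-shift search are simplifying assumptions for illustration (the published 1BT uses a specific multi-band-pass kernel), not the authors' code.

```python
import numpy as np

def one_bit_transform(image, k=5):
    """Binarize an image by comparing each pixel to its local mean.

    Simplified 1BT: the classic formulation filters with a
    multi-band-pass kernel; a k x k box filter stands in here.
    """
    img = image.astype(float)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    local_mean = np.zeros_like(img)
    # Naive sliding-window mean; dependency-free for the sketch.
    for i in range(h):
        for j in range(w):
            local_mean[i, j] = padded[i:i + k, j:j + k].mean()
    return (img > local_mean).astype(np.uint8)

def block_match_1bt(bt_ref, bt_target, max_shift=3):
    """Find the integer (dy, dx) shift minimising the number of
    differing bits (XOR count) between two 1BT bit planes."""
    h, w = bt_ref.shape
    m = max_shift
    best, best_cost = (0, 0), None
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            a = bt_ref[m:h - m, m:w - m]
            b = bt_target[m + dy:h - m + dy, m + dx:w - m + dx]
            cost = np.count_nonzero(a ^ b)  # XOR on bits: very cheap
            if best_cost is None or cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best
```

Registration then amounts to shifting one face image by the recovered offset before descriptor matching; the XOR cost on bit planes is what keeps the pre-registration low-complexity.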
Procedia PDF Downloads 367
20615 Learning Made Right: Building World Class Engineers in Tunisia
Authors: Zayen Chagra
Abstract:
Several educational institutions are experimenting with new approaches to learning in order to guarantee the success of their students. In Tunisia, since 2011, ESPRIT (Higher School of Engineering and Technology) has been running a new software engineering branch dedicated to mobile software engineering. The project has been a success since its creation: even before the first generation graduated, partnerships were formed with the biggest mobile technology manufacturers, and teams of students won several international awards. This session presents the experience in detail, from the idea stage to the current stage, in which the project counts 32 graduated engineers, 90 graduate students and 120 new participants.
Keywords: innovation, education, engineering education, mobile
Procedia PDF Downloads 426
20614 Single-Element Simulations of Wood Material in LS-DYNA
Authors: Ren Zuo Wang
Abstract:
In this paper, the non-linear wood material model in LS-DYNA is adopted in order to investigate the behavior of wood structures. Since experiments on ancient wood structures are difficult and inefficient to conduct, LS-DYNA can instead be used to simulate their nonlinear responses. LS-DYNA provides a material model called *MAT_WOOD (*MAT_143), which simulates the single-element response of wood subjected to tension and compression in the material directions parallel and perpendicular to the grain. Comparison of the LS-DYNA numerical simulation results with the exact solution demonstrates the accuracy and efficiency of the proposed simulation method.
Keywords: LS-DYNA, wood structure, single-element simulations, MAT_143
Procedia PDF Downloads 654
20613 The Development and Provision of a Knowledge Management Ecosystem, Optimized for Genomics
Authors: Matthew I. Bellgard
Abstract:
The field of bioinformatics has made, and continues to make, substantial progress and contributions to life science research and development. This paper contends that a systems approach integrates the bioinformatics activities of any project in a defined manner. Applying critical control points within this bioinformatics systems approach may be useful for identifying and evaluating points in a pathway where the risk of a specified activity can be reduced, monitored and quality enhanced.
Keywords: bioinformatics, food security, personalized medicine, systems approach
Procedia PDF Downloads 423
20612 Implementing Simulation-Based Education as a Transformative Learning Strategy in Nursing and Midwifery Curricula in Resource-Constrained Countries: The Case of Malawi
Authors: Patrick Mapulanga, Chisomo Petros Ganya
Abstract:
Purpose: This study aimed to investigate the integration of Simulation-Based Education (SBE) into nursing and midwifery curricula in resource-constrained countries using Malawi as a case study. The purpose of this study is to assess the extent to which SBE is mentioned in curricula and explore the associated content, assessment criteria, and guidelines. Methodology: The research methodology involved a desk study of nursing and midwifery curricula in Malawi. A comprehensive review was conducted to identify references to SBE by examining documents such as official curriculum guides, syllabi, and educational policies. The focus is on understanding the prevalence of SBE without delving into the specific content or assessment details. Findings: The findings revealed that SBE is indeed mentioned in the nursing and midwifery curricula in Malawi; however, there is a notable absence of detailed content and assessment criteria. While acknowledgement of SBE is a positive step, the lack of specific guidelines poses a challenge to its effective implementation and assessment within the educational framework. Conclusion: The study concludes that although the recognition of SBE in Malawian nursing and midwifery curricula signifies a potential openness to innovative learning strategies, the absence of detailed content and assessment criteria raises concerns about the practical application of SBE. Addressing this gap is crucial for harnessing the full transformative potential of SBE in resource-constrained environments. Areas for Further Research: Future research endeavours should focus on a more in-depth exploration of the content and assessment criteria related to SBE in nursing and midwifery curricula. Investigating faculty perspectives and students’ experiences with SBE could provide valuable insights into the challenges and opportunities associated with its implementation. 
Study Limitations and Implications: The study's limitations include its reliance on desk-based analysis, which limits the depth of understanding of SBE implementation. Despite this constraint, the findings underscore the need for curriculum developers, educators, and policymakers to collaboratively address the gaps in SBE integration and ensure a comprehensive and effective learning experience for nursing and midwifery students in resource-constrained countries.
Keywords: simulation based education, transformative learning, nursing and midwifery, curricula, Malawi
Procedia PDF Downloads 68
20611 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R
Authors: Pavel H. Llamocca, Victoria Lopez
Abstract:
The main objective of our work is the comparative analysis of environmental data from the Open Data bases of different governments, which means integrating data from a variety of sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is increasing accordingly. However, each government has its own procedures for publishing its data, which leads to a variety of data set formats, because there are no international standards specifying them. Due to this variety, we must build a data integration process that can handle all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the final step of the integration process. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. Other governments (such as Andalucia and Bilbao) have likewise published Open Data sets relative to the environment. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data.
Once the integration task is done, all the data from every government share the same format, and the analysis process can start in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). In order to open up our software tool, we also developed, as a second approach, an implementation in the R language, a mature open source technology. R is a powerful open source programming language that allows us to process and analyze huge amounts of data with high performance, and R libraries such as shiny support the building of a graphic interface. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set on environmental data in Spain, so that developers can build their own applications on top of it.
Keywords: open data, R language, data integration, environmental data
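As a rough illustration of the adapter-per-source integration idea described above (one parser per government, all emitting a common schema), here is a minimal Python sketch. The Madrid and Bilbao record layouts below are hypothetical, invented for the example; they are not the actual Open Data formats.

```python
import json

# Common schema every adapter must emit.
COMMON_FIELDS = ("city", "timestamp", "parameter", "value", "unit")

def from_madrid(row):
    """Hypothetical Madrid layout: semicolon-separated 'date;indicator;value'."""
    fecha, magnitud, valor = row.split(";")
    return {"city": "Madrid", "timestamp": fecha,
            "parameter": magnitud, "value": float(valor), "unit": "ug/m3"}

def from_bilbao(payload):
    """Hypothetical Bilbao layout: one JSON object per reading."""
    d = json.loads(payload)
    return {"city": "Bilbao", "timestamp": d["date"],
            "parameter": d["indicator"], "value": float(d["reading"]),
            "unit": d.get("unit", "ug/m3")}

def integrate(sources):
    """One adapter per source; every record is normalized to COMMON_FIELDS
    so downstream analysis sees a single homogeneous format."""
    out = []
    for parser, records in sources:
        for rec in records:
            norm = parser(rec)
            assert set(norm) == set(COMMON_FIELDS)  # schema check
            out.append(norm)
    return out
```

In the paper's setting, each adapter would be a Hadoop procedure rather than an in-process function, but the shape of the solution — per-government parsers converging on one schema — is the same.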
Procedia PDF Downloads 315
20610 A Note on the Fractal Dimension of Mandelbrot Set and Julia Sets in Misiurewicz Points
Authors: O. Boussoufi, K. Lamrini Uahabi, M. Atounti
Abstract:
The main purpose of this paper is to calculate the fractal dimension of some Julia Sets and of the Mandelbrot Set at Misiurewicz Points. Using Matlab to generate the Julia Set images corresponding to the Misiurewicz points, and using fractal analysis software, we were able to obtain different measures that characterize those fractals in texture and other features. We focus on the fractal dimension and on the error calculated by the software. When executing the regression equation, i.e. the log-log slope of the image, a box counting method is applied to the entire image, with the chosen settings available in the FracLac program. Finally, a comparison is made for each image corresponding to the area (boundary) where the Misiurewicz Point is located.
Keywords: box counting, FracLac, fractal dimension, Julia Sets, Mandelbrot Set, Misiurewicz Points
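The log-log box-counting estimate that FracLac automates can be sketched in a few lines: count the boxes of side s that contain at least one set pixel, then take the slope of log N(s) against log(1/s). This is a generic illustration of the method, not the FracLac implementation.

```python
import numpy as np

def box_counting_dimension(binary_img, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary (square) image as the
    slope of log N(s) versus log(1/s), where N(s) is the number of
    s x s boxes containing at least one set pixel."""
    n = binary_img.shape[0]
    counts = []
    for s in sizes:
        count = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if binary_img[i:i + s, j:j + s].any():
                    count += 1
        counts.append(count)
    logs = np.log(1.0 / np.asarray(sizes, dtype=float))
    logN = np.log(counts)
    slope, _ = np.polyfit(logs, logN, 1)  # regression line slope = dimension
    return slope
```

Sanity checks behave as expected: a filled square gives dimension 2, a straight line gives dimension 1, and a Julia Set boundary image would fall strictly between.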
Procedia PDF Downloads 216
20609 Measuring Delay Using Software Defined Networks: Limitations, Challenges, and Suggestions for Openflow
Authors: Ahmed Alutaibi, Ganti Sudhakar
Abstract:
Providing better Quality-of-Service (QoS) to end users has been a challenging problem for researchers and service providers. Building applications on best-effort network protocols hindered the adoption of guaranteed service parameters and, ultimately, of Quality of Service. The introduction of Software Defined Networking (SDN) opened the door for a paradigm shift towards more controlled, programmable, configurable network behavior. Openflow has been, and still is, the main implementation of the SDN vision. To facilitate better QoS for applications, the network must calculate and measure certain parameters, one of which is the delay between the two ends of a connection. Using the power of SDN and knowledge of application and network behavior, SDN networks can adjust to different conditions and specifications. In this paper, we use the capabilities of SDN to implement multiple algorithms that measure delay end-to-end, not only inside the SDN network. The results of applying the algorithms in an emulated environment show that we can obtain measurements close to the emulated delay. The results also show that the load on the network and controller differs depending on the algorithm, and that the transport-layer handshake algorithm performs best among the tested algorithms. Based on these results and the implementation, we show the limitations of Openflow and develop suggestions to address them.
Keywords: software defined networking, quality of service, delay measurement, openflow, mininet
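Of the algorithm families mentioned, the transport-layer handshake approach is the easiest to illustrate outside an OpenFlow controller: the time for connect() to return approximates one round trip, because it completes when the SYN-ACK arrives. A minimal host-side Python sketch of that idea (not the paper's controller-based implementation):

```python
import socket
import time

def handshake_rtt(host, port, timeout=2.0):
    """Approximate one round-trip time as the duration of the TCP
    three-way handshake: connect() returns once the SYN-ACK arrives,
    so the elapsed time covers SYN out + SYN-ACK back."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    t0 = time.perf_counter()
    try:
        s.connect((host, port))
    finally:
        elapsed = time.perf_counter() - t0
        s.close()
    return elapsed
```

An SDN variant would instead timestamp the handshake packets at the switches via the controller, which is where the per-algorithm controller-load differences reported in the abstract come from.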
Procedia PDF Downloads 165
20608 Autonomous Vehicle Detection and Classification in High Resolution Satellite Imagery
Authors: Ali J. Ghandour, Houssam A. Krayem, Abedelkarim A. Jezzini
Abstract:
High-resolution satellite images and remote sensing can provide global information quickly compared to traditional methods of data collection. At such high resolution, a road is no longer a thin line: objects such as cars and trees are easily identifiable. Automatic vehicle enumeration can be considered one of the most important applications in traffic management. In this paper, an autonomous vehicle detection and classification approach for highway environments is proposed. This approach consists mainly of three stages: (i) first, a set of preprocessing operations is applied, including soil, vegetation, and water suppression. (ii) Then, road network detection and delineation is implemented using a built-up area index, followed by several morphological operations. This step plays an important role in increasing the overall detection accuracy, since vehicle candidates are objects contained within the road networks only. (iii) Multi-level Otsu segmentation is implemented in the last stage, resulting in vehicle detection and classification, where detected vehicles are classified into cars and trucks. Accuracy assessment analysis is conducted over different study areas to show the great efficiency of the proposed method, especially in highway environments.
Keywords: remote sensing, object identification, vehicle and road extraction, vehicle and road features-based classification
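Stage (iii), multi-level Otsu, generalizes Otsu's criterion from one threshold to two by maximizing the between-class variance over three classes (background, cars, trucks). A brute-force NumPy sketch of the criterion over a coarse threshold grid — an illustration, not the paper's implementation:

```python
import numpy as np

def multi_otsu_two_thresholds(values, levels=64):
    """Brute-force two-threshold Otsu: pick (t1, t2) maximising the
    between-class variance over the three resulting classes."""
    v = np.asarray(values, dtype=float)
    grid = np.linspace(v.min(), v.max(), levels)  # candidate thresholds
    mu = v.mean()
    best, best_score = (grid[1], grid[2]), -1.0
    for i in range(1, levels - 1):
        for j in range(i + 1, levels - 1):
            t1, t2 = grid[i], grid[j]
            classes = [v[v <= t1], v[(v > t1) & (v <= t2)], v[v > t2]]
            if any(c.size == 0 for c in classes):
                continue
            # Between-class variance: sum of w_c * (mean_c - mu)^2.
            score = sum((c.size / v.size) * (c.mean() - mu) ** 2
                        for c in classes)
            if score > best_score:
                best_score, best = score, (t1, t2)
    return best
```

On pixel intensities inside the detected road mask, the two recovered thresholds separate pavement from vehicle pixels and, in turn, the two vehicle classes.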
Procedia PDF Downloads 232
20607 Indicators to Assess the Quality of Health Services
Authors: Muyatdinova Aigul, Aitkaliyeva Madina
Abstract:
The article deals with the evaluation of the quality of medical services on the basis of quality indicators. To this end, the features of the medical services market are identified first, since these features directly affect the evaluation process, which is multi-level and multi-stakeholder in nature. Unlike the market for ordinary goods, the assessment of medical services is not carried out by the market alone: it is complemented by continuous internal and external evaluation, including by experts and accrediting bodies. The article highlights the composition of indicators for a comprehensive evaluation.
Keywords: health care market, quality of health services, indicators of care quality
Procedia PDF Downloads 437
20606 Reception Class Practitioners' Understandings on the Role of Teaching Assistants, in Particular Supporting Children in Mathematics
Authors: Nursel Bektas
Abstract:
The purpose of this study is to investigate the roles of teaching assistants (TAs) working in reception classes through practitioners' perspectives. The study has two major purposes: firstly, to explore the general roles of TAs, and secondly, to identify their roles in supporting children in mathematics. A small-scale case study approach was adopted, and the research was carried out in two reception classes within a primary school in London. Qualitative data were gathered through observations and semi-structured interviews with four reception class practitioners, comprising two teachers and two TAs. The results show that TAs consider their role to be more like a teacher's, whereas classroom teachers do not corroborate this and generally believe that the role of TAs depends on their personal characteristics and skills. With regard to the general role of TAs, the study suggests that reception class TAs are deployed both at the classroom level, to provide academic support for children's learning and development, and at the school level, as support staff such as midday meal supervisors or assistants. In terms of the pedagogical roles of TAs, it was found that TAs have a strong teaching role in literacy development, with notable autonomy in conducting their own phonics sessions without teacher direction, but a negligible influence in numeracy/maths. In addition, the results show that the TA role is perceived to be quite limited in planning and assessment processes. Linked to this, all participants agree that all responsibility for the children's learning and development, planning, and assessment lies with the teacher. The data therefore suggest that TAs' roles in these areas depend on their own initiative.
Keywords: early years education, reception classes, roles, teaching assistants
Procedia PDF Downloads 186
20605 Evaluation of the Internal Quality for Pineapple Based on the Spectroscopy Approach and Neural Network
Authors: Nonlapun Meenil, Pisitpong Intarapong, Thitima Wongsheree, Pranchalee Samanpiboon
Abstract:
In Thailand, once pineapples are harvested, they must be classified into two classes based on their sweetness: sweet and unsweet. This paper studies and develops the assessment of the internal quality of pineapples using a low-cost compact spectroscopy sensor, following a spectroscopy approach with a Neural Network (NN). Batavia pineapples were utilized in the experiments, yielding 100 samples. The extracted juice of each sample was used to determine the Soluble Solid Content (SSC), which labels the samples into the sweet and unsweet classes. In terms of experimental equipment, the sensor cover was specifically designed to hold the sensor and light source so as to read the reflectance at a depth of five mm into the pineapple flesh. Using the spectroscopy sensor, visible and near-infrared reflectance (Vis-NIR) data were collected, and the NN was used to classify the pineapple classes. Before the classification step, the preprocessing methods of class balancing, data shuffling, and standardization were applied. The 510 nm and 900 nm reflectance values of the middle parts of the pineapples were used as the features of the NN. With a Sequential model and the ReLU activation function, 100% accuracy on the training set and 76.67% accuracy on the test set were achieved. In summary, a low-cost compact spectroscopy sensor achieved favorable results in classifying the pineapples into the two sweetness classes.
Keywords: neural network, pineapple, soluble solid content, spectroscopy
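The pipeline described (two standardized reflectance features, a ReLU network, a binary sweet/unsweet output) can be approximated with a tiny hand-rolled network. The sketch below uses plain NumPy gradient descent on synthetic, standardized data in place of the authors' Keras Sequential model and real Vis-NIR measurements; the cluster locations are invented for the example.

```python
import numpy as np

def train_sweetness_classifier(X, y, hidden=8, lr=0.5, epochs=2000, seed=0):
    """One-hidden-layer network (ReLU + sigmoid) trained with plain
    gradient descent on binary cross-entropy; a stand-in for a small
    Keras Sequential model. X: (n, 2) features, y: (n,) 0/1 labels."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.maximum(0.0, X @ W1 + b1)            # ReLU hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output
        g = (p - y[:, None]) / n                    # dBCE/dlogit
        gh = (g @ W2.T) * (h > 0)                   # back-prop through ReLU
        W2 -= lr * (h.T @ g); b2 -= lr * g.sum(axis=0)
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)
    def predict(Xq):
        hq = np.maximum(0.0, Xq @ W1 + b1)
        return ((1.0 / (1.0 + np.exp(-(hq @ W2 + b2)))) > 0.5).astype(int).ravel()
    return predict
```

With the real data, X would hold the standardized 510 nm and 900 nm reflectance values and y the SSC-derived sweet/unsweet labels.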
Procedia PDF Downloads 75
20604 Symo-syl: A Meta-Phonological Intervention to Support Italian Pre-Schoolers’ Emergent Literacy Skills
Authors: Tamara Bastianello, Rachele Ferrari, Marinella Majorano
Abstract:
The adoption of the syllabic approach in preschool programmes could support and reinforce meta-phonological awareness and literacy skills in children, and the introduction of a meta-phonological intervention in preschool could facilitate the transition to primary school, especially for children with learning fragilities. In the present contribution, we investigate the efficacy of the Simo-syl intervention in enhancing emergent literacy skills in children (especially reading). Simo-syl is a 12-week multimedia programme developed to improve children's language and communication skills and later literacy development in preschool. During the intervention, Simo-syl, an invented character, leads children through a series of meta-phonological games. Forty-six Italian preschool children (the Simo-syl group) participated in the programme; seventeen preschool children (the control group) did not. Children in the two groups were between 4;10 and 5;9 years old. They were assessed on their vocabulary, morpho-syntactical, meta-phonological, phonological, and phono-articulatory skills twice: 1) at the beginning of the last year of preschool, through standardised paper-based assessment tools, and 2) one week after the intervention. All children in the Simo-syl group took part in the meta-phonological programme based on the syllabic approach. The intervention lasted 12 weeks (three activities per week; week 1: activities focused on syllable blending and spelling and a first approach to the written code; weeks 2-11: activities focused on syllable recognition; week 12: activities focused on vowel recognition). Very few children (Simo-syl group = 21, control group = 9) were tested again (post-test) one week after the intervention. Before the intervention programme started, the Simo-syl and control groups had similar meta-phonological, phonological, and lexical skills (all ps > .05).
One week after the intervention, a significant difference emerged between the two groups in their meta-phonological skills (syllable blending, p = .029; syllable spelling, p = .032), in their vowel recognition ability (p = .032), and in their word reading skills (p = .05). An ANOVA confirmed the effect of group membership on developmental growth in the word reading task (F(1,28) = 6.83, p = .014, ηp² = .196). Taking part in the Simo-syl intervention thus has a positive effect on preschool children's ability to read.
Keywords: intervention programme, literacy skills, meta-phonological skills, syllabic approach
Procedia PDF Downloads 162
20603 Security in Cyberspace: A Comprehensive Review of COVID-19 Continued Effects on Security Threats and Solutions in 2021 and the Trajectory of Cybersecurity Going into 2022
Authors: Mojtaba Fayaz, Richard Hallal
Abstract:
This study examines the various types of dangers to which our virtual environment is vulnerable, including how it can be attacked and how to secure our data. The terrain of cyberspace is never completely safe, and Covid-19 has added to the confusion, necessitating periodic daily checks and evaluations. By operating from home, cybercriminals have been able to act with greater skill and undertake more conspicuous and sophisticated attacks while keeping a higher level of finesse. Different types of cyberattacks, such as operation-based, authentication-based, and software-based attacks, are constantly evolving, but research suggests that software-based threats, such as ransomware, are becoming more popular, with attacks expected to increase by 93 percent by 2020. The effectiveness of cyber frameworks has shifted dramatically as the pandemic has forced work and private life to become intertwined, destabilising security overall. The formats in which cybercrimes are carried out, as well as the types of cybercrimes that exist, such as phishing, identity theft, malware, and DDoS attacks, have created a new front of cyber protection for security analysis and personal safety. The overall strategy for 2022 will be the introduction of frameworks that address many of the issues associated with offsite working, together with education that provides better information about commercial software that does not offer the highest level of security for home users, allowing businesses to plan better security around their systems.
Keywords: cyber security, authentication, software, hardware, malware, COVID-19, threat actors, awareness, home users, confidentiality, integrity, availability, attacks
Procedia PDF Downloads 116
20602 Modeling of Glycine Transporters in Mammalian Using the Probability Approach
Authors: K. S. Zaytsev, Y. R. Nartsissov
Abstract:
Glycine is one of the key inhibitory neurotransmitters in the Central Nervous System (CNS), and glycinergic transmission is highly dependent on appropriate glycine reuptake from the synaptic cleft. Glycine transporters (GlyT) of types 1 and 2 are the enzymes providing glycine transport back into neuronal and glial cells, along with Na⁺ and Cl⁻ co-transport. The distribution and stoichiometry of GlyT1 and GlyT2 differ in detail, and GlyT2 is of more interest for this research as it takes glycine back up into neurons, whereas GlyT1 is located in glial cells. During GlyT2 activity, the translocation of the amino acid is accompanied by the consecutive binding of one chloride and three sodium ions (two sodium ions for GlyT1). In the present study, we developed a computer simulator of GlyT2 and GlyT1 activity, based on known experimental data, for the quantitative estimation of membrane glycine transport. The functioning of a single protein was described using the probability approach, in which each enzyme state is considered separately. The resulting scheme of transporter functioning, realized as a sequence of elementary steps, makes it possible to account for each substrate association and dissociation event. Computer experiments using up-to-date kinetic parameters yielded the number of translocated glycine molecules and Na⁺ and Cl⁻ ions per time period. The flexibility of the developed software makes it possible to evaluate the glycine reuptake pattern over time under different internal characteristics of the enzyme's conformational transitions. We investigated the behavior of the system over a wide range of the equilibrium constant (from 0.2 to 100), which has not been determined experimentally. A significant influence of the equilibrium constant on the glycine transfer process is shown in the range from 0.2 to 10; outside the specified range, environmental conditions such as ion and glycine concentrations become decisive.
Keywords: glycine, inhibitory neurotransmitters, probability approach, single protein functioning
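The probability approach described — each enzyme state treated separately, with every association or dissociation an elementary stochastic step — can be illustrated with a toy Monte Carlo of one GlyT2-like cycle (Cl⁻, then three Na⁺, then glycine, then translocation). The per-tick probabilities below are purely illustrative placeholders, not the experimental kinetic parameters used in the paper.

```python
import random

# Hypothetical sequential-binding scheme for a GlyT2-like transporter:
# empty -> +Cl -> +Na (x3) -> +Gly -> translocation -> empty.
# Per-tick firing probabilities are illustrative only.
STEP_PROB = {"Cl": 0.6, "Na1": 0.5, "Na2": 0.5, "Na3": 0.5,
             "Gly": 0.4, "translocate": 0.3}
ORDER = ["Cl", "Na1", "Na2", "Na3", "Gly", "translocate"]

def simulate(ticks, seed=1):
    """Simulate one transporter protein step by step; return the number
    of completed glycine translocations in the given number of ticks."""
    rng = random.Random(seed)
    state = 0       # index into ORDER: the next elementary event
    carried = 0
    for _ in range(ticks):
        if rng.random() < STEP_PROB[ORDER[state]]:
            state += 1
            if state == len(ORDER):   # full cycle: one glycine moved
                carried += 1
                state = 0
    return carried
```

With these placeholder rates, the mean cycle takes about 13.5 ticks (the sum of 1/p over the steps), so roughly 740 translocations are expected per 10,000 ticks; varying a step's probability is the toy analogue of varying the equilibrium constant studied in the paper.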
Procedia PDF Downloads 119
20601 Rehabilitation of Orthotropic Steel Deck Bridges Using a Modified Ortho-Composite Deck System
Authors: Mozhdeh Shirinzadeh, Richard Stroetmann
Abstract:
An orthotropic steel deck bridge consists of a deck plate, longitudinal stiffeners under the deck plate, cross beams, and the main longitudinal girders. Due to their several advantages, Orthotropic Steel Deck (OSD) systems have been utilized in many bridges worldwide. The significant feature of this structural system is its high load-bearing capacity combined with a relatively low dead weight. In addition, cost efficiency and the ability of rapid field erection have made the orthotropic steel deck a popular type of bridge worldwide. However, OSD bridges are highly susceptible to fatigue damage: a large number of welded joints can be regarded as the main weakness of this system. This problem is particularly evident in bridges built before 1994, when fatigue design criteria had not yet been introduced into bridge design codes. Recently, an Orthotropic-Composite Slab (OCS) for road bridges has been experimentally and numerically evaluated and developed at Technische Universität Dresden as part of AIF-FOSTA research project P1265. The results of the project have provided a solid foundation for the design and analysis of orthotropic-composite decks with dowel strips as a durable alternative to conventional steel or reinforced concrete decks. Building on the achievements of that project, the application of a modified ortho-composite deck to an existing typical OSD bridge is investigated here. Composite action is obtained by using rows of dowel strips in a clothoid (CL) shape. The effect of the proposed modification approach is assessed with regard to the Eurocode criteria for the different fatigue detail categories of an OSD bridge. Moreover, a numerical parametric study is carried out utilizing finite element software to determine the impact of different variables, such as the size and arrangement of dowel strips, the application of transverse or longitudinal rows of dowel strips, and local wheel loads.
For verification of the simulation technique, experimental results from a segment of an OCS deck tested in project P1265 are used. Fatigue assessment is performed based on the latest draft of Eurocode 1993-2 (2024) for the most probable detail categories (hot spots) reported in previous statistical studies. An analytical comparison is then provided between the typical orthotropic steel deck and the modified ortho-composite deck bridge in terms of fatigue and durability. The load-bearing capacity of the bridge, the critical deflections, and the composite behavior are also evaluated and compared. The results give a comprehensive overview of the efficiency of the rehabilitation method, considering the required design service life of the bridge. Moreover, the proposed approach is assessed with regard to the construction method, details, and practical aspects, as well as from an economic point of view.
Keywords: composite action, fatigue, finite element method, steel deck, bridge
Procedia PDF Downloads 84
20600 Development of a Geomechanical Risk Assessment Model for Underground Openings
Authors: Ali Mortazavi
Abstract:
The main objective of this research project is to delve into the multitude of geomechanical risks associated with the various mining methods employed in the underground mining industry. Controlling the geotechnical design parameters and operational factors affecting the selection of a suitable mining technique for a given underground mining condition is considered from a risk assessment point of view. Important geomechanical challenges are investigated as appropriate and relevant to the commonly used underground mining methods. Given the complicated nature of rock mass in-situ, and the complicated boundary conditions and operational complexities associated with the various underground mining methods, the selection of a safe and economic mining operation is of paramount significance. Rock failure at varying scales within underground mining openings is always a threat to mining operations and causes human and capital losses worldwide. Geotechnical design is a major component of all underground mine designs and essentially governs the safety of an underground mine. Given the uncertainties that exist in rock characterization prior to mine development, there are always risks associated with inappropriate design as a function of mining conditions and the selected mining method. Uncertainty often results from the inherent variability of rock masses, which in turn is a function of both the geological materials and the rock mass in-situ conditions. The focus of this research is on developing a methodology that enables a geomechanical risk assessment of given underground mining conditions. The outcome of this research is a geotechnical risk analysis algorithm, which can be used as an aid in selecting the appropriate mining method as a function of mine design parameters (e.g., rock in-situ properties, design method, and governing boundary conditions such as in-situ stress and groundwater).
Keywords: geomechanical risk assessment, rock mechanics, underground mining, rock engineering
Procedia PDF Downloads 145
20599 Use of a Business Intelligence Software for Interactive Visualization of Data on the Swiss Elite Sports System
Authors: Corinne Zurmuehle, Andreas Christoph Weber
Abstract:
In 2019, the Swiss Federal Institute of Sport Magglingen (SFISM) conducted a mixed-methods study on the Swiss elite sports system, which yielded a large quantity of research data. In a quantitative online survey, 1151 elite sports athletes, 542 coaches, and 102 Performance Directors of national sports federations (NF) submitted their perceptions of the national support measures of the Swiss elite sports system. These data provide an essential basis for the further development of the Swiss elite sports system. The results were published in a report that presents them divided into 40 Olympic summer and 14 winter sports (Olympic classification). The authors of this paper assume that, in practice, this division is too unspecific to assess where further measures would be needed. The aim of this paper is to find appropriate parameters for data visualization in order to identify disparities in sports promotion, allowing an assessment of where further interventions by Swiss Olympic (the NF umbrella organization) are required. Method: First, the variable 'salary earned from sport' was defined as the variable for measuring the impact of elite sports promotion. This variable was chosen because it is an important indicator of the professionalization of elite athletes and therefore reflects the national-level sports promotion measures applied by Swiss Olympic. Afterwards, the correlation between salary and the Olympic classification [a] was tested by calculating the Eta coefficient. To estimate appropriate parameters for data visualization, the correlation between salary and four further parameters was analyzed by calculating the Eta coefficient: [a] sport; [b] the prioritization (from 1 to 5) of the sports by Swiss Olympic; [c] gender; [d] employment level in sport. Results & Discussion: The analyses reveal a very small correlation between salary and Olympic classification (η² = .011, p = .005).
Gender demonstrates an even smaller correlation (ɳ² = .006, p = .014). The parameter prioritization correlated with a small effect (ɳ² = .017, p = .001), as did employment level (ɳ² = .028, p < .001). The highest correlation was found for the parameter sport, with a moderate effect (ɳ² = .075, p = .047). The analyses show that the disparities in sports promotion cannot be determined by a single parameter but are presumably explained by a combination of several parameters. We argue that the possibility of combining parameters for data visualization should be enabled when the analysis is provided to Swiss Olympic for further strategic decision-making. However, including multiple parameters massively multiplies the number of graphs and is therefore not suitable for practical use. Therefore, we suggest applying interactive dashboards for data visualization using Business Intelligence software. Practical & Theoretical Contribution: This contribution provides the first attempt to use Business Intelligence software for strategic decision-making in national-level sports regarding the prioritization of national resources for sports and athletes. It allows setting specific parameters with a significant effect as filters. By using filters, parameters can be combined, compared against each other, and set individually for each strategic decision.Keywords: data visualization, business intelligence, Swiss elite sports system, strategic decision-making
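The Eta coefficient analysis described above reduces to the classic correlation ratio: the share of salary variance explained by membership in a category such as prioritization level. A minimal sketch of that computation; all salary figures below are invented for illustration, not the study's data:

```python
def eta_squared(groups):
    """Correlation ratio (eta squared): the share of variance in a numeric
    outcome (here: salary) explained by a categorical grouping variable."""
    all_values = [v for g in groups for v in g]
    grand_mean = sum(all_values) / len(all_values)
    ss_total = sum((v - grand_mean) ** 2 for v in all_values)
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    return ss_between / ss_total

# Hypothetical salaries grouped by two prioritization levels
salaries_by_priority = [
    [20000, 25000, 30000],
    [35000, 40000, 45000],
]
print(round(eta_squared(salaries_by_priority), 3))  # → 0.771
```

Values near 0 indicate that the grouping explains little of the salary variance, matching the small effects reported above.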
Procedia PDF Downloads 90
20598 Seismic Fragility Curves Methodologies for Bridges: A Review
Authors: Amirmozafar Benshams, Khatere Kashmari, Farzad Hatami, Mesbah Saybani
Abstract:
As a part of the transportation network, bridges are among the most vulnerable structures. In order to investigate the vulnerability and seismic performance of bridges, identifying the damage states a bridge may reach is important. Fragility curves provide important data about the damage states and performance of bridges under earthquakes. Developing vulnerability information in the form of fragility curves is a widely practiced approach when that information must account for the multitude of uncertain sources involved. This paper presents fragility curve methodologies for bridges and reviews the practice and applications relating to the seismic fragility assessment of bridges.Keywords: fragility curve, bridge, uncertainty, NLTHA, IDA
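Fragility curves in the reviewed methodologies are commonly parameterized as a lognormal CDF of the intensity measure. A minimal sketch under that assumption; in practice the median capacity and dispersion β would come from NLTHA or IDA, and the values below are purely illustrative:

```python
import math

def fragility(im, median, beta):
    """Lognormal fragility curve: probability of reaching or exceeding a
    damage state at intensity measure `im` (e.g. PGA in g). `median` is
    the median capacity and `beta` the lognormal dispersion."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

# By construction, the exceedance probability at the median intensity is 50%
print(fragility(0.4, median=0.4, beta=0.6))  # → 0.5
```

Larger β flattens the curve, which is how the uncertainty from the multitude of sources shows up in the final fragility estimate.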
Procedia PDF Downloads 282
20597 The Trigger-DAQ System in the Mu2e Experiment
Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella
Abstract:
The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating, neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browsers. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data is then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux coming from the cosmic ray veto system (CRV).Keywords: trigger, daq, mu2e, Fermilab
Procedia PDF Downloads 155
20596 A Soft Computing Approach Monitoring of Heavy Metals in Soil and Vegetables in the Republic of Macedonia
Authors: Vesna Karapetkovska Hristova, M. Ayaz Ahmad, Julijana Tomovska, Biljana Bogdanova Popov, Blagojce Najdovski
Abstract:
The average total concentrations of heavy metals (cadmium [Cd], copper [Cu], nickel [Ni], lead [Pb], and zinc [Zn]) were analyzed in soil and vegetable samples collected from different regions of Macedonia during the years 2010-2012. Basic soil properties such as pH, organic matter, and clay content were also included in the study. The average concentrations of Cd, Cu, Ni, Pb, and Zn in the A horizon (0-30 cm) of agricultural soils were, respectively: 0.25, 5.3, 6.9, 15.2, and 26.3 mg kg-1 of soil. We have found that a neural network model can be considered a tool for prediction and spatial analysis of the processes controlling metal transfer within the soil and vegetables. The predictive ability of such models is well over 80%, compared to 20% for typical regression models. A radial basis function network achieves good prediction accuracy and captures the correlations between soil properties and metal content in vegetables much better than the back-propagation method. Neural networks / soft computing can support decision-making processes at different levels, including agro-ecology, to improve crop management based on monitoring data and risk assessment of metal transfer from soils to vegetables.Keywords: soft computing approach, total concentrations, heavy metals, agricultural soils
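As a rough illustration of the radial-basis-function idea, a Gaussian-kernel weighted average (the Nadaraya-Watson form) can stand in for a trained RBF network mapping a soil property to metal content. The soil pH and cadmium figures below are invented for illustration, not the study's data:

```python
import math

def rbf_predict(x, centers, targets, width=1.0):
    """Radial basis function estimate: a Gaussian-kernel weighted average
    of training targets, a minimal stand-in for an RBF network mapping
    soil properties to metal content in vegetables."""
    weights = [math.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centers]
    return sum(w * t for w, t in zip(weights, targets)) / sum(weights)

# Hypothetical training data: soil pH (x) vs. Cd uptake in vegetables (y)
ph = [5.0, 6.0, 7.0, 8.0]
cd = [0.30, 0.22, 0.15, 0.10]
print(round(rbf_predict(6.0, ph, cd, width=0.5), 3))  # → 0.221
```

The locality of the Gaussian kernels is one reason such networks can outperform global fits like ordinary regression on this kind of data.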
Procedia PDF Downloads 368
20595 Case-Based Reasoning for Modelling Random Variables in the Reliability Assessment of Existing Structures
Authors: Francesca Marsili
Abstract:
The reliability assessment of existing structures with probabilistic methods is becoming an increasingly important and frequent engineering task. However, probabilistic reliability methods are based on an exhaustive knowledge of the stochastic modeling of the variables involved in the assessment; at the moment, standards for the modeling of variables are absent, representing an obstacle to the dissemination of probabilistic methods. The framework according to which probability distribution functions (PDFs) are established is Bayesian statistics, which uses Bayes' Theorem: a prior PDF for the considered parameter is established based on information derived from the design stage and on qualitative judgments based on the engineer's past experience; then, the prior model is updated with the results of investigations carried out on the considered structure, such as material testing and the determination of action and structural properties. The application of Bayesian statistics raises two different kinds of problems: 1. The results of the updating depend on the engineer's previous experience; 2. The updating of the prior PDF can be performed only if the structure has been tested and quantitative data that can be statistically manipulated have been collected; performing tests is always an expensive and time-consuming operation; furthermore, if the considered structure is an ancient building, destructive tests could compromise its cultural value and should therefore be avoided. In order to solve these problems, an interesting research path is to investigate Artificial Intelligence (AI) techniques that can be useful for automating the modeling of variables and for updating material parameters without performing destructive tests. Among these, one that deserves particular attention in relation to the object of this study is Case-Based Reasoning (CBR). 
In this application, cases are represented by existing buildings where material tests have already been carried out and updated PDFs for the material mechanical parameters have been computed through a Bayesian analysis. Each case is then composed of a qualitative description of the material under assessment and the posterior PDFs that describe its material properties. The problem to be solved is the definition of PDFs for material parameters involved in the reliability assessment of the considered structure. A CBR system is a good candidate for automating the modelling of variables because: 1. Engineers already draw an estimate of the material properties based on the experience collected during the assessment of similar structures, or based on similar cases collected in the literature or in databases; 2. Material tests carried out on structures can easily be collected from laboratory databases or from the literature; 3. The system will provide the user with a reliable probabilistic description of the variables involved in the assessment, which will also serve as a tool in support of the engineer’s qualitative judgments. Automated modeling of variables can help spread the probabilistic reliability assessment of existing buildings in common engineering practice and help target the best interventions and further tests on the structure; CBR represents a technique which may help to achieve this.Keywords: reliability assessment of existing buildings, Bayesian analysis, case-based reasoning, historical structures
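The Bayesian updating step that produces the posterior PDFs stored in each case can be sketched with the conjugate Normal model (known measurement variance), a simplification of whatever likelihood an actual assessment would use. The prior strength, variances, and core-test values below are hypothetical:

```python
def update_normal(prior_mean, prior_var, data, data_var):
    """Conjugate Bayesian update of a Normal prior for a material parameter
    (known measurement variance): the posterior mean is a precision-weighted
    average of the prior mean and the sample mean of the test results."""
    n = len(data)
    sample_mean = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + n * sample_mean / data_var)
    return post_mean, post_var

# Hypothetical case: prior concrete strength 30 MPa (variance 16), three cores tested
mean, var = update_normal(30.0, 16.0, [34.0, 36.0, 35.0], 4.0)
print(round(mean, 2), round(var, 2))  # → 34.62 1.23
```

Note how a few test results both shift the prior mean toward the data and sharply reduce the variance, which is exactly the information a retrieved CBR case would carry.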
Procedia PDF Downloads 337
20594 Economic Assessment of CO2-Based Methane, Methanol and Polyoxymethylene Production
Authors: Wieland Hoppe, Nadine Wachter, Stefan Bringezu
Abstract:
Carbon dioxide (CO2) utilization might be a promising way to substitute fossil raw materials like coal, oil, or natural gas as the carbon source of chemical production. While first life cycle assessments indicate a positive environmental performance of CO2-based process routes, the commercialization of CO2 has so far been limited by several economic obstacles. We therefore analyzed the economic performance of three CO2-based chemicals, methane and methanol as basic chemicals and polyoxymethylene as a polymer, on a cradle-to-gate basis. Our approach is oriented towards life cycle costing. The focus lies on the cost drivers of CO2-based technologies and on options to stimulate a CO2-based economy by changing regulative factors. In this way, we analyze various modes of operation and give an outlook on the potentially cost-effective development in the next decades. Biogas, waste gases of a cement plant, and flue gases of a waste incineration plant are considered as CO2 sources. The energy needed to convert CO2 into hydrocarbons via electrolysis is assumed to be supplied by wind power, which is increasingly available in Germany. Economic data originate from both industrial processes and process simulations. The results indicate that CO2-based production technologies are not competitive with conventional production methods under present conditions. This is mainly due to high electricity generation costs and regulative factors like the German Renewable Energy Act (EEG). While the decrease in production costs of CO2-based chemicals might be limited in the next decades, a modification of relevant regulative factors could promote an earlier commercialization.Keywords: carbon capture and utilization (CCU), economic assessment, life cycle costing (LCC), power-to-X
Procedia PDF Downloads 291
20593 A Common Automated Programming Platform for Knowledge Based Software Engineering
Authors: Ivan Stanev, Maria Koleva
Abstract:
A common platform for automated programming (CPAP) is defined in detail. Two versions of CPAP are described: a Cloud-based version (including the set of components for classic programming and the set of components for combined programming) and a KBASE-based version (including the set of components for automated programming and the set of components for ontology programming). Four KBASE products (a module for automated programming of robots, an intelligent product manual, an intelligent document display, and an intelligent form generator) are analyzed, and CPAP's contributions to automated programming are presented.Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture
Procedia PDF Downloads 344
20592 Edge Enhancement Visual Methodology for Fat Amount and Distribution Assessment in Dry-Cured Ham Slices
Authors: Silvia Grassi, Stefano Schiavon, Ernestina Casiraghi, Cristina Alamprese
Abstract:
Dry-cured ham is an uncooked meat product particularly appreciated for its peculiar sensory traits, among which the lipid component plays a key role in defining quality and, consequently, consumers’ acceptability. Usually, fat content and distribution are chemically determined by expensive, time-consuming, and destructive analyses. Moreover, different sensory techniques are applied to assess product conformity to desired standards. In this context, visual systems are getting a foothold in the meat market, envisioning a more reliable and time-saving assessment of food quality traits. The present work aims at developing a simple but systematic and objective visual methodology to assess the fat amount of dry-cured ham slices, in terms of total, intermuscular and intramuscular fractions. To this aim, 160 slices from 80 PDO dry-cured hams were evaluated by digital image analysis and Soxhlet extraction. RGB images were captured by a flatbed scanner, converted to grey-scale images, and segmented based on intensity histograms as well as on a multi-stage algorithm aimed at edge enhancement. The latter was performed applying the Canny algorithm, which consists of image noise reduction, calculation of the intensity gradient for each image, spurious response removal, actual thresholding on corrected images, and confirmation of strong edge boundaries. The approach allowed for the automatic calculation of the total, intermuscular and intramuscular fat fractions as percentages of the total slice area. Linear regression models were run to estimate the relationships between the image analysis results and the chemical data, thus allowing for the prediction of the total, intermuscular and intramuscular fat content from the dry-cured ham images. The goodness of fit of the obtained models was confirmed in terms of coefficient of determination (R²), hypothesis testing and pattern of residuals. 
Good regression models were found, with R² values of 0.73, 0.82, and 0.73 for the total fat, the sum of intermuscular and intramuscular fat, and the intermuscular fraction, respectively. In conclusion, the edge enhancement visual procedure led to good fat segmentation, making the visual quantification of the different fat fractions in dry-cured ham slices simple, accurate, and precise. The presented image analysis approach steers towards the development of instruments that can overcome destructive, tedious, and time-consuming chemical determinations. As future perspectives, the results of the proposed image analysis methodology will be compared with those of sensory tests in order to develop a fast grading method for dry-cured hams based on fat distribution. The system will thus be able not only to predict the actual fat content but also to reflect the visual appearance of samples as perceived by consumers.Keywords: dry-cured ham, edge detection algorithm, fat content, image analysis
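Once the slice is segmented, the fat fractions are simple area ratios. As an illustrative stand-in for the histogram/Canny pipeline (which this sketch does not reproduce), a bare intensity threshold on a toy grey-scale image shows how a fat percentage is derived from pixel counts; the image and threshold are invented:

```python
def fat_fraction(grey_image, threshold=200):
    """Rough stand-in for the segmentation step: fat pixels in a grey-scale
    ham-slice image are brighter than muscle, so the fat fraction is the
    share of pixels at or above an intensity threshold."""
    pixels = [p for row in grey_image for p in row]
    fat = sum(1 for p in pixels if p >= threshold)
    return 100.0 * fat / len(pixels)

# Hypothetical 4x4 slice: four bright (fat) pixels among darker muscle pixels
slice_img = [
    [230, 240, 90, 80],
    [220, 210, 85, 70],
    [100, 95, 60, 50],
    [90, 80, 55, 45],
]
print(fat_fraction(slice_img))  # → 25.0
```

The paper's edge-enhancement step essentially makes this thresholding robust by confirming strong fat/muscle boundaries before the areas are counted.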
Procedia PDF Downloads 176
20591 Assessing Socio-economic Impacts of Arsenic and Iron Contamination in Groundwater: Feasibility of Rainwater Harvesting in Amdanga Block, North 24 Parganas, West Bengal, India
Authors: Rajkumar Ghosh
Abstract:
The present study focuses on conducting a socio-economic assessment of groundwater contamination by arsenic and iron and explores the feasibility of rainwater harvesting (RWH) as an alternative water source in the Amdanga Block of North 24 Parganas, West Bengal, India. The region is plagued by severe groundwater contamination, primarily due to excessive concentrations of arsenic and iron, which pose significant health risks to the local population. The study utilizes a mixed-methods approach, combining quantitative analysis of water samples collected from different locations within the Amdanga Block and socio-economic surveys conducted among the affected communities. The water samples are analyzed for arsenic and iron levels, while the surveys gather data on water usage patterns, health conditions, and socio-economic factors. The results reveal alarmingly high levels of arsenic and iron contamination in the groundwater, surpassing the permissible limits of the World Health Organization (WHO) and the Indian government. This contamination significantly impacts the health and well-being of the local population, leading to a range of health issues such as skin lesions, respiratory disorders, and gastrointestinal problems. Furthermore, the socio-economic assessment highlights the vulnerability of the affected communities due to limited access to safe drinking water. The findings reveal the adverse socio-economic implications, including increased medical expenditures, reduced productivity, and compromised educational opportunities. To address these challenges, the study explores the feasibility of rainwater harvesting as an alternative source of clean water. RWH systems have the potential to mitigate groundwater contamination by providing a sustainable and independent water supply. 
The assessment includes evaluating the rainwater availability, analyzing the infrastructure requirements, and estimating the potential benefits and challenges associated with RWH implementation in the study area. The findings of this study contribute to a comprehensive understanding of the socio-economic impact of groundwater contamination by arsenic and iron, emphasizing the urgency to address this critical issue in the Amdanga Block. The feasibility assessment of rainwater harvesting serves as a practical solution to ensure a safe and sustainable water supply, reducing the dependency on contaminated groundwater sources. The study's results can inform policymakers, researchers, and local stakeholders in implementing effective mitigation measures and promoting the adoption of rainwater harvesting as a viable alternative in similar arsenic and iron-contaminated regions.Keywords: contamination, rainwater harvesting, groundwater, sustainable water supply
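The rainwater availability evaluation mentioned above typically starts from the standard first-cut yield formula: harvestable volume = catchment area × rainfall depth × runoff coefficient. A minimal sketch; the roof area, annual rainfall, and runoff coefficient below are assumed values for illustration, not figures from the study:

```python
def harvest_potential(roof_area_m2, annual_rainfall_mm, runoff_coeff=0.8):
    """First-cut estimate of annual rainwater harvesting yield in litres:
    1 mm of rain on 1 m2 of catchment is 1 litre, discounted by the
    runoff coefficient for losses (evaporation, first flush, spillage)."""
    return roof_area_m2 * annual_rainfall_mm * runoff_coeff

# Hypothetical household: 100 m2 roof, ~1750 mm/year of rain
print(harvest_potential(100, 1750))  # → 140000.0 litres/year
```

Comparing such yields against household demand is the core of the feasibility question the study raises for contaminated-groundwater regions.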
Procedia PDF Downloads 99
20590 The Evolution of National Technological Capability Roles From the Perspective of Researcher’s Transfer: A Case Study of Artificial Intelligence
Authors: Yating Yang, Xue Zhang, Chengli Zhao
Abstract:
Technology capability refers to the comprehensive ability that influences all factors of technological development. Among them, researchers’ resources serve as the foundation and driving force for technology capability, representing a significant manifestation of a country/region's technological capability. Therefore, the cross-border transfer behavior of researchers to some extent reflects changes in technological capability between countries/regions, providing a unique research perspective for technological capability assessment. This paper proposes a technological capability assessment model based on personnel transfer networks, which consists of a researchers' transfer network model and a country/region role evolution model. It evaluates the changes in a country/region's technological capability roles from the perspective of researcher transfers and conducts an analysis using artificial intelligence as a case study based on literature data. The study reveals that the United States, China, and the European Union are core nodes, and identifies the role evolution characteristics of several major countries/regions.Keywords: transfer network, technological capability assessment, central-peripheral structure, role evolution
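The core-periphery reading of the transfer network can be illustrated with a weighted degree count over researcher moves: countries touching many transfers sit at the core. The edges below are invented for illustration, not the study's literature-derived data:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Weighted degree for a researcher-transfer network: nodes are
    countries/regions, an edge (src, dst, n) means n researchers moved
    from src to dst. A high combined degree marks a 'core' node."""
    deg = defaultdict(int)
    for src, dst, n in edges:
        deg[src] += n
        deg[dst] += n
    return dict(deg)

# Hypothetical AI-researcher moves between regions
moves = [("US", "CN", 120), ("CN", "US", 150), ("EU", "US", 80)]
print(degree_centrality(moves)["US"])  # → 350
```

Tracking how these degrees shift over time windows is one simple way to operationalize the "role evolution" the paper describes.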
Procedia PDF Downloads 93
20589 A Review of Benefit-Risk Assessment over the Product Lifecycle
Authors: M. Miljkovic, A. Urakpo, M. Simic-Koumoutsaris
Abstract:
Benefit-risk assessment (BRA) is a valuable tool applied at multiple stages of a medicine's lifecycle, and this assessment can be conducted in a variety of ways. The aim was to summarize current BRA methods used during approval decisions and in post-approval settings, and to outline possible future directions. Relevant reviews, recommendations, and guidelines published in the medical literature and by regulatory agencies over the past five years have been examined. BRA implies the review of two dimensions: the dimension of benefits (determined mainly by therapeutic efficacy) and the dimension of risks (comprising the safety profile of a drug). Regulators, industry, and academia have developed various approaches, ranging from descriptive textual (qualitative) to decision-analytic (quantitative) models, to facilitate the BRA of medicines during the product lifecycle (from Phase I trials, to the authorization procedure, post-marketing surveillance, and health technology assessment for inclusion in public formularies). These approaches can be classified into the following categories: stepwise structured approaches (frameworks); measures for benefits and risks that are usually endpoint-specific (metrics); simulation techniques and meta-analysis (estimation techniques); and utility survey techniques to elicit stakeholders’ preferences (utilities). All these approaches share two common goals, to assist the analysis and to improve the communication of decisions, but each is subject to its own specific strengths and limitations. Before using any method, its utility, complexity, the extent to which it is established, and the ease of interpretation of its results should be considered. Despite widespread and long-time use, BRA is subject to debate, suffers from a number of limitations, and is currently still under development. 
The use of formal, systematic structured approaches to BRA for regulatory decision-making and quantitative methods to support BRA during the product lifecycle is a standard practice in medicine that is subject to continuous improvement and modernization, not only in methodology but also in cooperation between organizations.Keywords: benefit-risk assessment, benefit-risk profile, product lifecycle, quantitative methods, structured approaches
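A decision-analytic (quantitative) BRA model in its simplest form is a weighted sum over benefit and risk criteria, an MCDA-style sketch rather than any specific regulatory method; the weights and normalized scores below are hypothetical:

```python
def benefit_risk_score(benefits, risks):
    """Minimal multi-criteria benefit-risk sketch: each criterion is a
    (weight, score) pair with scores normalized to 0-1; the overall score
    is the weighted benefit sum minus the weighted risk sum."""
    b = sum(w * s for w, s in benefits)
    r = sum(w * s for w, s in risks)
    return b - r

# Hypothetical drug: efficacy and quality-of-life benefits vs. two safety risks
benefits = [(0.6, 0.8), (0.4, 0.7)]   # (weight, score)
risks = [(0.7, 0.3), (0.3, 0.5)]
print(round(benefit_risk_score(benefits, risks), 2))  # → 0.4
```

The weights are exactly where the stakeholder-preference (utility) techniques mentioned above enter: elicited preferences set how much each endpoint counts.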
Procedia PDF Downloads 155
20588 A Drawing Software for Designers: AutoCAD
Authors: Mayar Almasri, Rosa Helmi, Rayana Enany
Abstract:
This report describes the features of the AutoCAD software released by Autodesk. It explains how the program makes work easier for engineers and designers and reduces the time and effort they spend in AutoCAD. Moreover, it highlights how AutoCAD works, how some of its commands, such as shortcuts, make it easy to use, and which features make it accurate in measurements. The results of the report show that most users of this program are designers and engineers, but few other people know about it; those who do find it easy to use. They prefer to use it because it is easy to use and because the shortcut commands save them a lot of time. The features were rated highly; a small percentage of respondents made suggestions for improving AutoCAD's Aperture feature, while the largest share felt that the program did not need improvement and was good as is.Keywords: artificial intelligence, design, planning, commands, autodesk, dimensions
Procedia PDF Downloads 131
20587 Interaction of Low-Impact Development Techniques and Urban River Flooding on the Zoning – Case Study Qomroud
Authors: Mohammad Reza Kavianpour, Arsalan Behzadifard Pour, Ali Aghazadeh Cloudy, Abolfazl Moqimi
Abstract:
In recent decades, with the growth of the urban population and the development of cities, the amount of impermeable surface has increased. This causes an increase in urban runoff. This increase, especially in cities with an urban river, raises the possibility of urban flooding caused by the interaction of river flooding and urban runoff. In this research, SWMM software was used to model low-impact development methods and practices that seek to reduce the runoff discharged to the Qomroud River, and ArcGIS and HEC-RAS software were used to delineate the resulting flood zone.Keywords: flood management, SWMM, runoff, flood zone
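The link between impervious surface and runoff that motivates the study is captured by the rational method, the classic first-cut estimate behind detailed models like SWMM; the runoff coefficient C, rainfall intensity i, and catchment area A below are assumed values for illustration:

```python
def peak_runoff(c, intensity_mm_per_hr, area_ha):
    """Rational method Q = C*i*A: peak runoff in m3/s for a runoff
    coefficient C (dimensionless, rising with imperviousness), rainfall
    intensity in mm/hr, and area in hectares (360 is the unit factor)."""
    return c * intensity_mm_per_hr * area_ha / 360.0

# Hypothetical urban catchment: C = 0.8 for a largely impervious area
print(round(peak_runoff(0.8, 45.0, 50.0), 2))  # → 5.0
```

Low-impact development measures work precisely by lowering the effective C, which directly cuts the peak discharge reaching the river.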
Procedia PDF Downloads 612