Search results for: modern methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17685

12885 Queuing Analysis and Optimization of Public Vehicle Transport Stations: A Case of South West Ethiopia Region Vehicle Stations

Authors: Mequanint Birhan

Abstract:

Modern urban environments present a dynamically growing field in which, notwithstanding shared goals, several mutually conflicting interests frequently collide. Waiting lines and queues are common occurrences at public vehicle stations, and they have a big impact on a city's socioeconomic standing. This results in extremely long lines of both vehicles and people on incongruous routes, service congestion, customer murmuring, unhappiness, complaints, and a search for other options, sometimes illegal ones. A root cause of this is corruption, which leads to traffic jams, stopping, and packing vehicles beyond their safe carrying capacity, violating the human rights and freedoms of passengers. This study focused on optimizing the time passengers have to wait at public vehicle stations. This applied research employed mixed data-gathering sources and approaches; 166 key informants at the transport stations were sampled using the Slovin formula. The length of time vehicles, including the drivers and auxiliary drivers ('Weyala'), had to wait was also studied. To maximize the service level at vehicle stations, a queuing model, 'Menaharya', was subsequently devised. Time, cost, and quality encompass performance, scope, and suitability for the intended purposes. The minimal response time for passengers and vehicles queuing to reach their final destination was determined at the stations of the towns of Tepi, Mizan, and Bonga. A new bus station system was modeled and simulated with the Arena simulation software for the chosen study area. An 84% overall improvement was achieved: cost was reduced by 56.25%, waiting time fell from 4 hr to 1.5 hr, and quality, safety and designed-load performance calculations were carried out. Stakeholders are asked to put the model into practice and monitor the results obtained.
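As a rough illustration of the waiting-time quantities such a station queuing model estimates, the sketch below computes M/M/c (Erlang C) utilisation, probability of waiting, and mean queue time in Python. The arrival rate, per-bay service rate, and number of loading bays are illustrative assumptions, not figures from the study, which used Arena for its simulation.

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean wait in queue (Wq) for an M/M/c station.

    lam: passenger arrival rate (per hour)
    mu:  service rate per loading bay (per hour)
    c:   number of parallel loading bays
    """
    rho = lam / (c * mu)          # utilisation, must be < 1 for a stable queue
    a = lam / mu                  # offered load in Erlangs
    # Erlang C: probability that an arriving passenger has to wait
    tail = a**c / (math.factorial(c) * (1 - rho))
    p_wait = tail / (sum(a**k / math.factorial(k) for k in range(c)) + tail)
    wq = p_wait / (c * mu - lam)  # mean time spent in the queue (hours)
    return rho, p_wait, wq

# Illustrative numbers only (not the study's data): 120 passengers/h,
# each bay dispatches 35 passengers/h, 4 bays at the station.
rho, p_wait, wq = erlang_c_wait(lam=120, mu=35, c=4)
print(f"utilisation={rho:.2f}, P(wait)={p_wait:.2f}, mean queue time={wq*60:.1f} min")
```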

Keywords: Arena 14 automatic rockwell, queue, transport services, vehicle stations

Procedia PDF Downloads 81
12884 Shear Strength and Consolidation Behavior of Clayey Soil with Vertical and Radial Drainage

Authors: R. Pillai Aparna, S. R. Gandhi

Abstract:

Soft clay deposits having low strength and high compressibility are found all over the world. Preloading with vertical drains is a widely used method for improving such soils. The coefficient of consolidation, irrespective of the drainage type, plays an important role in the design of vertical drains, and it controls accurate prediction of the rate of consolidation of soil. The increase in shear strength of soil with consolidation is another important factor considered in preloading or staged construction. To the best of our knowledge, no clear guidelines are available to estimate the increase in shear strength for a particular degree of consolidation (U) at various stages during construction. Various methods are available for determining the consolidation coefficient. This study mainly focuses on the variation of the consolidation coefficient, which was determined using different methods, and of shear strength with pressure intensity. The variation of shear strength with the degree of consolidation was also studied. Consolidation tests were carried out on two types of highly compressible clays with vertical, radial and, in a few cases, combined drainage. The tests were carried out at different pressure intensities, and for each pressure intensity, once the target degree of consolidation was achieved, a vane shear test was done at different locations in the sample in order to determine the shear strength. The shear strength of clayey soils under the application of vertical stress with vertical and radial drainage, at target U values of 70% and 90%, was studied. It was found that there is not much variation in the cv or cr value beyond a pressure intensity of 80 kPa. Correlations were developed between the shear strength ratio and the consolidation pressure based on laboratory testing under controlled conditions. It was observed that the shear strength of a sample with a target U value of 90% is about 1.4 to 2 times that of a 70% consolidated sample. Settlement analysis was done using Asaoka's and the hyperbolic method. The variation of strength with the depth of the sample was also studied using a large-scale consolidation test. Based on the present study, it was found that the gain in strength is greater in the top half of the clay layer, and also that the shear strength of the sample with radial drainage is slightly higher than that with vertical drainage.
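Since the settlement analysis mentions Asaoka's method, a minimal sketch of that procedure is given below, assuming settlement readings taken at equal time intervals. The readings shown are hypothetical and only illustrate how the ultimate settlement is extrapolated from the fitted line s_i = b0 + b1*s_(i-1).

```python
import numpy as np

def asaoka_ultimate_settlement(settlements):
    """Asaoka's method: fit s_i = b0 + b1*s_(i-1) for settlement readings
    taken at equal time intervals; the ultimate settlement is b0/(1 - b1)."""
    s = np.asarray(settlements, dtype=float)
    s_prev, s_curr = s[:-1], s[1:]
    b1, b0 = np.polyfit(s_prev, s_curr, 1)   # slope, intercept of the Asaoka line
    return b0 / (1.0 - b1)

# Hypothetical readings (mm) at equal time steps, not the paper's data
readings = [12, 21, 28, 34, 39, 43, 46, 48.5]
print(f"predicted ultimate settlement ≈ {asaoka_ultimate_settlement(readings):.1f} mm")
```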

Keywords: consolidation coefficient, degree of consolidation, PVDs, shear strength

Procedia PDF Downloads 242
12883 Wheat Dihaploid and Somaclonal Lines Screening for Resistance to P. nodorum

Authors: Lidia Kowalska, Edward Arseniuk

Abstract:

Glume and leaf blotch is a disease of wheat caused by the necrotrophic fungus Parastagonospora nodorum. It is a serious pathogen in many wheat-growing areas throughout the world. The use of resistant cultivars is the most effective and economical means of controlling the disease. Plant breeders and pathologists have worked intensively to incorporate resistance to the pathogen into new cultivars. Conventional methods of breeding for resistance can be supported by biotechnological ones, i.e., somatic embryogenesis and androgenesis. Therefore, an effort was undertaken to compare genetic variation in P. nodorum resistance among winter wheat somaclones, dihaploids and conventional varieties. For this purpose, a population of 16 somaclonal and 4 dihaploid wheat lines from six crosses was used to assess resistance to P. nodorum under field conditions. Lines were grown in disease-free (fungicide-protected) and inoculated micro-plots in 2 replications of a split-plot design in a single environment. The plant leaves were inoculated three times with a mixture of P. nodorum isolates. Spore concentrations were adjusted to 4 x 10⁶ viable spores per milliliter. Disease severity was rated on a scale on which >90% was scored as susceptible and <10% as resistant. Disease ratings of plant leaves showed statistically significant differences among all lines tested. Higher resistance to P. nodorum was observed more often on leaves of somaclonal lines than on dihaploid ones. On average, disease severity reached 15% on leaves of somaclones and 30% on leaves of dihaploids. Some genotypes showed low leaf infection, e.g., the dihaploid D-33 (disease severity 4%) and the somaclone S-1 (disease severity 2%). The results of this study show that dihaploid and somaclonal variation may be successfully used as an additional source of wheat resistance to the pathogen and can be recommended for use in commercial breeding programs. The reported results demonstrate that biotechnological methods may effectively be used in breeding wheat for resistance to fungal necrotrophic pathogens.

Keywords: glume and leaf blotch, somaclonal, androgenic variation, wheat, resistance breeding

Procedia PDF Downloads 127
12882 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kumar Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visible image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. The source images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based upon a multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the validity of the suggested method. Experiments show that the proposed approach is capable of producing good fusion results. In deploying our image fusion approaches, we also observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they are difficult to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore offer good results with minimal time complexity.
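To illustrate the kind of MST-based fusion the abstract refers to, the sketch below implements a toy Laplacian-style pyramid fusion in Python, with a max-absolute-coefficient rule for the detail layers and averaging for the base layer. It is not the authors' MATLAB implementation, and the region-based selection with consistency verification is not reproduced.

```python
import numpy as np

def fuse_multiscale(vi, ir, levels=3):
    """Toy multi-scale fusion of a visible (vi) and thermal (ir) image.

    Builds a simple Laplacian-style pyramid with box filtering, fuses detail
    layers with a max-absolute-coefficient rule and the coarsest layer by
    averaging, then reconstructs. Inputs must be same-shape float arrays with
    dimensions divisible by 2**levels.
    """
    def reduce(x):   # 2x2 box filter + decimation
        return 0.25 * (x[::2, ::2] + x[1::2, ::2] + x[::2, 1::2] + x[1::2, 1::2])

    def expand(x):   # nearest-neighbour upsampling back to the finer grid
        return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

    def pyramid(x):
        details = []
        for _ in range(levels):
            low = reduce(x)
            details.append(x - expand(low))   # detail (high-pass) layer
            x = low
        return details, x                     # x is the coarsest approximation

    d_vi, base_vi = pyramid(vi)
    d_ir, base_ir = pyramid(ir)

    fused = 0.5 * (base_vi + base_ir)          # average the base layer
    for dv, di in zip(reversed(d_vi), reversed(d_ir)):
        detail = np.where(np.abs(dv) >= np.abs(di), dv, di)  # max-abs rule
        fused = expand(fused) + detail
    return fused

# Example with random same-size arrays as placeholders for real VI/IR frames
vi = np.random.rand(256, 256)
ir = np.random.rand(256, 256)
print(fuse_multiscale(vi, ir).shape)
```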

Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform

Procedia PDF Downloads 118
12881 Measurement of CES Production Functions Considering Energy as an Input

Authors: Donglan Zha, Jiansong Si

Abstract:

Because of its flexibility, the CES production function attracts much interest in economic growth and programming models, as well as in macroeconomic and micro-macro models. This paper focuses on the development of, and estimation methods for, CES production functions that consider energy as an input. We leave for future research the relaxation of the assumption of constant returns to scale, the introduction of potential input factors, and the generalization of methods for the optimal nested form of multi-factor production functions.
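For reference, a common (K,L)-E nested CES specification with energy as an input can be written as follows; the specific nesting, parameters, and technical-change terms used by the authors are not stated in the abstract, so this is only a standard textbook form.

```latex
% Inner capital-labour bundle and outer nest with energy (constant returns assumed):
\[
V_t = \left[ \alpha K_t^{-\rho_1} + (1-\alpha) L_t^{-\rho_1} \right]^{-1/\rho_1},
\qquad
Y_t = A_t \left[ \delta V_t^{-\rho} + (1-\delta) E_t^{-\rho} \right]^{-1/\rho},
\]
% with elasticities of substitution \sigma_1 = 1/(1+\rho_1) within the bundle
% and \sigma = 1/(1+\rho) between the capital-labour bundle and energy E_t.
```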

Keywords: bias of technical change, CES production function, elasticity of substitution, energy input

Procedia PDF Downloads 284
12880 Quality is the Matter of All

Authors: Mohamed Hamza, Alex Ohoussou

Abstract:

At JAWDA, our primary focus is on ensuring the satisfaction of our clients worldwide. We are committed to delivering new features on our SaaS platform as quickly as possible while maintaining high quality standards. In this paper, we highlight two key aspects of testing that represent an evolution of current methods and a potential trend for the future, and that have enabled us to uphold our commitment effectively. These aspects are "One Sandbox per Pull Request" (dynamic test environments instead of static ones) and "QA for All".

Keywords: QA for all, dynamic sandboxes, QAOPS, CICD, continuous testing, all testers, QA matters for all, 1 sandbox per PR, utilization rate, coverage rate

Procedia PDF Downloads 36
12879 One-Class Classification Approach Using Fukunaga-Koontz Transform and Selective Multiple Kernel Learning

Authors: Abdullah Bal

Abstract:

This paper presents a one-class classification (OCC) technique based on the Fukunaga-Koontz Transform (FKT) for binary classification problems. The FKT is originally a powerful tool for feature selection and ordering in two-class problems. To utilize the standard FKT for the data domain description problem (i.e., one-class classification), a set of non-class samples lying outside the boundary of the positive (target) class, which is formed from limited training data, has been constructed synthetically. The tunnel-like decision boundary around the upper and lower borders of the target class samples has been designed using the statistical properties of the feature vectors belonging to the training data. To capture higher-order statistics of the data and increase discrimination ability, the proposed method, termed one-class FKT (OC-FKT), has been extended to its nonlinear version via kernel machines and is referred to as OC-KFKT for short. Multiple kernel learning (MKL) is a favorable family of machine learning methods that tries to find an optimal combination of a set of sub-kernels to achieve a better result. However, the discriminative ability of some of the base kernels may be low, and an OC-KFKT designed with such kernels leads to unsatisfactory classification performance. To address this problem, the quality of the sub-kernels should be evaluated, and the weak kernels must be discarded before the final decision-making process. MKL/OC-FKT and selective MKL/OC-FKT frameworks have been designed, inspired by ensemble learning (EL), to weight and then select the sub-classifiers using the discriminability and diversity measured by eigenvalue ratios. The eigenvalue ratios have been assessed based on their regions on the FKT subspaces. The comparative experiments, performed on various low- and high-dimensional data against state-of-the-art algorithms, confirm the effectiveness of our techniques, especially under small sample size (SSS) conditions.
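A minimal sketch of the standard two-class FKT underlying this approach is given below, assuming two sample matrices with rows as observations. The synthetic non-class sample generation, the kernelized OC-KFKT, and the MKL-based kernel selection described in the abstract are not reproduced here.

```python
import numpy as np

def fukunaga_koontz(X1, X2, eps=1e-10):
    """Fukunaga-Koontz Transform for two sample sets (rows = samples).

    Returns the shared basis and the class-1 eigenvalues; class-2 eigenvalues
    are their complement (1 - lambda), so directions most informative for one
    class are least informative for the other.
    """
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    # Whitening transform of the summed scatter S = S1 + S2
    vals, vecs = np.linalg.eigh(S1 + S2)
    W = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, eps)))
    # In the whitened space both class matrices share the same eigenvectors
    lam1, V = np.linalg.eigh(W.T @ S1 @ W)
    return W @ V, lam1          # FKT basis, class-1 eigenvalues (class-2 = 1 - lam1)

# Toy example with two Gaussian "classes" (illustrative only)
rng = np.random.default_rng(0)
X1 = rng.normal(size=(200, 5)) @ np.diag([3, 1, 1, 1, 1])
X2 = rng.normal(size=(200, 5)) @ np.diag([1, 1, 1, 1, 3])
basis, lam = fukunaga_koontz(X1, X2)
print(np.round(lam, 2))         # values near 1 favour class 1, near 0 favour class 2
```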

Keywords: ensemble methods, fukunaga-koontz transform, kernel-based methods, multiple kernel learning, one-class classification

Procedia PDF Downloads 28
12878 Legal Aspects in Character Merchandising with Reference to Right to Image of Celebrities

Authors: W. R. M. Shehani Shanika

Abstract:

Selling goods and services using the images, names and personalities of celebrities has become a common marketing strategy in modern physical and online markets. The two concepts of globalization and the open economy have given businesses numerous reasons to develop in order to earn higher profits. Therefore, the global market, as well as domestic markets in various countries, has been vigorously endorsing the images of famous sports stars, film stars, singing stars and cartoon characters for the purpose of increasing demand for goods and services. It has become evident that these trade strategies pose a threat to famous personalities, both financially and personally. The right to one's image is a basic human right that celebrities hold to protect themselves from various forms of commercial exploitation. In this respect, this paper aims to assess whether the law relating to character merchandising satisfactorily protects the right to image of celebrities. Celebrities can decide how much they receive for each representation to the general public; simply put, they have the exclusive right to decide the monetary value of their image. Most commonly, however, countries use the law of unfair competition to regulate matters arising in this area, and the legal norms of unfair competition are not enough to protect the image of celebrities. Celebrities must therefore be able to prevent the unauthorized use of their images for commercial purposes by fraudulent traders who become unjustly enriched, as their images have economic value. They have the right to use their image for any commercial purpose and to earn profits. It is therefore high time to recognize the right to image as a new dimension to be protected in the legal framework of character merchandising. Unfortunately, to the author's best knowledge, there is no uniform, single international standard that recognizes the right to image of celebrities in the context of character merchandising. The paper identifies this as a controversial legal barrier faced by celebrities in the rapidly evolving marketplace. Finally, this library-based research concludes with proposals to ensure the right to image more broadly in the legal context of character merchandising.

Keywords: brand endorsement, celebrity, character merchandising, intellectual property rights, right to image, unfair competition

Procedia PDF Downloads 141
12877 Collaborative Procurement in the Pursuit of Net-Zero: A Converging Journey

Authors: Bagireanu Astrid, Bros-Williamson Julio, Duncheva Mila, Currie John

Abstract:

The Architecture, Engineering, and Construction (AEC) sector plays a critical role in the global transition toward sustainable and net-zero built environments. However, the industry faces unique challenges in planning for net-zero while struggling with low productivity, cost overruns and overall resistance to change. Traditional practices fall short due to their inability to meet the requirements for systemic change, especially as governments increasingly demand transformative approaches. Working in silos and rigid hierarchies, and a short-term, client-centric approach prioritising immediate gains over long-term benefit, stand in stark contrast to the fundamental requirements for the realisation of net-zero objectives. These practices have limited capacity to effectively integrate AEC stakeholders and promote the essential knowledge sharing required to address the multifaceted challenges of achieving net-zero. In the context of the built environment, procurement may be described as the method by which a project proceeds from inception to completion. Collaborative procurement methods under the Integrated Practices (IP) umbrella have the potential to align more closely with net-zero objectives. This paper explores the synergies between collaborative procurement principles and the pursuit of net-zero in the AEC sector, drawing upon the shared values of cross-disciplinary collaboration, Early Supply Chain Involvement (ESI), use of standards and frameworks, digital information management, strategic performance measurement, integrated decision-making principles and contractual alliancing. To investigate the role of collaborative procurement in advancing net-zero objectives, a structured research methodology was employed. First, the study focuses on a systematic review of the application of collaborative procurement principles in the AEC sphere. Next, a comprehensive analysis is conducted to identify common clusters of these principles across multiple procurement methods. An evaluative comparison between traditional procurement methods and collaborative procurement for achieving net-zero objectives is presented. The study then identifies the intersection between collaborative procurement principles and net-zero requirements. Lastly, key insights for AEC stakeholders are explored, focusing on the implications and practical applications of these findings. Directions for the future development of this research are recommended. Adopting collaborative procurement principles can serve as a strategic framework for guiding the AEC sector towards realising net-zero. Synergising these approaches overcomes fragmentation, fosters knowledge sharing, and establishes a net-zero-centered ecosystem. In the context of the ongoing efforts to amplify project efficiency within the built environment, a critical realisation of their central role becomes imperative for AEC stakeholders. When effectively leveraged, collaborative procurement emerges as a powerful tool to surmount existing challenges in attaining net-zero objectives.

Keywords: collaborative procurement, net-zero, knowledge sharing, architecture, built environment

Procedia PDF Downloads 79
12876 Cluster Analysis of Students’ Learning Satisfaction

Authors: Purevdolgor Luvsantseren, Ajnai Luvsan-Ish, Oyuntsetseg Sandag, Javzmaa Tsend, Akhit Tileubai, Baasandorj Chilhaasuren, Jargalbat Puntsagdash, Galbadrakh Chuluunbaatar

Abstract:

One of the indicators of the quality of university services is student satisfaction. Aim: We aimed to study the level of satisfaction of first-year premedical students in the Medical Physics course using cluster analysis. Materials and Methods: Within this goal, questionnaires were collected from a total of 324 students who took the Medical Physics course in the first year of the premedical program at the Mongolian National University of Medical Sciences. When determining the level of satisfaction, answers were obtained on five levels: "excellent", "good", "medium", "bad" and "very bad". A total of 39 questionnaire items were collected from students: 8 for course evaluation, 19 for teacher evaluation, and 12 for student evaluation. From this survey, a database with 39 fields and 324 records was created. Results: On this database, cluster analysis was performed in MATLAB and R using the k-means method of data mining. The Hopkins statistics calculated for the database were 0.88, 0.87, and 0.97, which shows that cluster analysis methods can be used. The course evaluation sub-database is divided into three clusters: cluster I has 150 objects with a "good" rating (46.2%), cluster II has 119 objects with a "medium" rating (36.7%), and cluster III has 54 objects with a "good" rating (16.6%). The teacher evaluation sub-database is divided into three clusters: there are 179 objects with a "good" rating (55.2%) in cluster II, 108 objects with a "medium" rating (33.3%) in cluster III, and 36 objects with an "excellent" rating (11.1%) in cluster I. The student evaluation sub-database is divided into two clusters: cluster II has 215 objects with an "excellent" rating (66.3%), and cluster I has 108 objects with an "excellent" rating (33.3%). Evaluating the resulting clusters with the silhouette coefficient gives 0.32 for the course evaluation clusters, 0.31 for the teacher evaluation clusters, and 0.30 for the student evaluation clusters, which shows statistical significance. Conclusion: In conclusion, the cluster analysis yielded 46.2% "good", 36.7% "medium" and 16.6% "bad" in the course evaluation model; 55.2% "good", 33.3% "medium" and 11.1% "bad" in the teacher evaluation model; and 66.3% "good" and 33.3% "bad" in the student evaluation model.
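The pipeline described (Hopkins statistic to check clusterability, k-means clustering, silhouette evaluation) can be sketched as follows in Python with scikit-learn. The data matrix here is a random placeholder for the 324 x 39 questionnaire database, so the printed values will not match the paper's.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.neighbors import NearestNeighbors

def hopkins_statistic(X, m=None, seed=0):
    """Hopkins statistic: values well above 0.5 suggest the data are clusterable."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    m = m or max(1, n // 10)
    nn = NearestNeighbors(n_neighbors=2).fit(X)
    sample = X[rng.choice(n, m, replace=False)]
    u = rng.uniform(X.min(axis=0), X.max(axis=0), size=(m, d))  # uniform points
    w = nn.kneighbors(sample)[0][:, 1]    # distance to nearest *other* data point
    uu = nn.kneighbors(u)[0][:, 0]        # distance from uniform points to data
    return uu.sum() / (uu.sum() + w.sum())

# Hypothetical stand-in for the 324 x 39 Likert-scale questionnaire matrix
X = np.random.default_rng(1).integers(1, 6, size=(324, 39)).astype(float)
print("Hopkins:", round(hopkins_statistic(X), 2))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("silhouette:", round(silhouette_score(X, labels), 2))
```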

Keywords: questionnaire, data mining, k-means method, silhouette coefficient

Procedia PDF Downloads 55
12875 Finite Volume Method in Loop Network in Hydraulic Transient

Authors: Hossain Samani, Mohammad Ehteram

Abstract:

In this paper, we consider the finite volume method (FVM) for the water hammer problem. The technique is simulated on a looped network with complex boundary conditions. After comparing methods, the FVM is found to be the best, and its results are compared with experimental data. A finite volume scheme on a staggered grid is applied to solve the water hammer equations.
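For context, the 1-D water hammer equations that such a finite volume scheme discretizes are commonly written as below; the paper's exact formulation, friction model, and boundary treatment are not given in the abstract.

```latex
% Classical 1-D water hammer (continuity and momentum) equations:
\[
\frac{\partial H}{\partial t} + \frac{a^{2}}{g A}\,\frac{\partial Q}{\partial x} = 0,
\qquad
\frac{\partial Q}{\partial t} + g A\,\frac{\partial H}{\partial x}
  + \frac{f\, Q\,|Q|}{2 D A} = 0,
\]
% H: piezometric head, Q: discharge, a: pressure wave speed, A: pipe area,
% D: pipe diameter, f: Darcy-Weisbach friction factor, g: gravitational acceleration.
```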

Keywords: hydraulic transient, water hammer, interpolation, non-linear interpolation

Procedia PDF Downloads 352
12874 Alumni Experiences of How Their Undergraduate Medical Education Instilled and Fostered a Commitment to Community-Based Work in Later Life: A Sequential Exploratory Mixed-Methods Study

Authors: Harini Aiyer, Kalyani Premkumar

Abstract:

Health professionals are the key players who can help achieve the goals of population health equity. Social accountability (SA) of health professionals emphasizes their role in addressing issues of equity in the population they serve. Therefore, health professional education must focus on instilling SA in health professionals. There is limited literature offering a longitudinal perspective on how students sustain the practice of SA in later life. This project aims to identify the drivers of social accountability among physicians. The study employed an exploratory mixed-methods design (QUAL -> quant) to explore alumni perceptions and experiences. The qualitative data, collected via 20 in-depth, semi-structured interviews, provided an understanding of the alumni's perceptions of the influence of their undergraduate learning environment on their SA. This was followed by a quantitative portion: a questionnaire designed from the themes identified in the qualitative data. Emerging themes from the study highlighted community-centered education and a focus on social and preventative medicine as both curricular and non-curricular facilitators of SA among physicians. Curricular components included opportunities to engage with the community, such as roadside clinics, community-orientation programs, and postings at a secondary hospital. Other facilitators that emerged were the faculty leading by example, a subsidized fee structure, and a system that prepared students for practice in rural and remote areas. The study offers a fresh perspective and dimension on how SA is addressed by medical schools. The findings may be adapted by medical schools to understand how their own SA initiatives have been sustained among physicians over the long run.

Keywords: community-based work, global health, health education, medical education, providing health in remote areas, social accountability

Procedia PDF Downloads 85
12873 System-Driven Design Process for Integrated Multifunctional Movable Concepts

Authors: Oliver Bertram, Leonel Akoto Chama

Abstract:

In today's civil transport aircraft, the design of flight control systems is based on the experience gained from previous aircraft configurations, with a clear distinction between primary and secondary flight control functions for controlling the aircraft altitude and trajectory. Significant system improvements are now seen particularly in multifunctional moveable concepts, where the flight control functions are no longer considered separate but integral. This allows new functions to be implemented in order to improve the overall aircraft performance. However, the classical design process for flight controls is sequential and insufficiently interdisciplinary. In particular, the systems discipline is involved only rudimentarily in the early phase. In many cases, the task of systems design is limited to meeting the requirements of the upstream disciplines, which may lead to integration problems later. For this reason, approaching the design with incremental development is required to reduce the risk of a complete redesign. Although the potential of multifunctional moveable concepts and the path towards them have been shown, the complete re-engineering of aircraft concepts with less classical moveable concepts is associated with considerable risk for the design due to the lack of design methods. This represents an obstacle to major leaps in technology. This gap in the state of the art is increased even further if, in the future, unconventional aircraft configurations are to be considered, for which no reference data or architectures are available. This means that the use of the above-mentioned experience-based approach used for conventional configurations is limited and not applicable to the next generation of aircraft. In particular, there is a need for methods and tools for a rapid trade-off between new multifunctional flight control system architectures. To close this gap in the state of the art, an integrated system-driven design process for multifunctional flight control systems of non-classical aircraft configurations is presented. The overall goal of the design process is to find optimal solutions for single or combined target criteria quickly within the very large solution space for the flight control system. In contrast to the state of the art, all disciplines are involved in a holistic design in an integrated rather than a sequential process. To emphasize the systems discipline, this paper focuses on the methodology for designing moveable actuation systems in the context of this integrated design process for multifunctional moveables. The methodology includes different approaches for creating system architectures and component design methods, as well as the necessary process outputs to evaluate the systems. An application example on a reference configuration is used to demonstrate the process and validate the results. For this, new unconventional hydraulic and electrical flight control system architectures are calculated, which result from the higher requirements for multifunctional moveable concepts. In addition to typical key performance indicators such as mass and power requirements, the results regarding the feasibility and wing integration aspects of the system components are examined and discussed. This is intended to show how the systems design can influence and drive the wing and overall aircraft design.

Keywords: actuation systems, flight control surfaces, multi-functional movables, wing design process

Procedia PDF Downloads 145
12872 Enhanced Disk-Based Databases towards Improved Hybrid in-Memory Systems

Authors: Samuel Kaspi, Sitalakshmi Venkatraman

Abstract:

In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence layer. In addition, with the recent growth in multi-tenancy cloud applications and the associated security concerns, many organisations consider the trade-offs and continue to require fast and reliable transaction processing of disk-based database systems as an available choice. For these organisations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems, which would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. The promising results of this work show that enhanced disk-based systems facilitate improvements in hybrid data management within the broader context of in-memory systems.

Keywords: in-memory database, disk-based system, hybrid database, concurrency control

Procedia PDF Downloads 422
12871 Community-Based Settlement Environment in Malalayang Coastal Area, Manado City

Authors: Teguh R. Hakim, Frenny F. F. Kairupan, Alberta M. Mantiri

Abstract:

The face of a coastal city is generally the same as that of other cities, showing dualism: traditional and modern, rural and urban, planned and unplanned, slum and high quality. Manado City is located on the northern coast of the island of Sulawesi, Indonesia. Urban environmental problems have occurred in this city as an impact of this urban dualism. Overcrowding, inadequate infrastructure, and limited human resources are the main causes of the untidiness of the coastal settlements in Malalayang. This affects social and economic activities and the level of public health in the coastal settlement environment of Malalayang, Manado City. It is a serious problem that must be tackled jointly by the government, private parties, and the community. A community-based arrangement of the settlement environment is one solution for realizing livable coastal settlements in the city. This research aims to analyze the involvement of local communities in the arrangement of the settlement. A participatory approach is the model used in this study. Its application is mainly at the macro and meso scales (region, city, and neighbourhood environment), or community architecture. The participatory approach model leads to a more operational research approach for finding solutions to the problems of the settlement. The participatory approach is a research model that involves researchers and the community as both object and subject of the research, in a process that, in addition to research, also develops other forms of participation in designing and building together. The expected results of this study are to provide the community with education about the environment and to set up a livable settlement for the sake of improving the quality of life. The study also provides input to the government in applying the development patterns that will be implemented in the future.

Keywords: arrangements the coastal environment, community participation, urban environmental problems, livable settlement

Procedia PDF Downloads 244
12870 Comparison of Non-destructive Devices to Quantify the Moisture Content of Bio-Based Insulation Materials on Construction Sites

Authors: Léa Caban, Lucile Soudani, Julien Berger, Armelle Nouviaire, Emilio Bastidas-Arteaga

Abstract:

Improvement of the thermal performance of buildings is a major concern for the construction industry. With the increase in environmental issues, new types of construction materials are being developed, including bio-based insulation materials. They capture carbon dioxide, can be produced locally, and have good thermal performance. However, their behavior with respect to moisture transfer still faces some issues. With their high porosity, mass transfer is more important in these materials than in mineral insulation ones. Therefore, they can be more sensitive to moisture disorders such as mold growth, condensation risks or a decrease in the wall's energy efficiency. For this reason, the initial moisture content on the construction site is a crucial piece of knowledge. Measuring moisture content in a laboratory is a well-mastered task. Diverse methods exist, but the easiest and the reference one is gravimetric: a material is weighed dry and wet, and its moisture content is mathematically deduced. Non-destructive (NDT) methods are promising tools to determine the moisture content quickly and easily, in a laboratory or on construction sites. However, the quality and reliability of the measurements are influenced by several factors. Classical portable NDT devices usable on-site measure the capacitance or the resistivity of materials. Water's electrical properties are very different from those of construction materials, which is why the water content can be deduced from these measurements. However, most moisture meters are made to measure wooden materials, and some of them can be adapted to construction materials with calibration curves. In any case, these devices are almost never calibrated for insulation materials. The main objective of this study is to determine the reliability of moisture meters for the measurement of bio-based insulation materials, to determine which of the capacitive and resistive methods is the more accurate, and to identify which device gives the best results. Several bio-based insulation materials are tested: recycled cotton, two types of wood fiber of different densities (53 and 158 kg/m³), and a mix of linen, cotton, and hemp. It also seems important to assess the behavior of a mineral material, so glass wool is measured as well. An experimental campaign is performed in a laboratory. A gravimetric measurement of the materials is carried out for every level of moisture content. These levels are set using a climatic chamber by fixing the relative humidity at a constant temperature. The mass-based moisture contents measured are considered as reference values, and the results given by the moisture meters are compared to them. A complete analysis of the measurement uncertainty is also carried out. These results are used to analyze the reliability of the moisture meters depending on the materials and their water content. This makes it possible to determine whether the moisture meters are reliable and which one is the most accurate. They will then be used for future measurements on construction sites to assess the initial hygrothermal state of insulation materials, on both new-build and renovation projects.
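The gravimetric reference the authors rely on is usually expressed on a dry basis as follows; whether the study reports dry-basis or wet-basis moisture content is not stated in the abstract.

```latex
% Dry-basis gravimetric moisture content used as the laboratory reference:
\[
w \;=\; \frac{m_{\mathrm{wet}} - m_{\mathrm{dry}}}{m_{\mathrm{dry}}} \times 100\,\%,
\]
% with m_wet the mass of the conditioned specimen and m_dry its oven-dry mass.
```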

Keywords: capacitance method, electrical resistance method, insulation materials, moisture transfer, non-destructive testing

Procedia PDF Downloads 131
12869 Constitutional Status of a Child in the Republic of Belarus and Its Principles

Authors: Maria Ashitko

Abstract:

The Constitution of the Republic of Belarus is based on the principle of the unity of rights and obligations, including those of the child. The constitutional status of the child is a specific system of constitutional elements established and guaranteed by the state through the current legislation and regulatory acts that ensure the special legal status of the child, his or her constitutional legal capacity, the implementation of the principles of the constitutional and legal status of the child, the constitutional rights of the child and their safeguards. By the principles of the constitutional status of the child, we mean the general, normative, social-volitional rules of behavior established by the Constitution of the Republic of Belarus, laws and other regulatory acts that determine the content and social purpose of the legal status of the child. The constitutional and legal status of the child is characterized by the following special principles, which form a feature of the state legal system: 1) Ensuring the interests of the child means providing for the child in accordance with his or her age, state of health, characteristics of development, life experience, family life, cultural traditions and ethnicity. 2) The principle of equal responsibility of both parents or their substitutes is characterized by caring for the next generation as one of the priority tasks of the state and society, and all issues related to the implementation of children's rights should be addressed at the constitutional level. 3) We would also like to highlight a special subprinciple of safeguards, namely the principle of ensuring the safety of the child; it is worth noting that in legal studies no relationship is drawn between safety and constitutional rights, either as general safeguards of individual rights and freedoms or as special safeguards of the right to life. 4) The principle of justice is expressed by the fact that, in modern conditions, the quality of life is determined not only by material wealth but also by the ability of the state to ensure the harmonization of social relations and social harmony on the basis of humanism and justice. Thus, a specific feature of the constitutional status of the child is the age boundary between adulthood and minority; we therefore propose to highlight the age characteristics of the child as an additional element, and it is advisable to highlight the subprinciple of safeguards, the principle of ensuring the safety of the child, as a special principle.

Keywords: children’s rights, constitutional status, constitutional principles, constitutional rights

Procedia PDF Downloads 132
12868 Using of M Health in MCH Service during COVID-19: Application of Diffusion of Innovation Theory

Authors: Mikiyas Yonas Fufa

Abstract:

Maternal and child health (MCH) services are critical services that carry many risks, and maternal and newborn mortality can be high if they are not managed properly. In low- and middle-income countries like Ethiopia, the accessibility and quality of MCH services are low. During the COVID-19 pandemic, even the previous level of access to MCH services has decreased: many pregnant mothers are not attending their ANC, delivery and other services at the hospital because they think they are more vulnerable to COVID-19. This condition may increase maternal and neonatal morbidity and mortality. The innovation is an idea (the development of a mobile app prepared by the Maternity Foundation organization that focuses on midwifery care; the app has detailed videos on danger signs in pregnancy and on procedures during labor and delivery). By presenting this to clients, it is planned to explore the perception of, attitudes towards, and barriers to accepting this innovation. What is planned to be studied are the perceptions of, and barriers towards, using this new idea, namely the innovation of mHealth in MCH services. It is planned to interview pregnant mothers who come for ANC at the health facility and mothers who are absent from their appointments for services. In this way, it is planned to explore how mothers accept this idea and what barriers prevent them from accepting it. This is a phenomenological qualitative study and an application of the diffusion of innovation theory to MCH services. Participants will be selected using quota sampling for the mothers interviewed at hospitals and snowball/quota sampling for the mothers who are absent from their appointments/visits. The sample size depends on the saturation of data/ideas. Each participant will be interviewed with open-ended questions, and the interviews will be recorded, transcribed and finally analyzed with OpenCode 4.03. Beneficiaries: the Federal Ministry of Health will be prepared to develop the mHealth app; health professionals in MCH will have a lower workload, and the accessibility and quality of care will increase during COVID-19; and different collaborators will participate and encourage mothers to adopt the new idea.

Keywords: COVID-19, m health, MCH, diffusion of innovation

Procedia PDF Downloads 33
12867 Robust Numerical Solution for Flow Problems

Authors: Gregor Kosec

Abstract:

A simple and robust numerical approach for solving flow problems is presented, in which the physical fields involved are represented through local approximation functions, i.e., the considered field is approximated over a local support domain. The approximation functions are then used to evaluate the partial differential operators. The type of approximation, the size of the support domain, and the type and number of basis functions can be general. The solution procedure is formulated entirely through local computational operations. Besides the local numerical method, the pressure-velocity coupling is also performed locally while retaining the correct temporal transient. The complete locality of the introduced numerical scheme has several beneficial effects. One of the most attractive is its simplicity, since it can be understood as a generalized finite difference method, yet a much more powerful one. The presented methodology offers many possibilities for treating challenging cases, e.g., nodal adaptivity to address regions with sharp discontinuities or p-adaptivity to treat obscure anomalies in the physical field. The trade-off between stability, computational complexity and accuracy can be regulated by changing the number of support nodes, etc. All these features can be controlled on the fly during the simulation. The presented methodology is relatively simple to understand and implement, which makes it a potentially powerful tool for engineering simulations. Besides simplicity and straightforward implementation, there are many opportunities to fully exploit modern computer architectures through different parallel computing strategies. The performance of the method is presented on the lid-driven cavity problem, the backward-facing step problem, and the de Vahl Davis natural convection test, extended also to a low-Prandtl-number fluid and Darcy porous flow. Results are presented in terms of velocity profiles, convergence plots, and stability analyses. Results for all cases are also compared against published data.

Keywords: fluid flow, meshless, low Pr problem, natural convection

Procedia PDF Downloads 237
12866 A Strategic Performance Control System for Municipal Organization

Authors: Emin Gundogar, Aysegul Yilmaz

Abstract:

Strategic performance control is a significant procedure in management. There are various methods to improve this procedure. This study introduces an information system that is developed to score performance for municipal management. The application of the system is clarified by exemplifying municipal processes.

Keywords: management information system, municipal management, performance control

Procedia PDF Downloads 479
12865 Enhancing Wheat Productivity for Small-Scale Farmers in the Northern State of Sudan through Developing a Local Made Seed Cleaner and Different Seeding Methods

Authors: Yasir Hassan Satti Mohammed

Abstract:

The wheat cleaner was designed, manufactured, and tested in the workshop of the Department of Agricultural Engineering, Faculty of Agricultural Sciences, University of Dongola, Northern State of Sudan, for the purpose of enhancing productivity for small-scale farmers who plant their saved wheat seeds every season with all the risks of weed infestation and low viability. A one-season field experiment was then conducted, according to a randomized complete block design (RCBD), at the demonstration farm of the Dongola research station using cleaned and uncleaned seeds of a local wheat variety (Imam); two different planting methods were also adopted in the experiment. One was traditional seed drilling at the recommended seed rate (50 kg.feddan⁻¹), whereas the other was precision seeding using half of the recommended seed rate (25 kg.feddan⁻¹). The effect of seed type and planting method on field parameters was investigated, and the data were then analyzed using the SAS system, version 9.3. The results revealed significant (P ≥ 0.05) and highly significant (P ≥ 0.01) differences between treatments. The precision seeding method with cleaned seeds increased the number of kernels per spike (KS), tillers per plant (TPP), one-thousand-kernel mass (TKM), wheat biomass (BWT), and total yield (TOY), whereas weeds per unit area (WSM), weed biomass (BWD) and the weight of weed seeds decreased compared to seed drilling with uncleaned seed. The wheat seed cleaner could be of great benefit to small-scale wheat farmers in Sudan who cannot afford the cleaned seeds commercially provided by the local government.

Keywords: wheat cleaner, precision seeding, seed drilling method, small-scale farmers

Procedia PDF Downloads 97
12864 Strategies for Synchronizing Chocolate Conching Data Using Dynamic Time Warping

Authors: Fernanda A. P. Peres, Thiago N. Peres, Flavio S. Fogliatto, Michel J. Anzanello

Abstract:

Batch processes are widely used in food industry and have an important role in the production of high added value products, such as chocolate. Process performance is usually described by variables that are monitored as the batch progresses. Data arising from these processes are likely to display a strong correlation-autocorrelation structure, and are usually monitored using control charts based on multiway principal components analysis (MPCA). Process control of a new batch is carried out comparing the trajectories of its relevant process variables with those in a reference set of batches that yielded products within specifications; it is clear that proper determination of the reference set is key for the success of a correct signalization of non-conforming batches in such quality control schemes. In chocolate manufacturing, misclassifications of non-conforming batches in the conching phase may lead to significant financial losses. In such context, the accuracy of process control grows in relevance. In addition to that, the main assumption in MPCA-based monitoring strategies is that all batches are synchronized in duration, both the new batch being monitored and those in the reference set. Such assumption is often not satisfied in chocolate manufacturing process. As a consequence, traditional techniques as MPCA-based charts are not suitable for process control and monitoring. To address that issue, the objective of this work is to compare the performance of three dynamic time warping (DTW) methods in the alignment and synchronization of chocolate conching process variables’ trajectories, aimed at properly determining the reference distribution for multivariate statistical process control. The power of classification of batches in two categories (conforming and non-conforming) was evaluated using the k-nearest neighbor (KNN) algorithm. Real data from a milk chocolate conching process was collected and the following variables were monitored over time: frequency of soybean lecithin dosage, rotation speed of the shovels, current of the main motor of the conche, and chocolate temperature. A set of 62 batches with durations between 495 and 1,170 minutes was considered; 53% of the batches were known to be conforming based on lab test results and experts’ evaluations. Results showed that all three DTW methods tested were able to align and synchronize the conching dataset. However, synchronized datasets obtained from these methods performed differently when inputted in the KNN classification algorithm. Kassidas, MacGregor and Taylor’s (named KMT) method was deemed the best DTW method for aligning and synchronizing a milk chocolate conching dataset, presenting 93.7% accuracy, 97.2% sensitivity and 90.3% specificity in batch classification, being considered the best option to determine the reference set for the milk chocolate dataset. Such method was recommended due to the lowest number of iterations required to achieve convergence and highest average accuracy in the testing portion using the KNN classification technique.
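A minimal classic DTW alignment, of the kind the compared methods build on, is sketched below in Python; it is not the KMT variant recommended by the authors, and the trajectories are hypothetical stand-ins for conching variables of unequal duration.

```python
import numpy as np

def dtw_path(x, y):
    """Classic dynamic time warping between two 1-D trajectories.

    Returns the cumulative alignment cost and the warping path, which can be
    used to resample a batch trajectory onto a common (reference) time axis.
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack the optimal warping path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        i, j = (i - 1, j - 1) if step == 0 else (i - 1, j) if step == 1 else (i, j - 1)
    return D[n, m], path[::-1]

# Two hypothetical conche-temperature trajectories of unequal duration
ref = np.sin(np.linspace(0, 3, 120))         # reference batch
new = np.sin(np.linspace(0, 3, 90)) + 0.05   # shorter, slightly offset batch
cost, path = dtw_path(ref, new)
print(f"alignment cost = {cost:.2f}, path length = {len(path)}")
```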

Keywords: batch process monitoring, chocolate conching, dynamic time warping, reference set distribution, variable duration

Procedia PDF Downloads 169
12863 Organic Carbon Pools Fractionation of Lacustrine Sediment with a Stepwise Chemical Procedure

Authors: Xiaoqing Liu, Kurt Friese, Karsten Rinke

Abstract:

Lacustrine sediments archive rich paleoenvironmental information on lakes and their surrounding environments. Additionally, modern sediment is used as an effective medium for the monitoring of lakes. Organic carbon in sediment is a heterogeneous mixture with varying turnover times and qualities, resulting from the different biogeochemical processes involved in the deposition of organic material. Therefore, the isolation of different carbon pools is important for research on lacustrine conditions. However, the numerous available fractionation procedures can hardly yield homogeneous carbon pools in terms of stability and age. In this work, a multi-step fractionation protocol that treats the sediment with hot water, HCl, H2O2 and Na2S2O8 in sequence was adopted; the treated sediment from each step was analyzed for its isotopic and structural composition with an isotope ratio mass spectrometer coupled with an elemental analyzer (IRMS-EA) and solid-state 13C nuclear magnetic resonance (NMR), respectively. The sequential extractions with hot water, HCl, and H2O2 yielded a more homogeneous, C3-plant-originating OC fraction, which was characterized by an atomic C/N ratio shift from 12.0 to 20.8 and by 13C and 15N isotopic signatures that were 0.9‰ and 1.9‰ more depleted than in the original bulk sediment, respectively. Additionally, the H2O2-resistant residue was dominated by stable components such as lignins, waxes, cutans, tannins, steroids, aliphatic proteins and complex carbohydrates. In the acid hydrolysis step, 6M HCl was much more effective than 1M HCl in isolating a sedimentary OC fraction with a higher degree of homogeneity. Owing to the extremely high removal rate of organic matter, the Na2S2O8 oxidation step is only suggested if the isolation of the most refractory OC pool is mandatory. We conclude that this multi-step chemical fractionation procedure is effective in isolating more homogeneous OC pools in terms of stability and functional structure, and it can be used as a promising method for the OC pool fractionation of sediment or soil in future lake research.

Keywords: 13C-CPMAS-NMR, 13C signature, lake sediment, OC fractionation

Procedia PDF Downloads 303
12862 Endoscopic Versus Open Treatment of Carpal Tunnel Syndrome: Postoperative Complications in Patients on Anticoagulation

Authors: Arman Kishan, Mark Haft, Kiyanna Thomas, Duc Nguyen, Dawn Laporte

Abstract:

Objective: Patients receiving anticoagulation therapy frequently experience increased rates of postoperative complications. Presently, limited data exist regarding the outcomes of patients undergoing carpal tunnel release surgery (CTR) while on anticoagulation. Our objective is to examine and compare the occurrence of complications in patients on anticoagulation who underwent either endoscopic CTR (ECTR) or open CTR (OCTR) for CTS. Methods: The Trinet X database was utilized to retrospectively identify patients who underwent OCTR or ECTR while concurrently on anticoagulation. Demographic data, medical comorbidities, and complication rates were analyzed. We used multivariable analysis to identify differences in postoperative complications, including wound infection within 90 days, wound dehiscence within 90 days, and intraoperative median nerve injury between the two surgical methods in patients on anticoagulation. Results: A total of 10,919 carpal tunnel syndrome patients on anticoagulation were included in the study, with 9082 and 1837 undergoing OCTR and ECTR, respectively. Among patients on anticoagulation, those undergoing ECTR exhibited a significantly lower occurrence of 90-day wound infection (p < 0.001) and nerve injury (p < 0.001) compared to those who underwent OCTR. However, there was no statistically significant difference in the risk of 90-day wound dehiscence between the two groups (p = 0.323). Conclusion:  In prior studies, ECTR demonstrated reduced rates of postoperative complications compared to OCTR in the general population. Our study demonstrates that among patients on anticoagulation, those undergoing ECTR experienced a significantly lower incidence of 90-day wound infection and nerve injury, with risk reductions of 35% and 40%, respectively. These findings support using ECTR as a preferred surgical method for patients with CTS who are on anticoagulation therapy.

Keywords: endoscopic treatment of carpal tunnel syndrome, open treatment of carpal tunnel syndrome, postoperative complications in patients on anticoagulation, carpal tunnel syndrome

Procedia PDF Downloads 73
12861 Impact of Drainage Defect on the Railway Track Surface Deflections; A Numerical Investigation

Authors: Shadi Fathi, Moura Mehravar, Mujib Rahman

Abstract:

The railway transportation network in the UK is over 100 years old and is known as one of the oldest mass transit systems in the world. This aged track network requires frequent closure for maintenance. One of the main reasons for closure is inadequate drainage due to leakage in the buried drainage pipes. The leaking water can cause localised subgrade weakness, which subsequently can lead to major ground/substructure failure. Different condition assessment methods are available to assess the railway substructure. However, the existing condition assessment methods are not able to detect any local ground weakness/damage and provide details of the damage (e.g., size and location). To tackle this issue, a hybrid back-analysis technique based on an artificial neural network (ANN) and a genetic algorithm (GA) has been developed to predict the substructure layers' moduli and identify any soil weaknesses. At first, a finite element (FE) model of a railway track section under falling weight deflectometer (FWD) testing was developed and validated against a field trial. Then a drainage pipe and various scenarios of a local defect/soil weakness around the buried pipe, with various geometries and physical properties, were modelled. The impact of the local soil weakness on the track surface deflection was also studied. The FE simulation results were used to generate a database for ANN training, and then a GA was employed as an optimisation tool to optimise and back-calculate the layers' moduli and the soil weakness moduli (the ANN's inputs). The hybrid ANN-GA back-analysis technique is a computationally efficient method with no dependency on seed modulus values. The model can estimate the substructure layer moduli and the presence of any localised foundation weakness.
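The workflow (an ANN surrogate trained on FE-generated deflections, then a GA searching the layer moduli that best reproduce a measured deflection bowl) can be sketched as below. The FE forward model is replaced by a simple analytical placeholder, and the moduli ranges, geophone offsets, and GA settings are illustrative assumptions rather than the paper's values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in for the FE database: layer moduli (MPa) -> FWD deflections (mm).
# The real forward model would come from the validated FE simulations; this
# analytical placeholder only illustrates the back-analysis workflow.
def fe_placeholder(moduli):
    E = np.atleast_2d(moduli)
    offsets = np.array([1.0, 0.7, 0.5, 0.35])              # 4 geophone "offsets"
    return 50.0 * offsets / (E @ np.array([[0.5], [0.3], [0.2]]))

E_train = rng.uniform([50, 30, 10], [500, 300, 100], size=(400, 3))
d_train = fe_placeholder(E_train)

# ANN surrogate of the FE model (feature scaling would normally be applied)
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
ann.fit(E_train, d_train)

# Simple GA searching the layer moduli that reproduce a "measured" bowl
measured = fe_placeholder([[300, 150, 40]])[0]              # synthetic target
lo, hi = np.array([50, 30, 10]), np.array([500, 300, 100])
pop = rng.uniform(lo, hi, size=(60, 3))
for generation in range(80):
    fitness = np.linalg.norm(ann.predict(pop) - measured, axis=1)
    parents = pop[np.argsort(fitness)[:20]]                 # selection
    children = 0.5 * (parents[rng.integers(0, 20, 60)]
                      + parents[rng.integers(0, 20, 60)])   # arithmetic crossover
    children += rng.normal(0, 0.05, children.shape) * (hi - lo)  # mutation
    pop = np.clip(children, lo, hi)
best = pop[np.argmin(np.linalg.norm(ann.predict(pop) - measured, axis=1))]
print("back-calculated moduli (MPa):", np.round(best, 1))
```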

Keywords: finite element (FE) model, drainage defect, falling weight deflectometer (FWD), hybrid ANN-GA

Procedia PDF Downloads 160
12860 Evaluation of Anti-Typhoid Effects of Azadirachta indica L. Fractions

Authors: A. Adetutu, T. M. Awodugba, O. A. Owoade

Abstract:

The development of resistance to currently known conventional anti-typhoid drugs has necessitated the search for cheaper, more potent and less toxic anti-typhoid drugs of plant origin. Therefore, this study investigated the anti-typhoid activity of fractions of A. indica in Salmonella typhi-infected rats. Leaves of A. indica were extracted in methanol and fractionated into n-hexane, chloroform, ethyl acetate, and aqueous fractions. The anti-salmonella potential of the fractions of A. indica was assessed via in-vitro inhibition of S. typhi using agar well diffusion, minimum inhibitory concentration (MIC), minimum bactericidal concentration (MBC) and biofilm assays. The biochemical and haematological parameters were determined by spectrophotometric methods. The histological analysis was performed using haematoxylin and eosin staining. Data analysis was performed by one-way ANOVA. The results of this study showed that S. typhi was sensitive to the aqueous and chloroform fractions of A. indica, and the fractions showed biofilm inhibition at concentrations of 12.50, 1.562, and 0.39 mg/mL. In the in-vivo study, the extract and the chloroform fraction had significant (p < 0.05) effects on the number of viable S. typhi recovered from the blood and stopped salmonellosis after 6 days of treatment of rats at 500 mg/kg b.w. Treatment of infected rats with the chloroform and aqueous fractions of A. indica normalized the haematological parameters in the animals. Similarly, treatment with fractions of the plant sustained a normal antioxidant status when compared with the normal control group. The chloroform and ethyl acetate fractions of A. indica reversed the liver and intestinal degeneration induced by S. typhi infection in rats. The present investigation indicated that the aqueous and chloroform fractions of A. indica have the potential to provide an effective treatment for salmonellosis, including typhoid fever. The results of the study may justify the ethno-medicinal use of the extract in traditional medicine for the treatment of typhoid and salmonella infections.

Keywords: Azadirachta indica L., Salmonella, typhoid, leaf fractions

Procedia PDF Downloads 136
12859 The Effect of Artificial Intelligence on Digital Factory

Authors: Sherif Fayez Lewis Ghaly

Abstract:

Factory planning has the mission of designing products, plants, processes, organisation, and areas, and of developing a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring is becoming more essential to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity), lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuild measures and is therefore an essential tool. Short-term rescheduling can no longer be handled by on-site inspections and manual measurements. The tight time schedules require up-to-date planning models. Given the high variation rate of factories described above, a method for rescheduling factories on the basis of a current digital factory twin is conceived and designed for practical application in factory restructuring projects. The focus is on rebuild processes. The aim is to preserve the planning basis (the digital factory model) for conversions within a factory. This calls for the application of a methodology that reduces the deficits of existing techniques. The goal is to show how a digital factory model can be kept up to date during ongoing factory operation. A method based on photogrammetry technology is presented, with the focus on developing a simple and cost-effective way to track the numerous changes that occur in a factory building during operation. The method is preceded by a hardware and software assessment to identify the most cost-effective and fastest variant.
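
As a rough illustration of the change-tracking idea (not the authors' implementation), the sketch below compares a new photogrammetric point cloud against the baseline digital factory model and flags points whose nearest-neighbour distance exceeds a tolerance. The point clouds and the 10 cm tolerance are synthetic assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
baseline_cloud = rng.uniform(0, 50, size=(20_000, 3))       # stand-in for the digital factory model (m)
new_scan = np.vstack([
    baseline_cloud + rng.normal(0, 0.01, baseline_cloud.shape),  # unchanged geometry plus scan noise
    rng.uniform(0, 5, size=(500, 3)) + np.array([20, 20, 0]),    # a newly installed object
])

tree = cKDTree(baseline_cloud)
dist, _ = tree.query(new_scan, k=1)                          # distance to nearest baseline point
changed = new_scan[dist > 0.10]                              # 10 cm change tolerance (assumed)
print(f"{len(changed)} of {len(new_scan)} scan points flagged as changed geometry")
```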

Keywords: building information modeling, digital factory model, factory planning, maintenance digital factory model, photogrammetry, restructuring

Procedia PDF Downloads 33
12858 Relearning to Learn: Approaching Sustainability by Incorporating Inuit Vernacular and Biomimicry Architecture Principles

Authors: Hakim Herbane

Abstract:

Efforts to achieve sustainability in architecture have yet to prove their effectiveness despite the various methods attempted. Biomimicry, which looks to successful natural models to promote sustainability and innovation, faces obstacles in implementing sustainability despite its restorative approach to the relationship between humans and nature. In Nunavik, Inuit communities are exploring a sustainable production system that aligns with their aspirations and meets their demands in terms of human, technological, technical, economic, and ecological factors. Biomimicry holds promise in line with Inuit philosophy, but its shortcomings in implementing sustainability require further investigation. Our literature review underscores the importance of involving the community in defining sustainability and determining the best methods for its implementation. Additionally, vernacular architecture offers valuable orientations for achieving sustainability. Moreover, reintegrating Inuit communities and their traditional architectural practices, which have successfully balanced the diverse needs and constraints of their built environment, could pave the way for a sustainable Inuit-built environment in Nunavik and simultaneously advance architectural biomimicry principles. This research aims to establish a sustainability monitoring tool for the Nordic architectural process by analyzing Inuit vernacular and biomimetic architecture, together with the input of stakeholders involved in Inuit architecture production in Nunavik, especially Inuit. The goal is to create a practical tool (an index) to aid in designing sustainable architecture, taking into account environmental, social, and economic perspectives. Furthermore, the study seeks to validate strong, sustainable design principles of vernacular and biomimetic architectures. The literature review uncovered challenges and identified new opportunities. The forthcoming discourse will focus on the careful and considerate incorporation of Inuit communities’ perceptions and indigenous building practices into our methodology and the latest findings of our research.
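
As a purely illustrative sketch of how such an index might aggregate criteria, the example below computes a weighted score across environmental, social, and economic indicators. The criteria, weights, and scores are hypothetical placeholders, not the study's validated indicators.

```python
criteria = {
    # criterion: (weight, score 0-1 from stakeholder/expert assessment) -- all hypothetical
    "thermal performance in Nordic climate": (0.25, 0.8),
    "use of local / traditional materials":  (0.20, 0.6),
    "community acceptance (Inuit input)":    (0.25, 0.9),
    "life-cycle cost":                       (0.15, 0.5),
    "ecological footprint":                  (0.15, 0.7),
}

total_weight = sum(w for w, _ in criteria.values())
index = sum(w * s for w, s in criteria.values()) / total_weight
print(f"Sustainability index for the candidate design: {index:.2f} (0 = poor, 1 = excellent)")
```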

Keywords: sustainability, biomimicry, vernacular architecture, community involvement

Procedia PDF Downloads 57
12857 An Exploratory Study on the Integration of Neurodiverse University Students into Mainstream Learning and Their Performance: The Case of the Jones Learning Center

Authors: George Kassar, Phillip A. Cartwright

Abstract:

Based on data collected from The Jones Learning Center (JLC), University of the Ozarks, Arkansas, U.S., this study explores the impact of inclusive classroom practices on neurodiverse college students and their consequent academic performance after participating in integrative therapies designed to support students who are intellectually capable of obtaining a college degree but who require support for learning challenges owing to disabilities, AD/HD, or ASD. The purpose of this study is two-fold. The first objective is to explore the general process, special techniques, and practices of the JLC inclusive program. The second objective is to identify and analyze the effectiveness of these processes, techniques, and practices in supporting the academic performance of enrolled college students with learning disabilities following integration into mainstream university learning. Integrity, transparency, and confidentiality were vital in this research. All questions were shared in advance and confirmed by the concerned management at the JLC. While administering the questionnaire and conducting the interviews, the purpose of the study, its scope, aims, and objectives were clearly explained to all participants before the questionnaire or interview began. Confidentiality of all participants was assured and guaranteed by using encrypted identification of individuals, limiting access to the data to the researcher alone, and storing the data in a secure location. Respondents were also informed that their participation in this research was voluntary and that they could withdraw at any time prior to submission if they wished. Ethical consent was obtained from the participants before proceeding with video recording of the interviews. This research uses a mixed methods approach. The research design involves collecting, analyzing, and “mixing” quantitative and qualitative methods and data to enable the research inquiry. The research process is organized around a five-pillar approach. The first three pillars focus on testing the first hypothesis (H1), directed toward determining the extent to which the academic performance of JLC students improved after involvement with the comprehensive JLC special program. The other two pillars relate to the second hypothesis (H2), which is directed toward determining the extent to which the collective and applied knowledge at the JLC is distinctive from typical practices in the field. The data collected for the research were obtained from three sources: 1) a set of secondary data in the form of Grade Point Average (GPA) records received from the registrar, 2) a set of primary data collected through a structured questionnaire administered to students and alumni at the JLC, and 3) another set of primary data collected through interviews conducted with staff and educators at the JLC. The significance of this study is two-fold. First, it validates the effectiveness of the special program at the JLC for college-level students who learn differently. Second, it identifies the distinctiveness of the mix of techniques, methods, and practices, including the special individualized and personalized one-on-one approach at the JLC.
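
A minimal sketch of the quantitative side of H1 (GPA before versus after involvement in the JLC program) is shown below, using a paired t-test on hypothetical GPA values; the actual analysis would use the registrar's records described above.

```python
from scipy import stats

gpa_before = [2.1, 2.4, 1.9, 2.6, 2.3, 2.0, 2.8, 2.2]   # hypothetical pre-program GPAs
gpa_after  = [2.9, 3.1, 2.6, 3.0, 2.8, 2.7, 3.4, 2.9]   # hypothetical post-program GPAs

t_stat, p_value = stats.ttest_rel(gpa_after, gpa_before)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A significant positive difference would support H1 (improved academic performance).
```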

Keywords: education, neuro-diverse students, program effectiveness, Jones learning center

Procedia PDF Downloads 77
12856 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors

Authors: Susana Aragoneses Garrido

Abstract:

Every day, the roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that Human Factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (involving external or internal aspects of the vehicle), which is considered a human factor, is a serious and emerging risk to road safety. Consequently, further analysis of this issue is essential due to its significance for today’s society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions (including a better vehicle design) that might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation of different paths and vehicles is presented. The causes related to HF are assessed by Failure Mode and Effects Analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the JACK Siemens PLM tool is used with the intention of evaluating the human factor causes and informing the redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations, and the evaluations confirm that accidents are an important issue in today’s society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global analysis of vehicle accident risk. Moreover, the evaluation of the different car models using the RULA method and the JACK Siemens PLM tool proves the importance of a good adjustment of the driver’s seat in order to avoid harmful postures and therefore distractions. For this reason, a car redesign is proposed so that the driver acquires the optimum position, consequently reducing the impact of human factors in road accidents.
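
The FMEA step mentioned above ranks each human-factor cause by a risk priority number, RPN = severity × occurrence × detection, with each factor rated on a 1-10 scale. The sketch below shows this scoring with hypothetical causes and ratings.

```python
failure_modes = [
    # (cause,                            S,  O,  D)  -- hypothetical ratings
    ("mobile phone use while driving",   9,  7,  6),
    ("poorly adjusted driver's seat",    6,  8,  5),
    ("glare from dashboard displays",    5,  6,  4),
    ("external advertising distraction", 4,  5,  7),
]

# Rank causes by RPN = severity * occurrence * detection
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for cause, s, o, d in ranked:
    print(f"RPN {s * o * d:4d}  {cause}")
```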

Keywords: vehicle analysis, assessment, ergonomics, car redesign

Procedia PDF Downloads 342