Search results for: software process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18623

16013 Random Vertical Seismic Vibrations of the Long Span Cantilever Beams

Authors: Sergo Esadze

Abstract:

Seismic resistance norms require the calculation of cantilevers on the vertical components of the base seismic acceleration. Long-span cantilevers must, as a rule, be calculated as separate construction elements. Depending on the architectural-planning solution, functional purpose, and environmental conditions of the building/structure being designed, long-span cantilever constructions may be of very different types, both by main bearing element (beam, truss, slab) and by material (reinforced concrete, steel). The choice among these is always linked to the bearing construction system of the building. Research into the vertical seismic vibration of these constructions requires an individual approach for each type (which is not specified in the norms), in correlation with the model of the seismic load. The latter may be given either as a deterministic load or as a random process; a random-process loading model is more adequate to this problem. In the presented paper, two types of long-span (from 6 m up to 12 m) reinforced concrete cantilever beams have been considered: a) cantilevers whose bearing elements, i.e., the elements in which they are fixed, have cross-sections of large size, with the cantilevers made with a haunch; b) cantilever beams with a load-bearing rod element. Calculation models are suggested separately for types a) and b). They are presented as systems with a finite number of degrees of freedom (concentrated masses), with end-fixing conditions corresponding to each type. Vertical acceleration and the vertical component of the angular acceleration act on the masses. The model is based on the assumption of translational-rotational motion of the building in the vertical plane, caused by vertical seismic acceleration. Seismic accelerations are treated as random processes, represented as the product of a deterministic envelope function and a stationary random process. The problem is solved within the framework of the correlation theory of random processes. Solved numerical examples are given, and the method is shown to be effective for this class of problems.
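The loading model described above, a deterministic envelope function multiplying a stationary random process, can be sketched numerically. The envelope shape, the correlation model of the stationary part, and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def synthetic_vertical_acceleration(t, sigma=1.0, corr_time=0.1, seed=0):
    """Sample one realization of a nonstationary seismic acceleration
    modelled as envelope(t) * stationary process. The envelope shape and
    the first-order autoregressive correlation model are assumptions."""
    rng = np.random.default_rng(seed)
    dt = t[1] - t[0]
    # Stationary part: discretized Ornstein-Uhlenbeck-like Gaussian noise
    rho = np.exp(-dt / corr_time)
    x = np.zeros_like(t)
    for i in range(1, len(t)):
        x[i] = rho * x[i - 1] + sigma * np.sqrt(1 - rho**2) * rng.standard_normal()
    # Deterministic envelope: smooth rise then exponential decay (assumed form)
    envelope = np.clip((t / 2.0) ** 2 * np.exp(1.0 - t / 2.0), 0.0, 1.0)
    return envelope * x

t = np.linspace(0.0, 20.0, 2001)
a = synthetic_vertical_acceleration(t)
```

Ensemble statistics of many such realizations would feed the correlation-theory analysis the abstract mentions.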

Keywords: cantilever, random process, seismic load, vertical acceleration

Procedia PDF Downloads 177
16012 Optimization of Monascus Orange Pigments Production Using pH-Controlled Fed-Batch Fermentation

Authors: Young Min Kim, Deokyeong Choe, Chul Soo Shin

Abstract:

Monascus pigments, commonly used as natural colorants in Asia, have many biological activities, such as cholesterol control, anti-obesity, anti-cancer, and anti-oxidant effects, that have recently been elucidated. Amino acid derivatives of Monascus pigments in particular are receiving much attention because they have higher biological activities than the original pigments. Previously, there have been two ways to produce amino acid derivatives: one-step production and two-step production. However, one-step production yields low purity, and two-step production, precursor (orange pigment) fermentation followed by derivative synthesis, suffers from low productivity and growth rate during its precursor fermentation step. In this study, it was verified that pH is a key factor affecting both the stability of the orange pigments and the growth rate of Monascus. With an optimal pH profile obtained by pH-stat fermentation, we designed a precursor (orange pigment) fermentation process based on pH-controlled fed-batch fermentation. The final concentration of orange pigments in this process increased to 5.5 g/L, about 30% higher than the concentration produced by the previously used precursor fermentation step.
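As a rough illustration of the pH-controlled fed-batch idea (not the authors' actual control law), a minimal pH-stat feeding rule might look like the following. The setpoint, deadband, and feed rate are invented for the sketch, and the premise that substrate depletion raises pH holds only for some fermentations:

```python
def ph_controlled_feed(ph_reading, ph_setpoint=5.5, deadband=0.1, feed_rate=0.2):
    """Bang-bang feeding rule for a pH-stat fed-batch culture: feed
    substrate only when pH drifts above the setpoint plus a deadband.
    All numeric values are illustrative assumptions."""
    if ph_reading > ph_setpoint + deadband:
        return feed_rate  # feed pump rate, e.g. L/h
    return 0.0            # pH within band: no feed
```

In a real process this rule would run in a loop against the fermenter's pH probe, logging the resulting feed profile.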

Keywords: cultivation process, fed-batch fermentation, monascus pigments, pH stability

Procedia PDF Downloads 290
16011 Energy-Led Sustainability Assessment Approach for Energy-Efficient Manufacturing

Authors: Aldona Kluczek

Abstract:

In recent years, manufacturing processes have had to engage with sustainability issues in cost-effective ways that minimize energy use, decrease negative impacts on the environment, and are safe for society. To date, however, attention has been on separate sustainability assessment methods considering energy and material flow, energy consumption, and emission release or process control. In this paper, an energy-led sustainability assessment approach combining three methods, energy Life Cycle Assessment to assess environmental impact, Life Cycle Cost to analyze costs, and Social Life Cycle Assessment via an 'energy LCA-based value stream map', is used to assess the energy sustainability of the hardwood lumber manufacturing process in terms of its technologies. The approach, which integrates environmental, economic, and social issues, can be visualized for the considered energy-efficient technologies on a map of energy LCA-related (input and output) inventory data. It enables the most efficient technology for a given process to be identified through effective analysis of energy flow. It is also indicated that interventions in the considered technology should focus on environmental and economic improvements to achieve energy sustainability. The results indicate that the most intense energy losses are caused by the cogeneration technology; the environmental impact analysis shows that a substantial reduction of 34% can be achieved by improving it. From the LCC point of view, the result appears cost-effective when the improvement is applied at the plant in question. On the social dimension, every component of plant labor energy use in the life cycle of lumber production has positive energy benefits. The energy required to install the energy-efficient technology amounts to 30.32 kJ, the highest value among the components of plant labor energy in terms of energy-related social indicators.
The paper uses the example of hardwood lumber production to demonstrate the applicability of the sustainability assessment method.

Keywords: energy efficiency, energy life cycle assessment, life cycle cost, social life cycle analysis, manufacturing process, sustainability assessment

Procedia PDF Downloads 237
16010 Re-Engineering Management Process in Iran's Smart Schools

Authors: M. R. Babaei, S. M. Hosseini, S. Rahmani, L. Moradi

Abstract:

Today, the quality of education and training systems and the effectiveness of education systems are of major concern to the stakeholders and decision-makers of development in every country. In Iran this concern is doubled, for numerous reasons: over the past decade, governments have struggled even to pay the running costs of education. ICT is claimed to have the power to change the structure of training programs, reduce costs, increase quality, make education systems and their products consistent with the needs of the community, and move education toward practice. One of the areas that the introduction of information technology has fundamentally changed is education. The aim of this research is the re-engineering of the management process in smart schools; field studies were used to collect data in the form of interviews and a questionnaire survey. The statistical population of this research comprised smart schools under Iran's education system, and sampling was purposive. The data collection tool was a questionnaire of 36 questions, each designating one factor affecting the management of smart schools. Each question consists of two parts: the first designates the position of the factor in the management process, i.e., the management function it belongs to (planning, organizing, leading, controlling) according to DuBrin's classification, and the second examines the factors affecting the management of smart schools, rated on a Likert scale. The validity of the questionnaire was approved by a group of experts and prominent university professors in the fields of information technology, management, and re-engineering, and its reliability was evaluated and approved using Cronbach's alpha.
To analyse the data, descriptive and inferential statistics were used: descriptive statistics (frequency tables, mean, median, mode) for rating the contributing factors on the Likert scale, and analysis of variance together with the nonparametric Friedman test to evaluate the hypotheses. The research concludes that the factors influencing the re-engineering of the management process in smart schools affect school performance.
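The reliability step mentioned above can be sketched in Python. The Likert responses below are hypothetical, and `cronbach_alpha` is the generic textbook formula rather than the authors' exact procedure:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) array of
    Likert-scale scores: k/(k-1) * (1 - sum(item variances)/variance(totals))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 1-5 Likert responses from 6 respondents on 4 questionnaire items
data = np.array([
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 3, 3, 4],
])
alpha = cronbach_alpha(data)  # values above ~0.7 are conventionally acceptable
```

The subsequent Friedman test would then rank the same items within each respondent to compare the factors' importance.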

Keywords: re-engineering, management process, smart schools, Iran's schools

Procedia PDF Downloads 233
16009 Digital System Design for Strategic Improvement Planning in Education: A Socio-Technical and Iterative Design Approach

Authors: Neeley Current, Fatih Demir, Kenneth Haggerty, Blake Naughton, Isa Jahnke

Abstract:

Educational systems seek reform using data-intensive continuous improvement processes known as strategic improvement plans (SIPs), and schools turn to digital systems to monitor, analyze, and report on them. One technical challenge for these digital systems is integrating a highly diverse set of data sources; another is creating a learnable sociotechnical system that helps administrators, principals, and teachers add, manipulate, and interpret data. This study explores to what extent one particular system is usable and useful for strategic planning activities, and whether intended users see the system as achieving its goal of improving workflow related to strategic planning in schools. In a three-phase study, researchers used sociotechnical design methods to understand the current workflow, technology use, and processes of teachers and principals surrounding strategic improvement planning. Additionally, design review and task analysis usability methods were used to evaluate task completion, usability, and user satisfaction with the system. The resulting sociotechnical models illustrate the existing work processes and indicate how, and at which places in the workflow, the newly developed system could have an impact. The results point to the potential of the system but also indicate that it was initially too complicated to use. Nevertheless, the diverse users see its potential benefits, especially in overcoming the diverse set of data sources, and believe the system could fill a gap for schools in planning and conducting strategic improvement plans.

Keywords: continuous improvement process, education reform, strategic improvement planning, sociotechnical design, software development, usability

Procedia PDF Downloads 290
16008 Road Maintenance Management Decision System Using Multi-Criteria and Geographical Information System for Takoradi Roads, Ghana

Authors: Eric Mensah, Carlos Mensah

Abstract:

The road maintenance backlogs created by deferred maintenance, especially in developing countries, have caused considerable deterioration of many road assets. This is usually due to difficulties in selecting and prioritising maintainable roads on objective criteria rather than on political or other less relevant criteria. To ensure judicious use of limited resources for road maintenance, five factors were identified as the most important criteria for road management within the study area, based on the judgements of 40 experts. The results were then used to develop weightings with a Multi-Criteria Decision Process (MCDP) to analyse and select road alternatives according to the maintenance goal. Using Geographical Information Systems (GIS), maintainable roads were grouped using Jenks natural breaks to allow further prioritisation in order of importance for display on a dashboard of maps, charts, and tables. This reduces the problem of subjective road selection, thereby reducing wastage of resources and easing the maintenance process through an organised, objective spatial decision support system.
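The weighted-sum scoring step of such an MCDP can be sketched as follows. The criterion scores and weights below are invented for illustration (the paper derives its weights from 40 expert judgements):

```python
import numpy as np

# Hypothetical normalized criterion scores in [0, 1]
# (rows: candidate roads; columns: the five maintenance criteria)
scores = np.array([
    [0.8, 0.3, 0.9, 0.5, 0.7],
    [0.4, 0.9, 0.2, 0.8, 0.6],
    [0.6, 0.5, 0.7, 0.4, 0.9],
])
# Hypothetical expert-derived criterion weights; they must sum to 1
weights = np.array([0.35, 0.25, 0.20, 0.10, 0.10])

priority = scores @ weights           # weighted-sum score per road
ranking = np.argsort(priority)[::-1]  # indices of roads, highest priority first
```

The resulting priority scores would then be classed with Jenks natural breaks in the GIS for dashboard display.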

Keywords: decision support, geographical information systems, multi-criteria decision process, weighted sum

Procedia PDF Downloads 361
16007 Dynamic Process Model for Designing Smart Spaces Based on Context-Awareness and Computational Methods Principles

Authors: Heba M. Jahin, Ali F. Bakr, Zeyad T. Elsayad

Abstract:

Smart spaces can be defined as working environments which integrate embedded computers, information appliances, and multi-modal sensors to remain focused on the interaction between the users, their activity, and their behavior in the space. Hence, a smart space must be aware of its context and automatically adapt to changes in it, by interacting with its physical environment through natural and multimodal interfaces and by serving information proactively. This paper suggests a dynamic framework for the architectural design process of such spaces, based on the principles of computational methods and context-awareness, to help create a field of changes and modifications and to generate possibilities and concerns about the physical, structural, and user contexts. The framework comprises five main processes: gathering and analyzing data to generate smart design scenarios, parameters, and attributes; transforming these by coding into four types of models; connecting those models together in an interaction model which represents the context-awareness system; transforming that model into a virtual, ambient environment which represents the physical, real environment and acts as a linkage between the users and the activities taking place in the smart space; and, finally, gathering feedback from the users of that environment to ensure that the design of the smart space fulfills their needs. The generated design process will thus help in designing smart spaces that can be adapted and controlled to answer users' defined goals, needs, and activities.

Keywords: computational methods, context-awareness, design process, smart spaces

Procedia PDF Downloads 309
16006 Research of Actuators of Common Rail Injection Systems with the Use of LabVIEW on a Specially Designed Test Bench

Authors: G. Baranski, A. Majczak, M. Wendeker

Abstract:

Currently, the most commonly used solution for providing fuel to diesel engines is the common rail system. Compared to previous designs, and owing to relatively simple construction and electronic control, these systems achieve favourable engine operating parameters, with particular emphasis on low emission of toxic compounds into the atmosphere. In this system, the injected fuel dose depends strictly on the parameters of the electrical impulse sent to the injector by the power amplifier of the supply system, driven by the engine controller. The article presents the construction of a laboratory test bench for examining the course of the injection process and the fuel delivery in accumulator (common rail) injection systems. The test bench enables testing of injection systems with electromagnetically controlled injectors using scientific engineering tools. The developed system is based on LabVIEW software and a CompactRIO-family controller using FPGAs and a real-time microcontroller. The results of experimental research on electromagnetic common rail injectors, controlled by a dedicated National Instruments card, confirm the effectiveness of the presented approach. The results described in the article show the influence of the basic parameters of the electric impulse opening the electromagnetic injector on the injected fuel dose. Acknowledgement: This work has been realized in cooperation with the Construction Office of WSK 'PZL-KALISZ' S.A. and is part of Grant Agreement No. POIR.01.02.00-00-0002/15 financed by the Polish National Centre for Research and Development.

Keywords: fuel injector, combustion engine, fuel pressure, compression ignition engine, power supply system, controller, LabVIEW

Procedia PDF Downloads 117
16005 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, performed on a daily, weekly, monthly, or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom, and the QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software which enables fast and simple evaluation of CT QA parameters using the phantom provided with it; the recommendations, on the other hand, contain additional tests, which were done with the CIRS phantom. Legislation on ionizing radiation protection also requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, the complete CT-to-ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT-to-ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of no more than 5%;
spatial and contrast resolution tests must comply with the results obtained at commissioning, otherwise the machine requires service; the image noise test result must fall within 20% of the baseline value; slice thickness must meet manufacturer specifications; and patient table stability under longitudinal transfer of a loaded table must not exceed 2 mm of vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of CT simulator functionality and its clinical effectiveness in radiation treatment planning. The legal requirement on the clinic is to set up its own QA programme with minimum testing; it remains the user's decision whether additional testing recommended by international organizations will be implemented to improve the overall quality of the radiation treatment planning procedure, since the quality of the CT images used for planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
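The quoted tolerances lend themselves to a simple automated check. The dictionary keys and the sample values below are illustrative assumptions, not the clinic's actual data format:

```python
def ct_simulator_qc(measured, baseline):
    """Check measured CT-simulator QA values against the tolerances quoted
    in the abstract; return a list of failure descriptions (empty = pass)."""
    failures = []
    if abs(measured["ct_number_water_HU"] - baseline["ct_number_water_HU"]) > 5:
        failures.append("CT number accuracy outside +/-5 HU")
    if any(abs(d) > 10 for d in measured["uniformity_roi_diff_HU"]):
        failures.append("field uniformity outside +/-10 HU")
    if abs(measured["noise"] - baseline["noise"]) > 0.20 * baseline["noise"]:
        failures.append("image noise differs >20% from baseline")
    if measured["table_vertical_deviation_mm"] > 2.0:
        failures.append("table vertical deviation > 2 mm")
    return failures

# Hypothetical commissioning baseline and one day's measurements
baseline = {"ct_number_water_HU": 0.0, "noise": 5.0}
measured = {"ct_number_water_HU": 3.2,
            "uniformity_roi_diff_HU": [2.0, -4.5, 1.1],
            "noise": 5.6,
            "table_vertical_deviation_mm": 1.4}
issues = ct_simulator_qc(measured, baseline)  # empty list: all checks pass
```

A daily QC log would simply append the returned list per session and flag any non-empty result for physicist review.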

Keywords: CT simulator, radiotherapy, quality control, QA programme

Procedia PDF Downloads 517
16004 The Implementation of the European Landscape Convention in Turkey: Opportunities and Constraints

Authors: Tutku Ak, Abdullah Kelkit, Cihad Öztürk

Abstract:

An increase has been witnessed in the number of multinational environmental agreements in the past decade, particularly in Europe. Success with implementation, however, varies. While many countries are willing to join these agreements, they do not always fully honor their obligations to put their commitments into practice. One reason is that countries have different legal and administrative systems. One example of an international multilateral environmental agreement is the European Landscape Convention (ELC). The ELC expresses a concern to achieve sustainable development based on a balanced and harmonious relationship between social needs, economic activity, and the environment. Member states are required to implement the convention in accordance with their own administrative structures, respecting subsidiarity. In particular, the convention stresses the importance of cooperation in the protection, management, and planning of landscape resources. This paper gives a broad view of the ELC's implementation process in Turkey and of the factors that have influenced it. In this context, the paper focuses on the objectives of the convention in addressing the loss of European landscapes, and on the justification and tools used to accomplish these objectives. The degree to which these objectives have been implemented in Turkey, and the opportunities and constraints faced during this process, are discussed.

Keywords: European landscape convention, implementation, multinational environmental agreements, policy tools

Procedia PDF Downloads 291
16003 Effect of Equal Channel Angular Pressing Process on Impact Property of Pure Copper

Authors: Fahad Al-Mufadi, F. Djavanroodi

Abstract:

Ultrafine-grained (UFG) and nanostructured (NS) materials have experienced rapid development during the last decade and made a profound impact on every field of materials science and engineering. The present work was undertaken to develop ultrafine-grained pure copper by a severe plastic deformation method and to examine its impact property with different characterization tools. For this aim, an equal channel angular pressing (ECAP) die with a channel angle of 90°, an outer corner angle of 17°, and a channel diameter of 20 mm was designed and manufactured. Commercially pure copper billets were ECAPed for up to four passes by route BC at ambient temperature. The results indicate a great improvement in hardness, yield strength, and ultimate tensile strength after the ECAP process: the hardness reaches 136 HV after the final pass, up from 52 HV, and enhancements of about 285% in YS and 125% in UTS were obtained after the fourth pass compared to the as-received condition. On the other hand, the elongation to failure and the impact energy were reduced by the ECAP process and with increasing pass number: a reduction of about 56% in impact energy was observed for the processed samples compared to the annealed specimens.

Keywords: SPD, ECAP, pure Cu, impact property

Procedia PDF Downloads 249
16002 Use of Computers and Peripherals in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar Mehrafarin, Reza Mehrafarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article explores the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and tools, such as GPS, drones, magnetometers, equipped cameras, and satellite imagery, together with software programs such as GIS, MapSource, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Modern technologies help preserve ancient sites, record archaeological data accurately, reduce errors and mistakes, and facilitate correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaboration and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. Computers were used across diverse areas, including database creation, statistical analysis, graphics, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology.
Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then input into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present findings effectively. Question Addressed: The primary question addressed is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. Accessible databases, statistics generation, graphic design, and interdisciplinary collaboration are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as a still-unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province, and it contributes to the understanding and preservation of Iran's archaeological heritage.

Keywords: archaeological surveys, computer use, iran, modern technologies, sistan

Procedia PDF Downloads 65
16001 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets

Authors: Ece Cigdem Mutlu, Burak Alakent

Abstract:

Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring for monitoring product quality and controlling the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of the process dispersion and location parameters, respectively, under the assumption of independent and normally distributed data. There is, however, no guarantee that real-world data are normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; for example, occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, estimators robust to the contamination that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function for the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and M-estimators of location with Huber and logistic psi-functions for the process location parameter.
The Phase I efficiency of the proposed estimators and the Phase II performance of the Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. We find that the robust estimators yield parameter estimates with higher efficiency against all types of contamination, and that Xbar charts constructed using robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups, and employing different combinations of dispersion and location estimators on subgroups and individual observations, are found to improve the performance of Xbar charts.
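A minimal sketch of a robust Xbar chart in this spirit, using the median and the median absolute deviation (MAD) as simple stand-ins for the location and scale estimators the study actually evaluates (Qn, Harrell-Davis, Hodges-Lehmann, M-estimators):

```python
import numpy as np

def robust_xbar_limits(phase1, n=5, c=3.0):
    """Phase I control limits for an Xbar chart from robust statistics:
    center line = median of subgroup medians, sigma estimated via MAD.
    phase1 has shape (m subgroups, n observations per subgroup)."""
    phase1 = np.asarray(phase1, dtype=float)
    center = np.median(np.median(phase1, axis=1))          # robust location
    mad = np.median(np.abs(phase1 - np.median(phase1)))    # robust spread
    sigma_hat = 1.4826 * mad                               # consistent for normal data
    half_width = c * sigma_hat / np.sqrt(n)
    return center - half_width, center, center + half_width

# Simulated Phase I data: 25 subgroups of 5 from N(10, 1), with one outlier
rng = np.random.default_rng(1)
phase1 = rng.normal(10.0, 1.0, size=(25, 5))
phase1[3, 0] = 30.0   # a single gross outlier barely shifts the robust limits
lcl, cl, ucl = robust_xbar_limits(phase1)
```

With the conventional mean and pooled standard deviation, that single outlier would widen the limits noticeably; the median/MAD version is nearly unaffected, which is the effect the abstract quantifies for its estimators.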

Keywords: average run length, M-estimators, quality control, robust estimators

Procedia PDF Downloads 178
16000 Nanda Ways of Knowing, Being and Doing: Our Process of Research Engagement and Research Impacts

Authors: Steven Kelly

Abstract:

A fundamental role of the researcher is research engagement, that is, the interaction between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods, or resources, while research impact is the contribution that research makes to the economy, society, environment, or culture beyond the contribution to academic research. Ironically, traditional impact metrics in the academy are designed to focus on outputs; they dismiss the important role engagement plays in fostering a collaborative process that leads to meaningful, ethical, and useful impacts. Dr. Kelly, a Nanda (First Nations) man himself, has worked closely with the Nanda community over the past decade, ensuring cultural protocols are upheld and implemented while doing research engagement. The focus was on the process, which was essential to foster a positive research impact culture. The contributions that flowed from this process were the naming of a new species of squat lobster in the Nanda language, a poster designed in collaboration with The University of Melbourne, Museums Victoria and the Bundiyarra - Irra Wangga language centre, media coverage, and the formation of the 'Nanda language, Nanda country project'. The Nanda language, Nanda country project is a language revitalization project focused on reconnecting Nanda people with the language and culture on Nanda Country. Such outcomes are imperative on the eve of the United Nations International Decade of Indigenous Languages. In this paper, Dr. Kelly will discuss how Nanda cultural practices informed research engagement to foster a collaborative process that, in turn, led to meaningful, ethical, and useful impacts within and outside of the academy.

Keywords: community collaboration, indigenous, nanda, research engagement, research impacts

Procedia PDF Downloads 104
15999 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach

Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti

Abstract:

Transliteration of Javanese manuscripts is one way to preserve and pass on the wealth of past literature to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time; an automatic transliteration process is expected to shorten this time and thereby support the philologists' work. A preprocessing and segmentation stage is first applied to manage the document images, yielding image script units, compiled from input document images, that are free from noise and similar in thickness, size, and slope. The next stage, feature extraction, finds unique characteristics that distinguish each Javanese script image; one of the features used in this research is the number of black pixels in each image unit. Each Javanese script image in the training data undergoes the same process as the input characters. System testing was performed with data from the book Hamong Tani, selected for its content, age, and number of pages, which were considered sufficient as model experimental input. Based on the results of automatic transliteration tests on random pages, the maximum percentage of correctness obtained was 81.53%, achieved with a 32x32 pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the proposed automatic transliteration model is relatively good.
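The black-pixel-count feature can be sketched directly. The glyph below is a toy binary image, not a real Javanese script sample, and the windowing scheme is an assumption consistent with the 5x5 image window mentioned in the abstract:

```python
import numpy as np

def black_pixel_features(image, window=5):
    """Count black pixels in each window x window cell of a binarized
    glyph image, producing a fixed-length feature vector.
    image: 2-D array of 0 (white) / 1 (black); sides must divide by window."""
    h, w = image.shape
    cells = image.reshape(h // window, window, w // window, window)
    return cells.sum(axis=(1, 3)).ravel()  # row-major cell order

# A hypothetical 10x10 binarized glyph: a black main diagonal
glyph = np.eye(10, dtype=int)
features = black_pixel_features(glyph, window=5)  # four 5x5 cells
```

Feature vectors like this, computed for both training glyphs and segmented input units, can then be compared statistically to pick the most likely Javanese character for each unit.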

Keywords: Javanese script, character recognition, statistical, automatic transliteration

Procedia PDF Downloads 330
15998 Ensuring Safe Operation by Providing an End-To-End Field Monitoring and Incident Management Approach for Autonomous Vehicles Based on ML/DL SW Stack

Authors: Lucas Bublitz, Michael Herdrich

Abstract:

By achieving the first commercialization approval in San Francisco, the Autonomous Driving (AD) industry has proven the technological maturity of SAE L4 AD systems and the corresponding software and hardware stack. This milestone reflects the upcoming phase in the industry, where the focus is now on scaling and supervising larger autonomous vehicle (AV) fleets in different operation areas. This requires an operation framework that organizes and assigns responsibilities to the relevant AV technology and operation stakeholders: the AV system provider, the remote intervention operator, the MaaS provider, and the regulatory and approval authority. This holistic operation framework consists of technological, processual, and organizational activities to ensure safe operation of fully automated vehicles. In the supervision of large autonomous vehicle fleets, a major focus is continuous field monitoring. The field monitoring approach must reflect the safety and security criticality of incidents in the field during driving operation. This includes an automatic containment approach, with the overall goal of avoiding safety-critical incidents and reducing downtime caused by malfunctions of the AD software stack. An end-to-end (E2E) field monitoring approach detects critical faults in the field, uses a knowledge-based approach to evaluate their safety criticality, and supports the automatic containment of these E/E faults. Applying such an approach ensures the scalability of AV fleets, which is determined by the handling of incidents in the field and by continuous regulatory compliance of the technology after enhancing the Operational Design Domain (ODD) or the function scope through Functions on Demand (FoD) over the entire digital product lifecycle.
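The knowledge-based evaluation step described above might be sketched as a lookup from fault class to safety criticality and containment action. The fault codes, criticality levels, and actions below are hypothetical illustrations, not part of the authors' stack.

```python
# Hypothetical knowledge base: fault class -> (criticality, containment action).
KNOWLEDGE_BASE = {
    "perception_dropout":   ("safety-critical", "minimal-risk manoeuvre"),
    "planner_watchdog":     ("safety-critical", "minimal-risk manoeuvre"),
    "logging_disk_full":    ("non-critical",    "remote diagnostics"),
    "map_version_mismatch": ("degraded",        "restrict ODD"),
}

def evaluate_incident(fault_class):
    """Return (criticality, action) for a reported fault class;
    unknown faults are escalated conservatively for manual review."""
    return KNOWLEDGE_BASE.get(fault_class, ("unknown", "escalate to remote operator"))

print(evaluate_incident("perception_dropout"))
print(evaluate_incident("gps_glitch"))
```

The conservative default for unknown faults reflects the safety-first goal of the containment approach: anything the knowledge base cannot classify is handed to a remote operator rather than handled automatically.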

Keywords: field monitoring, incident management, multicompliance management for AI in AD, root cause analysis, database approach

Procedia PDF Downloads 56
15997 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge facing bioinformaticians due to the complication of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection, which finds the most important genes affecting a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
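As a minimal illustration of the idea (not the authors' algorithm), missing expression values can be replaced by each gene's mean over its observed samples before ranking genes. Here genes are ranked by a simple absolute difference of class means, which stands in for any feature-selection score; the data are invented for the example.

```python
def impute_and_rank(expr, labels):
    """expr: genes x samples matrix with None marking missing values.
    Mean-impute each gene over its observed samples, then rank genes
    by the absolute difference of class means (labels are 0/1)."""
    scores = []
    for g, row in enumerate(expr):
        observed = [v for v in row if v is not None]
        mean = sum(observed) / len(observed)
        filled = [mean if v is None else v for v in row]
        c0 = [v for v, y in zip(filled, labels) if y == 0]
        c1 = [v for v, y in zip(filled, labels) if y == 1]
        scores.append((abs(sum(c1) / len(c1) - sum(c0) / len(c0)), g))
    return [g for _, g in sorted(scores, reverse=True)]

expr = [
    [1.0, None, 1.2, 5.0, 5.2, None],  # gene 0: differs between classes
    [2.0, 2.1, None, 2.0, 2.2, 2.1],   # gene 1: roughly constant
]
labels = [0, 0, 0, 1, 1, 1]
print(impute_and_rank(expr, labels))  # [0, 1] -- gene 0 ranked first
```

In practice the imputation and the scoring function would be chosen to suit the data set, but the sketch shows why imputation must precede scoring: the class-mean score cannot be computed over rows containing missing values.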

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 557
15996 Towards a Comprehensive Framework on Civic Competence Development of Teachers: A Systematic Review of Literature

Authors: Emilie Vandevelde, Ellen Claes

Abstract:

This study aims to develop a comprehensive model for the civic socialization process of teachers. Citizenship has become one of the main objectives of European education systems. Teachers are expected to be well prepared and equipped with the necessary knowledge, skills, and attitudes to engage students in democratic citizenship. While a lot is known about young people’s civic competence development and how schools and teachers do (or do not) support this process, less is known about how teachers themselves engage with (the teaching of) civics. Unlike the civic socialization process of young adolescents, which focuses on personal competence development, the civic socialization process of teachers includes the development of professional civic competences. These professional competences ensure that teachers are able to prepare pupils to carry out their civic responsibilities in thoughtful ways. Existing models of the civic socialization process of young adolescents do not take this dual purpose into account. Based on these observations, this paper investigates (1) what personal and professional civic competences teachers need to effectively teach civic education and (2) how teachers acquire these personal and professional civic competences. To answer the first research question, a systematic review of the literature on existing civic education frameworks was carried out and linked to the literature on teacher training. The second research question was addressed by adapting the Octagon model, developed by the International Association for the Evaluation of Educational Achievement (IEA), to the context of teachers.
This was done by carrying out a systematic review of the recent literature linking three theoretical topics involved in teachers’ civic competence development: theories about the civic socialization process of young adolescents, Shulman’s (1987) theoretical assumptions on pedagogical content knowledge (PCK), and Nogueira and Moreira’s (2012) framework for civic education teachers’ knowledge, together with the literature on teachers’ professional development. This resulted in a comprehensive conceptual framework describing the personal and professional civic competences of civic education teachers. In addition, this framework is linked to the OctagonT model: a model that describes the processes through which teachers acquire these personal and professional civic competences. This model recognizes that teachers’ civic socialization is influenced by interconnected variables located at different levels of a multi-level structure: the individual teacher (e.g., civic beliefs); everyday contacts (e.g., teacher educators; the intended, informal, and hidden curriculum of the teacher training program; internship contacts; participation opportunities in teacher training); and the national educational context (e.g., the vision on civic education). Furthermore, implications for teacher education programs are described.

Keywords: civic education, civic competences, civic socialization, octagon model, teacher training

Procedia PDF Downloads 257
15995 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit

Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira

Abstract:

Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, even when good homogenization work has been carried out before feeding the processing plants, an operation with high variability in quality and performance is to be expected. These results could be improved and standardized if the blend composition parameters that most influence the processing route were determined, the types of raw materials grouped by them, and a reference with operational settings presented for each group. Associating the physical and chemical parameters of a unit operation through benchmarking, or even an optimal reference of metallurgical recovery and product quality, results in reduced production costs, optimization of the mineral resource, and greater stability in the subsequent processes of the production chain that use the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with Machine Learning algorithms for grouping the raw material (ore) and associating these groups with reference variables in the process benchmark, is a reasonable alternative for the standardization and improvement of mineral processing units. Clustering methods based on Decision Trees and K-Means were employed, together with algorithms based on benchmarking theory, with criteria defined by the process team in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to present the outputs of the algorithm. The results were measured through the average time of adjustment and stabilization of the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured.
The results were promising, with a reduction in the adjustment and stabilization time when starting to process a new ore pile, as well as in the time to reach the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect significant savings in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and a longer life for the mineral deposit.
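The clustering step above can be sketched with a plain K-Means implementation. The study uses K-Means (and Decision Trees) on plant data; the two blend parameters and the values below are hypothetical stand-ins for illustration.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-Means on 2-D points, e.g. two blend composition
    parameters per homogenized ore pile."""
    random.seed(seed)
    centroids = random.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # Assign each pile to its nearest centroid (squared distance).
        for p in points:
            nearest = min(range(k), key=lambda i: (p[0] - centroids[i][0]) ** 2
                          + (p[1] - centroids[i][1]) ** 2)
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster empties.
        centroids = [(sum(p[0] for p in c) / len(c),
                      sum(p[1] for p in c) / len(c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical ore piles described by (grade %, impurity %),
# forming two loosely separated groups.
piles = [(2.9, 0.9), (3.1, 1.1), (3.0, 1.0),
         (1.1, 2.9), (0.9, 3.1), (1.0, 3.0)]
centroids, clusters = kmeans(piles, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

Each resulting cluster would then be paired with its benchmark operational settings, so that when a new pile arrives, it is assigned to a cluster and the plant starts from that cluster's reference adjustments.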

Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing

Procedia PDF Downloads 135
15994 Preparation of Non-Woven Nanofiber Structures for Uniform and Rapid Drug Releasing Applications Using an Electrospinning Process

Authors: Cho-Liang Chung

Abstract:

Uniform and rapid drug release is important for trauma dressing applications. A low glass transition temperature polymer system and non-woven nanofiber structures were chosen as designs that promote rapid-release characteristics. In this study, polyvinylpyrrolidone, polysulfone, and polystyrene were dissolved in dimethylformamide to form precursor solutions, which were blended with vitamin C to form the electrospinning solutions. The non-woven nanofiber structures were successfully prepared using an electrospinning process. The following instruments were used to characterize them: atomic force microscopy (AFM), field emission scanning electron microscopy (FE-SEM), and X-ray diffraction (XRD). AFM was used to scan the nanofibers, with 3D graphics applied to explore their surface morphology; FE-SEM was used to examine the morphology of the non-woven structures; and XRD was used to identify crystal structures within them. The morphology of the non-woven structures changed dramatically over time because of moisture absorption and the decreasing glass transition temperature; the non-woven nanofiber structures can therefore be applied to uniform and rapid drug release in trauma dressing applications.

Keywords: nanofibers, non-woven, electrospinning process, rapid drug releasing

Procedia PDF Downloads 129
15993 Synthesis and Characterization of PVDF, FG, PTFE, and PES Membrane Distillation Modified with Silver Nanoparticles

Authors: Lopez J., Mehrvar M., Quinones E., Suarez A., Romero C.

Abstract:

Silver nanoparticles (AgNPs) are used to deliver heat to the surface of membrane distillation membranes in order to counteract thermal polarization and improve the desalination process. In this study, AgNPs were deposited by a dip-coating process on commercial PVDF, hydrophilic FG, and hydrophobic PTFE membranes as substrates. The membranes were characterized by SEM, EDS, contact angle, and pore size distribution, and the heat delivery performance was measured using a UV lamp and a thermal camera. The presence of 50–150 nm AgNPs and an increase in energy absorption over the membrane were verified.

Keywords: silver nanoparticles, membrane distillation, plasmon effect, heat delivery

Procedia PDF Downloads 110
15992 Electrochemical Regeneration of GIC Adsorbent in a Continuous Electrochemical Reactor

Authors: S. N. Hussain, H. M. A. Asghar, H. Sattar, E. P. L. Roberts

Abstract:

Arvia™ introduced a novel technology consisting of adsorption followed by electrochemical regeneration with a graphite intercalation compound adsorbent, taking place in a single unit. The adsorbed species may lead to the formation of intermediate by-products due to incomplete mineralization during electrochemical regeneration. Therefore, the investigation of breakdown products due to incomplete oxidation is of great concern for the commercial application of this process. In the present paper, the formation of chlorinated breakdown products during a continuous process of adsorption and electrochemical regeneration based on a graphite intercalation compound adsorbent has been investigated.

Keywords: GIC, adsorption, electrochemical regeneration, chlorophenols

Procedia PDF Downloads 296
15991 Enhanced Solar-Driven Evaporation Process via f-MWCNTs/PVDF Photothermal Membrane for Forward Osmosis Draw Solution Recovery

Authors: Ayat N. El-Shazly, Dina Magdy Abdo, Hamdy Maamoun Abdel-Ghafar, Xiangju Song, Heqing Jiang

Abstract:

Product water recovery and draw solution (DS) reuse are the most energy-intensive stages in forward osmosis (FO) technology. Sucrose solution is the most suitable DS for FO applications in food and beverages. However, sucrose DS recovery by conventional pressure-driven or thermally driven concentration techniques consumes considerable energy. Herein, we developed a spontaneous and sustainable solar-driven evaporation process based on a photothermal membrane for the concentration and recovery of sucrose solution. The photothermal membrane is composed of a functionalized multi-walled carbon nanotube (f-MWCNTs) photothermal layer on a hydrophilic polyvinylidene fluoride (PVDF) substrate. The f-MWCNTs photothermal layer, with a rough surface and interconnected network structures, not only improves the light-harvesting and light-to-heat conversion performance but also facilitates the transport of water molecules. The hydrophilic PVDF substrate promotes rapid water transport for an adequate water supply to the photothermal layer. As a result, the optimized f-MWCNTs/PVDF photothermal membrane exhibits an excellent light absorption of 95% and a high surface temperature of 74 °C at 1 kW m−2. Moreover, it achieves an evaporation rate of 1.17 kg m−2 h−1 for a 5% (w/v) sucrose solution, about 5 times higher than that of natural evaporation. The designed photothermal evaporation process is capable of concentrating sucrose solution efficiently from 5% to 75% (w/v), showing great potential in FO processes and juice concentration.

Keywords: solar, photothermal, membrane, MWCNT

Procedia PDF Downloads 88
15990 The Effect of Foundation on the Earth Fill Dam Settlement

Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh

Abstract:

Careful monitoring of earth dams to measure deformation caused by settlement and movement has always been a concern for engineers in the field. To measure settlement and deformation of earth dams, the precision instruments of the combined inclinometer and settlement set, commonly referred to as the IS instrument, are usually used. In some dams, because the alluvium is thick and its removal is not possible (technically, economically, and in terms of performance), the end of the IS instrument cannot be placed in the rock foundation. Inevitably, the pipes have to be installed in the weak and deformable alluvial foundation, which leads to errors in the calculation of the actual (absolute) settlement in different parts of the dam body. The purpose of this paper is to present new and refined criteria for predicting settlement and deformation in earth dams. The study is based on conditions in three dams with quite deformable alluvium (Agh Chai, Narmashir, and Gilan-e Gharb) in order to provide settlement criteria that account for the alluvial foundation. To achieve this goal, the settlement of the dams was simulated using the finite difference method with FLAC3D software, and the modeling results were compared with the IS instrument readings. Finally, to calibrate the model and validate the results, regression analysis techniques were used to scrutinize the modeling parameters against real conditions; then, using MATLAB and its Curve Fitting Toolbox, new settlement criteria based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation were obtained.
The results of these studies show that, by using the new criteria, the settlement and deformation of dams with an alluvial foundation can be corrected after the instrument readings, and the error in reading the IS instrument can be greatly reduced.
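The final curve-fitting step can be illustrated with an ordinary least-squares fit, the kind of regression performed in the Curve Fitting Toolbox. The linear model and the data below are hypothetical stand-ins, not the criteria actually obtained in the study.

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical data: settlement correction (cm) vs an index of the
# alluvial foundation's elasticity modulus.
modulus = [1.0, 2.0, 3.0, 4.0]
settlement = [8.1, 6.0, 4.1, 2.0]
a, b = fit_line(modulus, settlement)
print(round(a, 2), round(b, 2))  # -2.02 10.1
```

The fitted coefficients play the role of the settlement criteria: given a foundation's parameters, the regression predicts the correction to apply to the raw IS instrument reading.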

Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting

Procedia PDF Downloads 177
15989 Fluidised Bed Gasification of Multiple Agricultural Biomass-Derived Briquettes

Authors: Rukayya Ibrahim Muazu, Aiduan Li Borrion, Julia A. Stegemann

Abstract:

Biomass briquette gasification is regarded as a promising route for efficient briquette use in the generation of energy, fuels, and other useful chemicals; however, previous research has focused on briquette gasification in fixed bed gasifiers such as updraft and downdraft gasifiers. The fluidised bed gasifier has the potential to be effectively sized for medium or large scale. This study investigated the use of fuel briquettes produced from blends of rice husk and corn cob biomass residues in a bubbling fluidised bed gasifier. The study adopted a combination of numerical equations and the Aspen Plus simulation software to predict the product gas (syngas) composition based on briquette density and biomass composition (blend ratio of rice husks to corn cobs). The Aspen Plus model was based on an experimentally validated model from the literature. The results, based on a briquette diameter of 32 mm and a relaxed density range of 500 to 650 kg/m³, indicated that the fluidisation air required in the gasifier increased with briquette density, and the fluidisation air proved to be the controlling factor compared with the actual air required for gasification of the biomass briquettes. The mass flow rate of CO₂ in the predicted syngas composition increased with the air flow rate, while CO production decreased and H₂ remained almost constant. The H₂/CO ratio for the various blends of rice husks and corn cobs did not change significantly at the designed process air, but a significant difference of 1.0 in the H₂/CO ratio was observed at a higher air flow rate between the 10/90 and 90/10 blend ratios of rice husks to corn cobs. This implies the need for further understanding of the effects of biomass variability and hydrodynamic parameters on syngas composition in biomass briquette gasification.

Keywords: aspen plus, briquettes, fluidised bed, gasification, syngas

Procedia PDF Downloads 442
15988 The Assessment of Animal Welfare at Slaughterhouses in Badung District, Bali Province

Authors: Ulil Afidah, Mustopa

Abstract:

The study aims to assess animal welfare at slaughterhouses in Badung district, Bali province. The study was conducted over ten days, observing five cattle per day for a total of 50 cattle. Observation began when a cow came out of the pick-up to be slaughtered and was subsequently recorded in the questionnaire provided. The observations showed that the slaughterhouses in Badung district implemented animal welfare measures fulfilling the requirements at a rate of 63% before the slaughtering process and 76% during the slaughtering process. Based on these results, it can be concluded that the slaughterhouses of Badung district already fulfill the requirements.

Keywords: animal welfare, assessment, Badung district, slaughterhouses

Procedia PDF Downloads 266
15987 Studying the Effect of Ethanol and Operating Temperature on Purification of Lactulose Syrup Containing Lactose

Authors: N. Zanganeh, M. Zabet

Abstract:

Lactulose is a synthetic disaccharide with remarkable applications in the food and pharmaceutical fields. Lactulose is not found in nature; it is produced by the isomerization of lactose in an alkaline environment. It should be noted that this reaction has a very low yield, since a significant amount of lactose remains unreacted in the system. Purification of lactulose is therefore difficult and costly. Previous studies have revealed that the solubilities of lactose and lactulose in ethanol differ significantly. Considering that solubility is also affected by temperature, we investigated the effect of ethanol and temperature on the separation of lactose from syrup containing lactose and lactulose. For this purpose, a saturated solution containing lactulose and lactose was first made at three different temperatures: 25 °C (room temperature), 31 °C, and 37 °C. Five samples containing 2 g of saturated solution were taken, and then 2 g, 3 g, 4 g, 5 g, and 6 g of ethanol were separately added to the sampling tubes, which were afterwards kept at their respective temperatures. The concentrations of lactose and lactulose after the separation process were measured and analyzed by high-performance liquid chromatography (HPLC). The results showed that ethanol has a much greater impact than operating temperature on the purification process. It was also observed that the maximum rate of separation occurred at the initial amount of added ethanol.

Keywords: lactulose, lactose, purification, solubility

Procedia PDF Downloads 443
15986 The Potential in the Use of Building Information Modelling and Life-Cycle Assessment for Retrofitting Buildings: A Study Based on Interviews with Experts in Both Fields

Authors: Alex Gonzalez Caceres, Jan Karlshøj, Tor Arvid Vik

Abstract:

The life cycle of residential buildings is expected to span several decades, and 40% of European residential buildings have inefficient energy conservation measures. Existing buildings represent 20-40% of energy use and CO₂ emissions. Since net zero energy buildings are a short-term goal (to be achieved by EU countries after 2020), it is necessary to plan the next logical step, which is to prepare the existing outdated stock of buildings by retrofitting them into energy-efficient buildings. To accomplish this, two specialized and widespread tools can be used: Building Information Modelling (BIM) and life-cycle assessment (LCA). BIM and LCA are used by a variety of disciplines; both are able to represent and analyze constructions at different stages. The combination of these technologies could greatly improve retrofitting techniques, incorporating the carbon footprint and introducing a single database source for the analysis of different materials. To this is added the possibility of considering different analysis approaches, such as costs and energy savings. These measures are expected to enrich decision-making. The methodology is based on two main activities. The first task involved the collection of data, accomplished through a literature review and interviews with experts in the retrofitting field and in BIM technologies. The results of this task are presented as an evaluation checklist of BIM's ability to manage data and improve decision-making in retrofitting projects. The last activity involves an evaluation, using the results of the previous tasks, of how far the IFC format can support the requirements of each specialist and its use by third-party software. The results indicate that BIM/LCA has great potential to improve the retrofitting process in existing buildings, but some modifications must be made in order to meet the requirements of the specialists for both retrofitting and LCA evaluation.

Keywords: retrofitting, BIM, LCA, energy efficiency

Procedia PDF Downloads 206
15985 Consensus Reaching Process and False Consensus Effect in a Problem of Portfolio Selection

Authors: Viviana Ventre, Giacomo Di Tollo, Roberta Martino

Abstract:

The portfolio selection problem involves the evaluation of many criteria that are difficult to compare directly and is characterized by uncertain elements. It can be modeled as a group decision problem in which several experts are invited to present their assessments. In this context, it is important to study and analyze the process of reaching a consensus among group members. Indeed, due to the various diversities among experts, reaching consensus is not always simple or easily achievable. Moreover, the concept of consensus is accompanied by the concept of false consensus, which is particularly interesting in the dynamics of group decision-making processes. False consensus can alter the evaluation and selection of alternatives and is the consequence of decision makers' inability to recognize that their preferences are conditioned by subjective structures. The present work aims to investigate the dynamics of consensus attainment in a group decision problem in which equivalent portfolios are proposed. In particular, the study analyzes the impact of the decision-maker's subjective structure during the evaluation and selection of alternatives. The experimental framework is therefore divided into three phases. In the first phase, experts are asked to evaluate the characteristics of all portfolios individually, without peer comparison, arriving independently at the selection of the preferred portfolio. The experts' evaluations are used to obtain individual Analytical Hierarchical Processes that define the weight each expert gives to every criterion with respect to the proposed alternatives. This step provides insight into how the decision maker's decision process develops, step by step, from goal analysis to the selection of an alternative. The second phase describes the decision maker's state through Markov chains.
In fact, the individual weights obtained in the first phase can be reinterpreted as transition weights from one state to another. Thus, with the construction of the individual transition matrices, the possible next state of each expert is determined from the individual weights at the end of the first phase. Finally, the experts meet, and the process of reaching consensus is analyzed by considering the individual states obtained at the previous stage and the false consensus bias. The work contributes to the study of the impact of subjective structures, quantified through the Analytical Hierarchical Process, and of how they combine with the false consensus bias in group decision-making dynamics and the consensus reaching process in problems involving the selection of equivalent portfolios.
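The next-state computation from an individual transition matrix can be sketched as a row-vector-by-matrix product. The three states and the weights below are hypothetical examples, not the experts' actual matrices.

```python
def next_state(distribution, transition):
    """Multiply a state distribution (row vector) by a transition
    matrix to obtain the expert's next-state distribution."""
    n = len(transition)
    return [sum(distribution[i] * transition[i][j] for i in range(n))
            for j in range(n)]

# Hypothetical expert states: prefers portfolio A / prefers B / undecided.
T = [
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
]
start = [1.0, 0.0, 0.0]  # expert currently prefers portfolio A
print(next_state(start, T))  # [0.7, 0.2, 0.1]
```

Iterating this product projects how each expert's preference might drift before the group meets, which is the input to the consensus analysis in the third phase.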

Keywords: analytical hierarchical process, consensus building, false consensus effect, markov chains, portfolio selection problem

Procedia PDF Downloads 84
15984 The Tramway in French Cities: Complication of Public Spaces and Complexity of the Design Process

Authors: Elisa Maître

Abstract:

The redeployment of tram networks in French cities has considerably modified public spaces and the way citizens use them. Beyond the image of trams as contributing to sustainable urban development, the question of user safety in these spaces has received little study. This study analyzes the use of public spaces laid out for trams from the standpoint of legibility and safety. It also examines to what extent the complexity of the design process, with many interactions between its numerous and varied players, plays a role in the genesis of these problems. The work is mainly based on analyzing the links between the uses of these re-designed public spaces (through observations, interviews with users, and accident studies) and the conditions and processes under which the projects studied were designed (mainly based on interviews with the actors of these projects). The practical analyses were based on three points of view: that of the planner, that of the user (based on observations and interviews), and that of the road safety expert. The cities of Montpellier, Marseille, and Nice are the three fields of study on which the demonstration of this thesis is based. On the one hand, the results show that the insertion of trams complicates the public spaces of French cities: these complications, related to the restructuring of public spaces for the tram, create difficulties of use and safety concerns. On the other hand, in-depth analysis of the fully transcribed interviews has led us to develop scenarios of particular dysfunctions in the design process. These elements call into question the way the legibility and safety of these new forms of public spaces are taken into account.
An in-depth analysis of the design processes of public spaces with tram systems would then also be a way of better understanding the choices made, the compromises accepted, and the conflicts and constraints at work weighing on the layout of these spaces. The results presented concerning the impact that spaces laid out for trams have on difficulties of use suggest different possibilities for improving the way in which safety for all users is taken into account when designing public spaces.

Keywords: public spaces, road layout, users, design process of urban projects

Procedia PDF Downloads 217