Search results for: SPD process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15188

12008 Computational Analysis and Daily Application of the Key Neurotransmitters Involved in Happiness: Dopamine, Oxytocin, Serotonin, and Endorphins

Authors: Hee Soo Kim, Ha Young Kyung

Abstract:

Happiness and pleasure result from the levels of dopamine, oxytocin, serotonin, and endorphins in the body. To raise these four neurochemical levels, it is important to associate daily activities with their corresponding neurochemical releases: setting goals, maintaining social relationships, laughing frequently, and exercising regularly. The likelihood of experiencing happiness increases when all four neurochemicals are released at optimal levels. Achieving happiness matters because it improves health, productivity, and the ability to overcome adversity. To understand how emotions are processed, electrical brain waves, brain structure, and neurochemicals must be analyzed. This research uses Chemcraft and Avogadro to determine the theoretical and chemical properties of the four neurochemical molecules. Each molecule’s thermodynamic stability is calculated to assess its efficiency. The study found that among dopamine, oxytocin, serotonin, and alpha-, beta-, and gamma-endorphin, beta-endorphin has the lowest optimized energy, 388.510 kJ/mol. Beta-endorphin, a neurotransmitter involved in mitigating pain and stress, is therefore the most thermodynamically stable and efficient molecule involved in the process of happiness. Examining such properties of the happiness neurotransmitters advances the science of happiness.
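The stability comparison described above can be sketched as a simple ranking by optimized energy. Only the beta-endorphin value (388.510 kJ/mol) is taken from the abstract; the other energies are hypothetical placeholders used purely to illustrate the selection step.

```python
# Select the most thermodynamically stable neurochemical by lowest optimized energy.
# Only the beta-endorphin value comes from the abstract; all other values are
# hypothetical placeholders for illustration.
optimized_energy_kj_mol = {
    "dopamine": 512.0,          # hypothetical
    "oxytocin": 640.0,          # hypothetical
    "serotonin": 455.0,         # hypothetical
    "alpha-endorphin": 410.0,   # hypothetical
    "beta-endorphin": 388.510,  # reported in the abstract
    "gamma-endorphin": 402.0,   # hypothetical
}

def most_stable(energies: dict) -> str:
    """Return the molecule with the lowest optimized energy."""
    return min(energies, key=energies.get)

print(most_stable(optimized_energy_kj_mol))  # beta-endorphin
```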

Keywords: happiness, neurotransmitters, positive psychology, dopamine, oxytocin, serotonin, endorphins

Procedia PDF Downloads 153
12007 Adapting Grain Crop Cleaning Equipment for Sesame and Other Emerging Spice Crops

Authors: Ramadas Narayanan, Surya Bhattrai, Vu Hoan

Abstract:

Threshing and cleaning are crucial post-harvest procedures carried out to separate the grain or seed from the harvested plant and to eliminate contaminants and foreign debris. After harvest, threshing and cleaning are necessary to guarantee seeds of high quality, acceptable for consumption or further processing. In mechanised production, threshing is conducted in a thresher, after which the seeds are cleaned in dedicated seed-cleaning facilities. This research investigates the effectiveness of the Kimseed MK3 cleaning equipment, designed for grain crops, in processing new crops such as sesame, fennel and kalonji. Systematic trials were conducted to adapt the equipment to sesame and spice crops, with the aim of developing methods for mechanising harvest and post-harvest operations. For sesame, a two-step process in the cleaning machine is recommended: the first step removes the large contaminants, and the second removes the smaller ones. The optimal parameters for cleaning fennel are a shaker frequency of 6.0 to 6.5 Hz and an airflow of 1.0 to 1.5 m/s; for kalonji, a shaker frequency of 5.5 to 6.0 Hz and an airflow of 1.0 to just under 1.5 m/s.
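The recommended operating windows above can be encoded directly and used to check a proposed machine setting. A minimal sketch, with the ranges as reported in the abstract; the helper and its names are ours, not part of the original study.

```python
# Recommended cleaner settings per crop, as reported in the abstract.
RECOMMENDED = {
    "fennel":  {"shaker_hz": (6.0, 6.5), "airflow_m_s": (1.0, 1.5)},
    "kalonji": {"shaker_hz": (5.5, 6.0), "airflow_m_s": (1.0, 1.5)},
}

def within_recommended(crop: str, shaker_hz: float, airflow_m_s: float) -> bool:
    """Check whether an operating point falls inside the recommended window."""
    limits = RECOMMENDED[crop]
    return (limits["shaker_hz"][0] <= shaker_hz <= limits["shaker_hz"][1]
            and limits["airflow_m_s"][0] <= airflow_m_s <= limits["airflow_m_s"][1])

print(within_recommended("fennel", 6.2, 1.2))   # True
print(within_recommended("kalonji", 6.4, 1.2))  # False: shaker too fast for kalonji
```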

Keywords: sustainable mechanisation, seed cleaning process, optimal setting, shaker frequency

Procedia PDF Downloads 71
12006 Comparison of Unit Hydrograph Models to Simulate Flood Events at the Field Scale

Authors: Imene Skhakhfa, Lahbaci Ouerdachi

Abstract:

To ensure the overall coherence of simulated results, it is necessary to develop a robust validation process. In many applications, it is no longer sufficient to calibrate and validate the model solely against the hydrograph measured at the outlet; instead, one tries to better simulate the functioning of the watershed in space. Model performance is therefore also assessed against other variables, such as water-level measurements at intermediate stations or groundwater levels. In this work, we limit ourselves to modeling floods of short duration, for which the process of evapotranspiration is negligible. The main parameters to identify are those of the unit hydrograph (UH) method. Three different models were tested: Snyder, Clark, and SCS. These models differ in their mathematical structure and the parameters to be calibrated, while the hydrological data, the initial water content, and the precipitation are the same. The models are compared on the basis of their performance in terms of six objective criteria: three global criteria and three criteria representing volume, peak flow, and mean square error. The first type of criteria gives more weight to strong events, whereas the second considers all events to be of equal weight. The results show that the calibrated parameter values are interdependent and also highlight the problems associated with simulating low-flow events and intermittent precipitation.
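The second family of criteria named above (volume, peak flow, mean square error) can be sketched as direct comparisons between an observed and a simulated hydrograph. A minimal illustration with made-up discharge values; the exact criterion definitions used in the paper may differ.

```python
# Three event-level criteria for comparing observed vs. simulated hydrographs.
def volume_error(obs, sim):
    """Relative error in total runoff volume."""
    return (sum(sim) - sum(obs)) / sum(obs)

def peak_error(obs, sim):
    """Relative error in peak flow."""
    return (max(sim) - max(obs)) / max(obs)

def mse(obs, sim):
    """Mean square error over the whole hydrograph."""
    return sum((s - o) ** 2 for o, s in zip(obs, sim)) / len(obs)

observed  = [0.0, 2.0, 5.0, 3.0, 1.0]   # discharge in m^3/s, illustrative
simulated = [0.0, 1.8, 4.6, 3.2, 1.1]

print(round(volume_error(observed, simulated), 3))  # -0.027
print(round(peak_error(observed, simulated), 3))    # -0.08
print(round(mse(observed, simulated), 3))           # 0.05
```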

Keywords: model calibration, intensity, runoff, hydrograph

Procedia PDF Downloads 484
12005 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network

Authors: Nasrin Bakhshizadeh, Ashkan Forootan

Abstract:

A major problem affecting quality control in industrial polymerization is the lack of suitable on-line measurement tools for evaluating polymer properties such as the melt and density indices. Conventionally, the polymerization is controlled manually by taking samples, measuring the polymer quality in the lab, and recording the results. This method is highly time-consuming and leads to a large quantity of off-specification product. The online application for estimating melt index and density proposed in this study is a neural network based on input-output data from a polyethylene production plant. The temperature, the level of the reactor bed, the mass flow rates of ethylene, hydrogen, and butene-1, and the molar concentrations of ethylene, hydrogen, and butene-1 are used to establish the neural model. The network is trained on actual operational data using back-propagation and the Levenberg-Marquardt technique. The simulated results indicate that a neural network model with three layers (one hidden layer) for forecasting the density, and four layers for the melt index, can successfully predict these quality properties.
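The three-layer structure described for the density model (inputs, one hidden layer, one output) can be sketched as below. The process data, layer sizes, and plain stochastic gradient descent here are purely illustrative; the authors train on real plant data with back-propagation and Levenberg-Marquardt, which this sketch does not reproduce.

```python
# Minimal one-hidden-layer network for a scalar property (e.g. scaled density).
import math, random

random.seed(0)

def forward(x, w1, b1, w2, b2):
    """Inputs -> tanh hidden layer -> linear output."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    out = sum(wi * hi for wi, hi in zip(w2, hidden)) + b2
    return hidden, out

# Toy dataset: 3 scaled process inputs mapped to a scaled density target.
data = [([0.1, 0.2, 0.3], 0.2), ([0.5, 0.1, 0.4], 0.35),
        ([0.9, 0.6, 0.2], 0.6), ([0.3, 0.8, 0.7], 0.55)]

n_in, n_hid = 3, 4
w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hid)]
b2 = 0.0
lr = 0.1

def loss():
    return sum((forward(x, w1, b1, w2, b2)[1] - y) ** 2 for x, y in data) / len(data)

loss_before = loss()
for _ in range(500):
    for x, y in data:
        hidden, out = forward(x, w1, b1, w2, b2)
        err = out - y
        for j in range(n_hid):
            grad_w2 = err * hidden[j]
            dh = err * w2[j] * (1 - hidden[j] ** 2)  # tanh' = 1 - h^2
            w2[j] -= lr * grad_w2
            for i in range(n_in):
                w1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * err
loss_after = loss()
print(loss_after < loss_before)  # True: training reduces the fitting error
```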

Keywords: polyethylene, polymerization, density, melt index, neural network

Procedia PDF Downloads 143
12004 Providing a Secure, Reliable and Decentralized Document Management Solution Using Blockchain by a Virtual Identity Card

Authors: Meet Shah, Ankita Aditya, Dhruv Bindra, V. S. Omkar, Aashruti Seervi

Abstract:

In today's world, documents are needed everywhere for a smooth workflow, whether in identification processes or other security contexts. The current systems and techniques used for identification all require one thing, ‘proof of existence’, in the form of valid documents, for example educational or financial records. The main issue with the current identity and access management systems and digital identification processes is that they are centralized, which makes them inefficient. This paper presents a system that resolves these issues. It is based on blockchain technology, a decentralized system that allows transactions to be recorded in a decentralized and immutable manner. The primary notion of the model is to ‘have everything with nothing’: the required documents of a person are linked to a single identity card, so that a person can go anywhere without carrying the documents themselves. The person only needs to be physically present where documents are required; with a fingerprint impression and an iris scan, the rest of the verification proceeds. Furthermore, some technical overheads and advancements are listed. The paper also aims to lay out a long-term vision of blockchain and its impact on future trends.
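The core mechanism the abstract relies on, linking document records to one identity in an append-only, tamper-evident chain of hashes, can be sketched in a few lines. Real deployments add consensus, distribution, and biometric binding; the structure and field names below are illustrative only.

```python
# Minimal hash-chain: each block stores a document digest, the owning identity,
# and the hash of the previous block, so any tampering is detectable.
import hashlib, json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_document(chain: list, identity_id: str, doc_digest: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"identity": identity_id, "document": doc_digest, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def chain_valid(chain: list) -> bool:
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_document(chain, "ID-001", hashlib.sha256(b"degree certificate").hexdigest())
add_document(chain, "ID-001", hashlib.sha256(b"bank statement").hexdigest())
print(chain_valid(chain))          # True
chain[0]["document"] = "forged"    # tampering breaks the first block's hash
print(chain_valid(chain))          # False
```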

Keywords: blockchain, decentralized system, fingerprint impression, identity management, iris scan

Procedia PDF Downloads 123
12003 Exploring Motivation and Attitude to Second Language Learning in Ugandan Secondary Schools

Authors: Nanyonjo Juliet

Abstract:

Across Sub-Saharan Africa, it is increasingly becoming a necessity for parents or governments to encourage learners, particularly those attending high school, to study a second or foreign language other than the “official language” or the language of instruction in schools. The major second or foreign languages under consideration include, but are not limited to, English, French, German, Arabic, Swahili/Kiswahili, Spanish and Chinese. The benefits of learning a second (foreign) language in a globalized world cannot be underestimated; among others, they include opportunities related to travelling, studying abroad and widening one’s career prospects. Research has also revealed that, beyond these non-cognitive rewards, learning a second language enables learners to become more thoughtful, considerate and confident, to make better decisions, to keep their brains healthier and, generally speaking, to broaden their world views. The methodology for delivering a successful second-language learning process by a professionally qualified teacher is grounded in motivation. We strongly believe that the psychology involved in teaching a foreign language is of paramount importance to a learner’s successful learning experience. The aim of this paper, therefore, is to explore and show the importance of motivation in the teaching and learning of a second (foreign) language in Ugandan high schools.

Keywords: second language, foreign language, language learning, language teaching, official language, language of instruction, globalized world, cognitive rewards, non-cognitive rewards, learning process, motivation

Procedia PDF Downloads 66
12002 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data

Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan

Abstract:

Clinical data analysis and forecasting have made great contributions to disease control, prevention and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets in which the positive samples form only a minority. We investigate two meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine each of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole; the other is to split the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger datasets, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets and is more consistent with the typical large imbalanced medical datasets encountered in practice. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and a shorter running time than the brute-force method.
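The SMOTE step that the swarm algorithms tune can be sketched as follows: each synthetic minority sample is placed on the line segment between a minority point and one of its k nearest minority neighbours. A pure-Python illustration; the paper's contribution is optimizing parameters such as k and the oversampling amount with PSO and the bat algorithm, which is not shown here.

```python
# Minimal SMOTE: interpolate between a minority point and a random one of its
# k nearest minority neighbours.
import random
random.seed(1)

def smote(minority, n_synthetic, k=2):
    synthetic = []
    for _ in range(n_synthetic):
        x = random.choice(minority)
        # k nearest minority neighbours of x (excluding x itself)
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        nb = random.choice(neighbours)
        gap = random.random()  # position along the segment [x, nb]
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.3)]
new_points = smote(minority, n_synthetic=6, k=2)
print(len(new_points))  # 6
```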

Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data

Procedia PDF Downloads 439
12001 A Comprehensive Review of Foam Assisted Water Alternating Gas (FAWAG) Technique: Foam Applications and Mechanisms

Authors: A. Shabib-Asl, M. Abdalla Ayoub Mohammed, A. F. Alta’ee, I. Bin Mohd Saaid, P. Paulo Jose Valentim

Abstract:

In the last few decades, much focus has been placed on enhancing oil recovery from existing fields through the study and application of various methods. Recently, studies of fluid mobility control and sweep efficiency in gas injection and in the water alternating gas (WAG) method have demonstrated positive results for oil recovery and have thus gained wide interest in the petroleum industry. WAG injection increases oil recovery, partly through a reduction of the gas oil ratio (GOR). However, it suffers from poor volumetric sweep efficiency, owing to the low density and high mobility of the injected gas compared with oil. This has led to the introduction of the foam assisted water alternating gas (FAWAG) technique, which, in contrast to plain WAG injection, improves the sweep efficiency and reduces the gas oil ratio, thereby maximizing the production rate from the producer wells. This paper presents a comprehensive review of the FAWAG process from the perspective of the Snorre field experience. In addition, comparative results between FAWAG and other EOR methods are presented, including their setbacks. The main aim is to provide a solid background for future laboratory research and successful extended field application.
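The sweep-efficiency problem described above is governed by the mobility ratio M, the mobility of the displacing fluid over that of the displaced oil; M much greater than 1 means unfavourable displacement, and foam lowers M by raising the apparent gas viscosity. A hedged numerical sketch; all permeability and viscosity values below are illustrative, not from the Snorre field.

```python
# Mobility ratio M = (k_r,displacing / mu_displacing) / (k_r,displaced / mu_displaced).
def mobility_ratio(krd, mu_d, kro, mu_o):
    """Mobility of the displacing fluid over mobility of the displaced oil."""
    return (krd / mu_d) / (kro / mu_o)

# Gas injection without foam: very low gas viscosity -> strongly unfavourable M.
m_gas = mobility_ratio(krd=0.3, mu_d=0.02, kro=0.8, mu_o=2.0)
# Foam-thickened gas: higher apparent viscosity -> M at or below 1.
m_foam = mobility_ratio(krd=0.3, mu_d=1.0, kro=0.8, mu_o=2.0)

print(round(m_gas, 1))   # 37.5
print(round(m_foam, 2))  # 0.75
```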

Keywords: GOR, mobility ratio, sweep efficiency, WAG

Procedia PDF Downloads 450
12000 Collagen Gel in Hip Cartilage Repair: in vivo Preliminary Study

Authors: A. Bajek, J. Skopinska-Wisniewska, A. Rynkiewicz, A. Jundzill, M. Bodnar, A. Marszalek, T. Drewa

Abstract:

Traumatic injury and age-related degenerative diseases of cartilage are major health problems worldwide. Articular cartilage comprises a relatively small number of cells with a relatively slow rate of turnover; damaged articular cartilage therefore has a limited capacity for self-repair. New clinical methods have been designed to achieve better repair of injured cartilage, but no treatment enables its full restoration. The aim of this study was to evaluate how collagen gel with bone marrow mesenchymal stem cells (MSCs), and collagen gel alone, influence hip cartilage repair after injury. Collagen type I was isolated from rats’ tails and cross-linked with N-hydroxysuccinimide in a 24-hour process. MSCs were isolated from rats’ bone marrow. The experiments were conducted according to the Ethics Committee guidelines for animal experiments. Fifteen 8-week-old Wistar rats were used, all receiving hip joint surgery with a total of 30 created cartilage defects. The animals were randomly divided into three groups, with defects filled with collagen gel (group I), collagen gel cultured with MSCs (group II), or left untreated as a control (control group). Immunohistochemical and radiological evaluations were carried out 11 weeks post implantation. The surface of the matrix proved non-toxic, and its porosity promotes cell adhesion and growth. However, the in vivo regeneration process was poor, and we observed a low integration rate of the biomaterial. Immunohistochemical evaluation of the cartilage after 11 weeks of treatment showed low collagen II and high collagen X expression in the two tested groups, in comparison with the control group, which showed high collagen II expression. Moreover, radiological analysis showed the best regeneration in the control group.
The biomaterial construct with mesenchymal stem cells, as well as the biomaterial alone, was not sufficient to regenerate the hip cartilage surfaces. These results suggest that collagen gel based biomaterials, even with MSCs, are not satisfactory for the repair of hip cartilage defects. Additional evaluation is, however, needed to confirm these results.

Keywords: collagen gel, MSCs, cartilage repair, hip cartilage

Procedia PDF Downloads 453
11999 Effective Solvents for Protein Recovery from Microalgae

Authors: Win Nee Phong, Tau Chuan Ling, Pau Loke Show

Abstract:

From an industrial perspective, the exploitation of microalgae as a protein source is of great economic and commercial interest due to numerous attractive characteristics. Nonetheless, the release of protein from microalgae is limited by multiple layers of a rigid, thick cell wall that generally contains a large proportion of cellulose; an efficient cell disruption process is therefore required to rupture it. The conventional downstream processing methods, which typically involve several unit operations such as disruption, isolation, extraction, concentration and purification, are energy-intensive and costly. To reduce the overall cost and establish a feasible technology for successful large-scale production, the microalgal industry today demands a more cost-effective and eco-friendly downstream processing technique. This study aims to provide guidance on selecting an efficient solvent to facilitate protein release during cell disruption. The effects of solvent type, namely methanol, ethanol, 1-propanol and water, on rupturing the microalgal cell wall were studied. Interestingly, water is the most effective solvent for recovering proteins from microalgae, and it is also the cheapest of the solvents tested.

Keywords: green, microalgae, protein, solvents

Procedia PDF Downloads 257
11998 Measurement of the Dynamic Modulus of Elasticity of Cylindrical Concrete Specimens Used for the Cyclic Indirect Tensile Test

Authors: Paul G. Bolz, Paul G. Lindner, Frohmut Wellner, Christian Schulze, Joern Huebelt

Abstract:

Concrete, as a construction material, is not only subject to static loads but is also exposed to variable, time-variant, and oscillating stresses. To ensure that construction materials can resist these cyclic stresses, different test methods are used for the systematic fatiguing of specimens, such as the cyclic indirect tensile test. A procedure is presented that allows the degradation of cylindrical concrete specimens during the cyclic indirect tensile test to be estimated by measuring the dynamic modulus of elasticity at different states of the specimens’ fatigue process. Two methods are used in addition to the cyclic indirect tensile test to examine the dynamic modulus of elasticity of cylindrical concrete specimens: one is based on the analysis of eigenfrequencies, whilst the other uses ultrasonic pulse measurements to estimate the material properties. A comparison of the dynamic moduli obtained using the three methods, which operate in different frequency ranges, shows good agreement. The concrete specimens’ fatigue process can therefore be monitored effectively and reliably.
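The ultrasonic route mentioned above can be sketched with the standard relation between P-wave pulse velocity, density, and the dynamic modulus of a bulk specimen, E_dyn = rho * v^2 * (1 + nu)(1 - 2*nu) / (1 - nu). A hedged illustration: the input values are typical for normal concrete, not taken from the paper, and the paper's own evaluation procedure may differ in detail.

```python
# Dynamic modulus of elasticity from density and ultrasonic P-wave velocity,
# with a Poisson-ratio correction for a bulk (unbounded) specimen.
def dynamic_modulus_gpa(density_kg_m3, velocity_m_s, poisson=0.2):
    e_pa = (density_kg_m3 * velocity_m_s ** 2
            * (1 + poisson) * (1 - 2 * poisson) / (1 - poisson))
    return e_pa / 1e9  # convert Pa -> GPa

# Typical values for sound normal-weight concrete (illustrative).
e = dynamic_modulus_gpa(density_kg_m3=2400, velocity_m_s=4500, poisson=0.2)
print(round(e, 1))  # 43.7 GPa, in the range expected for normal concrete
```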

Keywords: concrete, cyclic indirect tensile test, degradation, dynamic modulus of elasticity, eigenfrequency, fatigue, natural frequency, ultrasonic, ultrasound, Young’s modulus

Procedia PDF Downloads 173
11997 Alternation of Executive Power and Democratic Governance in Nigeria: The Role of Independent National Electoral Commission, 1999-2014

Authors: J. Tochukwu Omenma

Abstract:

The buzzword in Nigeria is that democracy has “come to stay”. Politicians, in their usual euphoria, consider democracy already consolidated in the country, linking this assumption to three fundamental indicators: (a) a multiparty system; (b) regular elections; and (c) the absence of a military coup after 15 years of democracy in Nigeria. Going beyond this assumption, we set out to verify these claims empirically, relying on Huntington’s conceptualization of democratic consolidation. Huntington asserts that multipartism, regular elections, and the absence of any major obstacle leading to a reversal of democracy are significant indicators of democratic consolidation, but those indicators must result in an alternation of executive power for consolidation to occur. In other words, the regular conduct of elections and the existence of multiple political parties are not enough for democratic consolidation without free and fair elections. Past elections were characterized by massive fraud and irregularities, casting doubt on the integrity of the electoral management body (EMB) and its capacity to conduct free and fair elections in Nigeria. Three existing perspectives respond to the emasculation of the EMB’s independence. The first, and most popular, holds that the incumbent party, more than the opposition, influences the EMB’s activities with the aim of rigging elections; the second, more radical, perspective suggests that the weakening of the EMB’s power is more associated with the weakest party than with the incumbent; and the third holds that godfathers are in direct control of EMB members and thereby control the electoral process to their own advantage.
With empirical evidence sourced from the reports of independent election monitors (the European Union Election Observation Mission in Nigeria), this paper shows that, across different electoral periods, the incumbent and the godfathers have been more associated with influencing election results than the opposition. The existing nature of executive power in Nigeria provides a plausible explanation for the incumbent’s overbearing influence, which limits the opportunity for free and fair elections and, by extension, undermines the process of democratic consolidation in Nigeria.

Keywords: political party, democracy, democratic consolidation, election, godfatherism

Procedia PDF Downloads 490
11996 The Integration of ICT in the Teaching and Learning of French Language in Some Selected Schools in Nigeria: Prospects and Challenges

Authors: Oluyomi A. Abioye

Abstract:

The 21st century has witnessed many technological advancements and innovations, and Information and Communication Technology (ICT) is one of them. Education is the cornerstone of any nation, and the language in which it is delivered is the bedrock of any development; the French language is our focus in this study. French is a language of reference on the national and international scenes; however, its teaching is clouded by myriad problems. Students’ academic performance depends to a large extent on the teaching and learning process, and the methodology employed contributes greatly to the effectiveness of that process. Therefore, with the integration of ICT, French teaching has to align with and adapt to this new digital era. This paper first defines the concept of ICT and highlights some of the challenges encountered in the teaching of French. It then discusses the existing methods of French teaching and the integration of ICT into the teaching and learning of the language, along with the prospects and challenges of that integration. Data collected from questionnaires administered to students of some selected schools are analysed. Our findings reveal that only very few schools in Nigeria have the electronic and computer-mediated facilities to teach French. The paper concludes by encouraging ICT 'savoir-faire' among French teachers, openness of students to this digital technology, and adequate provision of electronic and computer-mediated equipment by the Nigerian government to its educational institutions.

Keywords: French language in Nigeria, integration of ICT, prospects and challenges, teaching and learning

Procedia PDF Downloads 345
11995 Flexible Design Solutions for Complex Free-Form Geometries Aimed at Optimizing Performance and Resource Consumption

Authors: Vlad Andrei Raducanu, Mariana Lucia Angelescu, Ion Cinca, Vasile Danut Cojocaru, Doina Raducanu

Abstract:

By using smart digital tools, such as generative design (GD) and digital fabrication (DF), pressing problems of resource optimization (materials, energy, time) can be solved and free-form applications or products can be created. In the new digital technology, materials are active, designed in response to a set of performance requirements, which imposes a total rethinking of old material practices. The article presents the key steps of the design procedure for a free-form architectural object, a column-type object with connections forming an adaptive 3D surface, using the parametric design methodology and exploiting the properties of conventional metallic materials. In parametric design, the form of the created object or space is shaped by varying parameter values, and the relationships between forms are described by mathematical equations. Digital parametric design is based on specific procedures such as shape grammars, Lindenmayer systems (L-systems), cellular automata, genetic algorithms and swarm intelligence, each of which has limitations that make it applicable only in certain cases. The paper presents the stages of the design process and a shape-grammar-type algorithm. The generative design process relies on two basic principles: the modeling principle and the generative principle. The generative method is based on a form-finding process that creates many 3D spatial forms using an algorithm conceived to apply its generating logic to different input geometry. Once the algorithm is realized, it can be applied repeatedly to generate the geometry for a number of different input surfaces. The generated configurations are then analyzed against a technical or aesthetic selection criterion, and finally the optimal solution is selected.
The endless generative capacity of the codes and algorithms used in digital design offers varied conceptual possibilities and optimal solutions for the increasing technical and environmental demands of the building industry and architecture. Constructions or spaces generated by parametric design can be specifically tuned to meet particular technical or aesthetic requirements. The proposed approach has direct applicability in sustainable architecture, offering important potential economic advantages, a flexible design (which can be changed until the end of the design process) and unique geometric models of high performance.
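The shape-grammar and Lindenmayer-system procedures named in the abstract boil down to rule-based rewriting: an axiom is repeatedly expanded by production rules, and each generation can drive geometry downstream. A minimal sketch using Lindenmayer's classic algae system, purely to illustrate the mechanism; it is not the column-generating algorithm of the paper.

```python
# Minimal L-system: expand an axiom by applying production rules per generation.
def lsystem(axiom: str, rules: dict, generations: int) -> str:
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)  # symbols without a rule pass through
    return s

rules = {"A": "AB", "B": "A"}     # Lindenmayer's original algae system
print(lsystem("A", rules, 5))     # ABAABABAABAAB
```

In a geometric setting, each symbol of the expanded string would be interpreted as a drawing or construction instruction, which is how such grammars generate candidate 3D forms for subsequent selection.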

Keywords: parametric design, algorithmic procedures, free-form architectural object, sustainable architecture

Procedia PDF Downloads 374
11994 Automated Manual Handling Risk Assessments: Practitioner Experienced Determinants of Automated Risk Analysis and Reporting Being a Benefit or Distraction

Authors: S. Cowley, M. Lawrance, D. Bick, R. McCord

Abstract:

Technology that automates manual handling (musculoskeletal disorder, or MSD) risk assessments is increasingly available to ergonomists, engineers, and generalist health and safety practitioners alike. The risk assessment process is generally based on wearable motion sensors that capture information about worker movements for real-time or post-hoc analysis. Traditionally, MSD risk assessment is undertaken with the assistance of a checklist, such as that in the SafeWork Australia code of practice, with an expert assessor observing the task and ideally discussing the detail with the worker. Automation enables non-experts to complete assessments and does not always require the assessor to be present. This clearly has cost and time benefits for the practitioner, but is it an improvement on assessment by a human? Human risk assessments draw on the knowledge and expertise of the assessor but, like all risk assessments, are highly subjective. The complexity of the checklists and models used can be off-putting, and sometimes the assessment becomes the focus and the end rather than a means to an end: the focus on risk control is lost. Automated risk assessment handles the complexity of the assessment for the assessor and delivers a simple risk score that enables decision-making about risk control. Being machine-based, it is objective and will deliver the same result each time it assesses an identical task. However, the WHS professional needs to know whether this emergent technology asks the right questions and delivers the right answers, and whether it improves the risk assessment process and its results or simply distances the professional from the task and the worker. They need clarity as to whether automation of manual task risk analysis and reporting leads to risk control or merely to a focus on the worker.
Critically, they need evidence as to whether automation in this area of hazard management leads to better risk control or just a bigger collection of assessments. This study of the practitioner-experienced determinants of automated manual task risk analysis and reporting being a benefit or a distraction addresses the understanding of emergent risk assessment technology, its use, and the things to consider when deciding whether to adopt and apply such technologies.
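The "simple risk score" such systems derive from sensor data can be illustrated as a repeatable mapping from sampled postures to a risk band. Everything below is a hypothetical sketch: the threshold, exposure bands, and function names are ours, not from any published assessment tool.

```python
# Hypothetical sketch: map sampled trunk-flexion angles from a wearable sensor
# to a traffic-light risk band based on how often a threshold is exceeded.
def risk_band(flexion_angles_deg, threshold_deg=45.0):
    exposure = sum(1 for a in flexion_angles_deg if a > threshold_deg) / len(flexion_angles_deg)
    if exposure > 0.5:
        return "high"
    if exposure > 0.2:
        return "medium"
    return "low"

samples = [10, 20, 50, 60, 15, 55, 30, 70, 12, 48]  # degrees, illustrative
print(risk_band(samples))  # medium
```

Because the mapping is deterministic, identical sensor traces always yield identical scores, which is the objectivity (and, potentially, the loss of contextual judgement) discussed above.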

Keywords: automated, manual-handling, risk-assessment, machine-based

Procedia PDF Downloads 118
11993 6,402: On the Aesthetic Experience of Facticity

Authors: Nicolás Rudas

Abstract:

Sociologists have brought to light the fascination of contemporary societies with numbers but fall short of explaining it. In their accounts, people generally misunderstand the technical intricacies of statistical knowledge and therefore accept numbers as unassailable “facts”. It is due to such pervasive fascination, furthermore, that both old and new forms of social control find fertile ground. By focusing on the process whereby the fetishization of numbers reaches its zenith, i.e., when specific statistics become emblematic of an entire society, this article asserts that numbers primarily function as moral symbols with immense potential for galvanizing collective action. Their “facticity” is not solely a cognitive problem but one deeply rooted in myth and connected with social experiences of epiphany and ritual. Evidence from Colombia is used to illustrate how certain quantifications become canonical. In 2021, Colombia’s Peace Court revealed that the national army had executed 6,402 innocent civilians and later reported them as members of illegal armed groups. “6,402” rapidly became a prominent item in the country’s political landscape. This article reconstructs that process by following the first six months of the figure’s circulation in both traditional and social media. In doing so, it develops a new cultural-sociological conceptualization of numbers as “fact-icons” that departs from traditional understandings of statistics as “technical” objects. Numbers are icons whose appropriation is less rational than aesthetic.

Keywords: culture, statistics, collective memory, social movements

Procedia PDF Downloads 68
11992 The Hypoglycemic Grab Back (HOGG): Preparing Hypo-Screen-Bags to Streamline the Time-Consuming Process of Administering Glucose Systemic Correction

Authors: Mai Ali

Abstract:

Background: Preparing hypo-screen bags in advance streamlines the time-consuming process of administering systemic glucose correction, and hypo-screen grab bags are widely adopted in UK hospitals. Aim: The aim of the study is to improve hypoglycemia screening efficiency and equipment accessibility by streamlining staff access to the items needed to restock the grab bags. Methodology: The study centered on the neonatal wards at LGI and the St. James Neonatal Unit and related units. A web-based survey was conducted to evaluate local practices, gathering 21 responses from relevant staff. The survey outcomes were: (1) an evident demand for accessible grab bags to smooth the process, and (2) potential to enhance efficiency through better preparation of hypo-screen grab bags. Intervention: A hypo-screen grab bag was designed, including checklists of the stocked items and required samples, with medical staff overseeing restocking after use. Conclusion: The study improved hypoglycemia screening efficiency and aided junior staff with accessible supplies and a user-friendly checklist.

Keywords: neonatal hypoglycemia, grab bag, hypo-screening, junior staff

Procedia PDF Downloads 60
11991 Literature Review and Approach for the Use of Digital Factory Models in an Augmented Reality Application for Decision Making in Restructuring Processes

Authors: Rene Hellmuth, Jorg Frohnmayer

Abstract:

The requirements of factory planning and of the buildings concerned have changed in recent years. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is gaining importance as a means of maintaining the competitiveness of a factory. Even today, the methods and process models used in factory planning are predominantly based on the classical planning principles of Schmigalla, Aggteleky, and Kettner, which, however, are not specifically designed for reorganization. In addition, they assume a largely static environment, manageable planning complexity, and medium- to long-term planning cycles with low variability of the factory. Existing approaches already regard factory planning as a continuous process that makes it possible to react quickly to adaptation requirements. However, digital factory models are not yet used as a source of information for building data. Approaches that consider building information modeling (BIM) or digital factory models in general either do not address factory conversions or do not yet go beyond a concept. A method for factory conversion planning that uses a current digital building model is therefore lacking. A corresponding approach must take into account both the existing approaches to factory planning and the use of digital factory models in practice. First, a literature review is conducted, examining approaches to classical factory planning and to conversion planning, and investigating which approaches already incorporate digital factory models. In the second step, an approach is presented for using digital factory models based on building information modeling as the foundation of an augmented reality tablet application. This application is suitable for construction sites and provides information on the costs and time required for conversion variants, thereby supporting fast decision-making. In summary, the paper provides an overview of existing factory planning approaches and critically examines the use of digital tools. Based on this preliminary work, an approach is presented that proposes the sensible use of digital factory models for decision support when evaluating conversion variants of the factory building. The augmented reality application is designed to summarize the most important information for decision-makers during a reconstruction process.

Keywords: augmented reality, digital factory model, factory planning, restructuring

Procedia PDF Downloads 137
11990 CTHTC: A Convolution-Backed Transformer Architecture for Temporal Knowledge Graph Embedding with Periodicity Recognition

Authors: Xinyuan Chen, Mohd Nizam Husen, Zhongmei Zhou, Gongde Guo, Wei Gao

Abstract:

Temporal Knowledge Graph Completion (TKGC) has attracted increasing attention for its enormous value; however, existing models lack the capability to capture local interactions and global dependencies simultaneously along with evolutionary dynamics, and the latest advances in convolutions and Transformers have not yet been employed in this area. Moreover, periodic patterns in TKGs have not been fully explored either. To this end, a multi-stage hybrid architecture with convolution-backed Transformers is introduced to TKGC tasks for the first time, combined with the Hawkes process to model evolving event sequences in a continuous-time domain. In addition, seasonal-trend decomposition is adopted to identify periodic patterns. Experiments on six public datasets are conducted to verify model effectiveness against state-of-the-art (SOTA) methods. An extensive ablation study evaluates architecture variants and the contributions of individual components, paving the way for further potential exploitation. Besides complexity analysis, input sensitivity and safety challenges are also discussed for comprehensiveness.
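The Hawkes process adopted above models self-exciting event sequences in continuous time. As a minimal illustration (not the paper's implementation; the parameters mu, alpha, and beta are arbitrary), its conditional intensity can be sketched as:

```python
import math

def hawkes_intensity(t, events, mu=0.2, alpha=0.8, beta=1.0):
    """Conditional intensity of a univariate Hawkes process:
    lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)).
    Each past event excites the process; the excitation decays exponentially."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

# With no event history the intensity is just the baseline rate mu.
print(hawkes_intensity(5.0, []))              # -> 0.2
# A recent event at t = 4.9 raises the intensity above the baseline.
print(hawkes_intensity(5.0, [4.9]) > 0.2)     # -> True
```

Each past event adds an exponentially decaying bump on top of the baseline rate mu; this is the continuous-time dynamic that the architecture combines with its Transformer stages.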

Keywords: temporal knowledge graph completion, convolution, transformer, Hawkes process, periodicity

Procedia PDF Downloads 78
11989 Review of Life-Cycle Analysis Applications on Sustainable Building and Construction Sector as Decision Support Tools

Authors: Liying Li, Han Guo

Abstract:

Considering the environmental issues generated by the building sector through its energy consumption, solid waste generation, water use, land use, and global greenhouse gas (GHG) emissions, this review points to LCA as a decision-support tool that can substantially improve sustainability in the building and construction industry. The comprehensiveness and simplicity of LCA make it one of the most promising decision-support tools for the sustainable design and construction of future buildings. This paper contains a comprehensive review of existing studies related to LCA, with a focus on its advantages and limitations when applied in the building sector. The aim of this paper is to enhance the understanding of building life-cycle analysis, thus promoting its application for effective, sustainable building design and construction in the future. Comparisons and discussions are carried out between four categories of LCA methods: building material and component combinations (BMCC) vs. the whole process of construction (WPC) LCA, attributional vs. consequential LCA, process-based LCA vs. input-output (I-O) LCA, and traditional vs. hybrid LCA. Classical case studies are presented, which illustrate the effectiveness of LCA as a tool to support the decisions of practitioners in the design and construction of sustainable buildings. (i) The BMCC and WPC categories of LCA research tend to overlap, as the majority of WPC LCAs are developed using the bottom-up approach of BMCC LCAs. (ii) When the influence of social and economic factors outside the system boundary proposed by the research is considered, a consequential LCA can provide a more reliable result than an attributional LCA. (iii) I-O LCA is complementary to process-based LCA in addressing the social and economic problems generated by building projects. (iv) Hybrid LCA provides a superior dynamic perspective compared with traditional LCA, which is criticized for its static view of the changing processes within the building’s life cycle. LCAs are still being developed to overcome their limitations and data shortages (especially data on the developing world), and the unification of LCA methods and data can make the results of building LCAs more comparable and consistent across different studies and even countries.

Keywords: decision support tool, life-cycle analysis, LCA tools and data, sustainable building design

Procedia PDF Downloads 120
11988 Four-Electron Auger Process for Hollow Ions

Authors: Shahin A. Abdel-Naby, James P. Colgan, Michael S. Pindzola

Abstract:

A time-dependent close-coupling method is developed to calculate total, double, and triple autoionization rates for hollow atomic ions of four-electron systems. This work was motivated by recent observations of the four-electron Auger process in near K-edge photoionization of C+ ions. The time-dependent close-coupled equations are solved using lattice techniques to obtain a discrete representation of the radial wave functions and all operators on a four-dimensional grid with uniform spacing. Initial excited states are obtained by relaxation of the Schrödinger equation in imaginary time using a Schmidt orthogonalization method involving interior subshells. The radial wave function grids are partitioned over the cores of a massively parallel computer, which is essential due to the large memory required to store the coupled wave functions and the long run times needed to reach convergence of the ionization process. Total, double, and triple autoionization rates are obtained by propagating the time-dependent close-coupled equations in real time, using integration over bound and continuum single-particle states. These states are generated by matrix diagonalization of one-electron Hamiltonians. The total autoionization rate for each L excited state is found to be slightly above the single autoionization rate for the excited configuration obtained using configuration-average distorted-wave theory. As expected, we find the double and triple autoionization rates to be much smaller than the total autoionization rates. Future work can extend this approach to study electron-impact triple ionization of atoms or ions. The work was supported in part by grants from the American University of Sharjah and the US Department of Energy. Computational work was carried out at the National Energy Research Scientific Computing Center (NERSC) in Berkeley, California, USA.
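The imaginary-time relaxation used to prepare initial states can be illustrated in one dimension. The following sketch is a hypothetical stand-in (a single particle in a harmonic potential on a uniform grid, in atomic units), not the four-dimensional close-coupled code: propagating dψ/dτ = -Hψ and renormalizing at each step damps the excited components and leaves the ground state.

```python
import numpy as np

# 1-D imaginary-time relaxation on a uniform grid (illustrative parameters).
n, L = 400, 20.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
V = 0.5 * x**2                      # harmonic potential (atomic units)
psi = np.exp(-x**2)                 # arbitrary even trial wave function
dt = 1e-3                           # imaginary-time step (below the dx**2 stability limit)

for _ in range(20000):
    # second-order finite-difference Laplacian (periodic wrap is harmless:
    # psi is negligible at the box edges)
    lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
    psi = psi + dt * (0.5 * lap - V * psi)   # Euler step of dpsi/dtau = -H psi
    psi /= np.sqrt(np.sum(psi**2) * dx)      # renormalize each step

lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
E = np.sum(psi * (-0.5 * lap + V * psi)) * dx
print(round(E, 3))   # close to the exact ground-state energy 0.5
```

Because every excited component decays as exp(-(E_k - E_0)τ) relative to the ground state, the renormalized wave function converges to the lowest eigenstate; the paper's method applies the same idea on a four-dimensional lattice with Schmidt orthogonalization to reach excited configurations.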

Keywords: hollow atoms, autoionization, Auger rates, time-dependent close-coupling method

Procedia PDF Downloads 153
11987 Fabrication of Aluminum Nitride Thick Layers by Modified Reactive Plasma Spraying

Authors: Cécile Dufloux, Klaus Böttcher, Heike Oppermann, Jürgen Wollweber

Abstract:

Hexagonal aluminum nitride (AlN) is a promising candidate for several wide-bandgap semiconductor applications such as deep-UV light-emitting diodes (UVC LEDs) and fast power transistors (HEMTs). To date, bulk AlN single crystals are still commonly grown by physical vapor transport (PVT). Single-crystalline AlN wafers obtained from this process could offer suitable substrates for defect-free growth of the ultimately active AlGaN layers; however, these wafers are still limited by small sizes, restricted delivery quantities, and high prices. Although there is already increasing interest in the commercial availability of AlN wafers, comparatively cheap Si, SiC, or sapphire is still predominantly used as the substrate material for the deposition of active AlGaN layers. Nevertheless, due to a lattice mismatch of up to 20%, the obtained material shows high defect densities and is therefore less suitable for the high-power devices described above. The use of AlN with specially adapted properties for optical and sensor applications could therefore be promising for mass-market products, which seem to have fewer requirements. To respond to the demand for suitable AlN target material for the growth of AlGaN layers, we have designed an innovative technology based on reactive plasma spraying. The goal is to produce coarse-grained AlN boules with an N-terminated columnar structure and high purity. In this process, aluminum is injected into a microwave-stimulated nitrogen plasma, and AlN, the product of the reaction between the aluminum and the plasma-activated N2, is deposited onto the target. We used an aluminum filament as the initial material to minimize oxygen contamination during the process. The material was guided through the nitrogen plasma at a mass turnover of 10 g/h. To avoid impurity contamination from erosion of electrodes, an electrodeless discharge was used for plasma ignition. The pressure was maintained at 600-700 mbar, so the plasma reached a temperature high enough to vaporize the aluminum, which subsequently reacted with the surrounding plasma. The obtained products consist of thick polycrystalline AlN layers with a diameter of 2-3 cm. The crystallinity was determined by X-ray crystallography, and the grain structure was systematically investigated by optical and scanning electron microscopy. Furthermore, we performed Raman spectroscopy to provide evidence of stress in the layers. This paper will discuss the effects of process parameters such as microwave power and deposition geometry (specimen holder, radiation shields, ...) on the topography, crystallinity, and stress distribution of the AlN.

Keywords: aluminum nitride, polycrystal, reactive plasma spraying, semiconductor

Procedia PDF Downloads 280
11986 The Syllable Structure and Syllable Processes in Shuwa Arabic: An Autosegmental Analysis

Authors: Muhammad Yaqub Olatunde

Abstract:

Arabic linguistic science is redirecting its focus towards the analysis and description of social, regional, and temporal varieties in order to show how they vary in pronunciation, vocabulary, and grammar. This is not to say that traditional Arabic linguists did not mention scores of dialectal variations, but such works focused on the geographical boundaries of the Arabic-speaking countries. There is a need for a comprehensive survey of the various Arabic dialects within and outside the boundaries of Arabic-speaking countries, showing both the similarities and differences in linguistic and extra-linguistic elements. This study therefore examines syllable structure and syllable processes in the noun and the verb of the Shuwa Arabic dialect spoken in North East Nigeria (mainly in Borno State). The work seeks to establish the facts about this phenomenon using autosegmental analysis. These facts are compared, where necessary, using possible alternative analyses, with what operates in other related dialects within and outside the Arabic-speaking countries. The interaction between epenthesis and gemination in the language also generates an interesting issue. The paper then concludes that syllable structure and syllable processes in the language need to recognize the existence of a complex onset and a complex rhyme, producing a consonant cluster in the former and a closed syllable in the latter. This emerges as a result of resyllabification, which is motivated by these processes.

Keywords: Arabic, dialect, linguistics, processes, resyllabification

Procedia PDF Downloads 420
11985 Effect of Polymer Residues for Wastewater Treatment from Petroleum Production

Authors: Chayonnat Thanamun, Kreangkrai Maneeintr

Abstract:

In the petroleum industry, polymer flooding is one of the main methods of enhanced oil recovery (EOR); it uses a water-soluble polymer such as partially hydrolyzed polyacrylamide (HPAM), added to the flooding water to improve the mobility ratio in the flooding process. During polymer flooding, water is produced as a by-product along with the oil and gas. This produced water is a mixture of inorganic and organic compounds and is more difficult to treat than that from water flooding. In this work, the effect of HPAM residues on the treatment of wastewater from polymer flooding is studied, with polyaluminium chloride (PAC) selected as the flocculant. Therefore, the objective of this study is to evaluate the effect of polymer residues in produced water on wastewater treatment using PAC. The operating parameters of this study are flocculant dosages of 300, 400, and 500 mg/L, temperatures from 30 to 50 degrees Celsius, and HPAM concentrations of 500, 1000, and 2000 mg/L. Furthermore, the turbidity as well as the total suspended solids (TSS) are also studied. The results indicate that with an increase in HPAM concentration, the TSS and turbidity increase gradually with increasing coagulant dosage at the same temperature. Also, the coagulation-flocculation performance improves with increasing temperature. These findings can be applied to the treatment of wastewater from oil production before the water is injected back into the reservoir.

Keywords: wastewater treatment, petroleum production, polyaluminium chloride, polyacrylamide

Procedia PDF Downloads 152
11984 The Effect of Austempering Temperature on Anisotropy of TRIP Steel

Authors: Abdolreza Heidari Noosh Abad, Amir Abedi, Davood Mirahmadi khaki

Abstract:

The high strength and formability of TRIP steels are the major reasons for their wide use in the automobile industry. Deep drawing, a common sheet-metal manufacturing process, is used extensively in modern industry, particularly the automobile industry. To investigate the deep-drawing potential of a material, the anisotropy of the steel sheet is studied and expressed as the R-value. TRIP steels have a multi-phase microstructure consisting typically of ferrite, bainite, and retained austenite, with the retained austenite appearing to be the most effective phase in the microstructure. In the present research, the Taguchi method has been employed to investigate the effect of austempering temperature on the anisotropy of a TRIP steel. For this purpose, a steel with a chemical composition of 0.196C-1.42Si-1.41Mn was annealed at 810 °C and then austempered at 340-460 °C for 3, 6, and 9 minutes. The results show that the R-value increases with austempering temperature: as the austempering temperature rises, the retained austenite grain size increases, as does its solubility, which raises the R-value. According to the Taguchi analysis, the p-value for austempering temperature is less than 0.05, indicating a statistically significant effect on the R-value.
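In a Taguchi analysis where a larger response is better, as with the R-value here, the signal-to-noise ratio S/N = -10·log10(mean(1/y²)) is commonly used to rank factor levels. A small sketch with hypothetical R-values (illustrative numbers, not the paper's data) shows the computation:

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio:
    S/N = -10 * log10(mean(1 / y_i^2)). A higher S/N indicates a better,
    more robust response; here the response is the R-value."""
    return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

# Hypothetical R-value replicates at three austempering temperatures (deg C):
r_values = {340: [0.82, 0.85, 0.84], 400: [0.95, 0.97, 0.96], 460: [1.10, 1.12, 1.09]}
for temp_c, ys in sorted(r_values.items()):
    # With these illustrative numbers, S/N rises with temperature,
    # mirroring the reported trend of R-value with austempering temperature.
    print(temp_c, round(sn_larger_is_better(ys), 2))
```

Comparing mean S/N across levels (and the associated ANOVA p-values) is how the Taguchi method identifies which factor levels significantly affect the response.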

Keywords: Taguchi method, hot rolling, thermomechanical process, anisotropy, R-value

Procedia PDF Downloads 323
11983 Color Conversion Films with CuInS2/ZnS Quantum Dots Embedded Polystyrene Nanofibers by Electrospinning Process

Authors: Wonkyung Na, Namhun Kim, Heeyeop Chae

Abstract:

Quantum dots (QDs) are attracting attention due to their excellent optical properties in display, solar cell, biomolecule detection, and lighting applications. Their energy band gap can be easily tuned by controlling their size, which makes QDs particularly suitable for light-emitting diode (LED) and lighting applications. Typically, cadmium (Cd)-containing QDs show a narrow photoluminescence (PL) spectrum and a high quantum yield. However, Cd is classified as a hazardous material, and its use is tightly regulated below the 100 ppm level in many countries. InP and CuInS2 (CIS) are being investigated as Cd-free QD materials, and it has recently been demonstrated that the performance of these Cd-free QDs is comparable to that of their Cd-based rivals. Due to their broad emission spectrum, CuInS2 QDs are also well suited to white LEDs. For lighting applications, the QDs must be incorporated into color conversion films, and various film processes with QDs in polymer matrices have been reported. In this work, we synthesized CuInS2 (CIS) QDs, and QD-embedded polystyrene color conversion films were fabricated for white color emission with an electrospinning process. As a result, blue light from a blue LED is converted to white light with a high color rendering index (CRI) of 72 by the color conversion films.

Keywords: CuInS2/ZnS, electro-spinning, color conversion films, white light emitting diodes

Procedia PDF Downloads 812
11982 Impact Logistic Management to Reduce Costs

Authors: Waleerak Sittisom

Abstract:

The objectives of this research were to analyze transportation route management and to identify potential cost reductions in logistics operations. In-depth interview techniques and small group discussions were utilized with 25 participants from various backgrounds in the area of logistics. The findings revealed four areas in which companies were able to effectively reduce logistics costs: managing the space within the transportation vehicles, managing transportation personnel, managing transportation cost, and managing control of transportation. On the other hand, there were four areas in which companies were unable to effectively reduce logistics costs: the working process of transportation, transportation route planning, service point management, and technology management. Five areas were identified in which cost reduction is feasible: personnel management, working processes, route planning, service point planning, and technology implementation. To reduce costs, transportation companies should suggest that customers use a file system to save truck space. The companies also need to adopt new technology to manage their information systems so that packages can be handled easily, safely, and quickly. Staff need to be trained regularly to increase knowledge and skills, and teamwork is required to effectively reduce costs.

Keywords: cost reduction, management, logistics, transportation

Procedia PDF Downloads 497
11981 Application of Simulated Annealing to Threshold Optimization in Distributed OS-CFAR System

Authors: L. Abdou, O. Taibaoui, A. Moumen, A. Talib Ahmed

Abstract:

This paper proposes an application of simulated annealing to optimize the detection threshold in an ordered-statistics constant false alarm rate (OS-CFAR) system. Conventional optimization methods, such as the conjugate gradient, can converge to a local optimum and miss the global optimum. Moreover, for a system with three or more sensors, it is difficult or impossible to find this optimum with such methods; hence the need for other approaches, such as meta-heuristics. Among the variety of meta-heuristic techniques is the simulated annealing (SA) method, inspired by a process used in metallurgy. This technique is based on selecting an initial solution and randomly generating a nearby solution in order to improve the criterion being optimized. In this work, two parameters are subject to such optimization: the statistical order (k) and the scaling factor (T). Two fusion rules, "AND" and "OR", were considered in the case where the signals are independent from sensor to sensor. The results showed that the proposed method is effective in solving such optimization problems in a distributed system. The advantage of this method is that it explores the entire solution space and, in theory, avoids stagnation of the optimization process in the neighborhood of a local minimum.
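The acceptance rule described above, always taking improvements and accepting worse candidates with probability exp(-Δ/temperature), can be sketched generically. The following is an illustrative sketch on a simple multimodal 1-D objective, not the paper's OS-CFAR cost function; the cooling schedule and parameters are arbitrary:

```python
import math
import random

def simulated_annealing(cost, start, neighbor, t0=20.0, cooling=0.999, steps=20000, seed=1):
    """Generic simulated annealing: always accept improvements, and accept a
    worse candidate with probability exp(-delta / temperature), which lets the
    search climb out of local minima while the temperature is high."""
    rng = random.Random(seed)
    x, fx = start, cost(start)
    best, fbest = x, fx
    temp = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / temp):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        temp *= cooling          # geometric cooling schedule
    return best, fbest

# A multimodal 1-D objective with many local minima (global minimum at x = 0)
# stands in for the (k, T) threshold-optimization problem of the paper.
cost = lambda x: x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
best, fbest = simulated_annealing(cost, start=4.0, neighbor=step)
print(best, fbest)
```

At high temperature almost any move is accepted, so the search roams the whole solution space; as the temperature decays, uphill moves become rare and the search settles into a deep basin, which is exactly the escape-from-local-minima behavior the paper exploits.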

Keywords: distributed system, OS-CFAR system, independent sensors, simulated annealing

Procedia PDF Downloads 496
11980 New Bio-Strategies for Ochratoxin a Detoxification Using Lactic Acid Bacteria

Authors: José Maria, Vânia Laranjo, Luís Abrunhosa, António Inês

Abstract:

The occurrence of mycotoxigenic moulds such as Aspergillus, Penicillium, and Fusarium in food and feed has an important impact on public health through the appearance of acute and chronic mycotoxicoses in humans and animals, which is more severe in developing countries due to lack of food security, poverty, and malnutrition. This mould contamination also constitutes a major economic problem due to the loss of crop production. A great variety of filamentous fungi are able to produce highly toxic secondary metabolites known as mycotoxins. Most mycotoxins are carcinogenic, mutagenic, neurotoxic, and immunosuppressive, with ochratoxin A (OTA) being one of the most important. OTA is toxic to animals and humans, mainly due to its nephrotoxic properties. Several approaches have been developed for the decontamination of mycotoxins in foods, such as prevention of contamination, biodegradation of mycotoxin-containing food and feed with microorganisms or enzymes, and inhibition or absorption of the mycotoxin content of consumed food in the digestive tract. A group of Gram-positive bacteria named lactic acid bacteria (LAB) are able to release molecules that can influence mould growth, improving the shelf life of many fermented products and reducing health risks due to exposure to mycotoxins, and some LAB are capable of mycotoxin detoxification. Recently, our group was the first to describe the ability of LAB strains to biodegrade OTA, more specifically Pediococcus parvulus strains isolated from Douro wines. The pathway of this biodegradation had been identified previously in other microorganisms: OTA can be degraded through hydrolysis of the amide bond that links the L-β-phenylalanine molecule to ochratoxin alpha (OTα), a non-toxic compound. It is known that peptidases from different origins can mediate this hydrolysis, such as carboxypeptidase A, an enzyme from the bovine pancreas, a commercial lipase, and several commercial proteases. We therefore wanted to gain a better understanding of this OTA degradation process when LAB are involved and to identify which molecules are present in this process. To achieve our aim, we used bioinformatics tools (BLAST, CLUSTALX2, CLC Sequence Viewer 7, Finch TV), designed specific primers, and performed gene-specific PCRs. The template DNA came from LAB strain samples from our previous work and from other LAB strains isolated from elderberry fruit, silage, milk, and sausages. Through the use of bioinformatics tools, it was possible to identify several proteins belonging to the carboxypeptidase family that participate in the process of OTA degradation, such as serine-type D-Ala-D-Ala carboxypeptidase and membrane carboxypeptidase. In conclusion, this work identified carboxypeptidase proteins as among the molecules present in the OTA degradation process when LAB are involved.

Keywords: carboxypeptidase, lactic acid bacteria, mycotoxins, ochratoxin A

Procedia PDF Downloads 461
11979 User-Driven Product Line Engineering for Assembling Large Families of Software

Authors: Zhaopeng Xuan, Yuan Bian, C. Cailleaux, Jing Qin, S. Traore

Abstract:

Traditional software engineering allows engineers to propose to their clients multiple specialized software distributions assembled from a shared set of software assets. The management of these assets, however, requires a trade-off between client satisfaction and the software engineering process. Clients find it increasingly difficult to locate a distribution or components matching their needs across all the distributed repositories. This paper proposes a software engineering approach for a user-driven software product line in which engineers define a feature model but users drive the actual software distribution on demand. This approach makes the user the final actor, acting as a release manager in the software engineering process, which increases user satisfaction with the product and simplifies the operations needed to find required components. In addition, it provides a way for engineers to manage and assemble large software families. As a proof of concept, a user-driven software product line is implemented for Eclipse, an integrated development environment. An Eclipse feature model is defined and exposed to users on a cloud-based build platform from which clients can download individualized Eclipse distributions.

Keywords: software product line, model-driven development, reverse engineering and refactoring, agile method

Procedia PDF Downloads 430