Search results for: open flow
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7594

4474 Non–Geometric Sensitivities Using the Adjoint Method

Authors: Marcelo Hayashi, João Lima, Bruno Chieregatti, Ernani Volpe

Abstract:

The adjoint method has been used as a successful tool to obtain sensitivity gradients in aerodynamic design and optimisation for many years. This work presents an alternative approach to the continuous adjoint formulation that enables one to compute gradients of a given measure of merit with respect to control parameters other than those pertaining to geometry. The procedure is then applied to the steady 2–D compressible Euler and incompressible Navier–Stokes flow equations. Finally, the results are compared with sensitivities obtained by finite differences and theoretical values for validation.
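The adjoint trick is easiest to see in a discrete setting. The sketch below is a minimal illustration under assumed data (a 2x2 linear state equation and a linear measure of merit), not the paper's continuous formulation for the Euler or Navier-Stokes equations: one extra adjoint solve yields the gradient of the merit with respect to a control parameter, which is then checked against a finite difference, as the abstract does for validation.

```python
# Discrete-adjoint sketch (illustrative only; the paper uses a continuous
# adjoint formulation). For a state equation A(p) u = f and merit J = g.u,
# the adjoint lam solves A^T lam = g, and dJ/dp = -lam . (dA/dp) u.

def solve2(a, b):
    """Solve a 2x2 linear system a x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - a[0][1] * b[1]) / det,
            (a[0][0] * b[1] - b[0] * a[1][0]) / det]

def merit_and_gradient(p):
    A = [[2.0 + p, 1.0], [1.0, 3.0]]            # state operator A(p)
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
    f, g = [1.0, 2.0], [1.0, 1.0]               # source term, merit weights
    u = solve2(A, f)                            # state solve
    lam = solve2(At, g)                         # single adjoint solve
    # dA/dp = [[1, 0], [0, 0]], so (dA/dp) u = [u[0], 0]
    dJ_dp = -lam[0] * u[0]
    J = g[0] * u[0] + g[1] * u[1]
    return J, dJ_dp

# Validate the adjoint gradient against a finite difference.
J0, grad = merit_and_gradient(0.5)
J1, _ = merit_and_gradient(0.5 + 1e-7)
print(abs(grad - (J1 - J0) / 1e-7) < 1e-4)  # → True
```

The point of the construction is that one adjoint solve gives the gradient with respect to any number of control parameters, geometric or not, at the cost of roughly one extra state solve.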

Keywords: adjoint method, aerodynamics, sensitivity theory, non-geometric sensitivities

Procedia PDF Downloads 547
4473 Sustainable Manufacturing of Concentrated Latex and Ribbed Smoked Sheets in Sri Lanka

Authors: Pasan Dunuwila, V. H. L. Rodrigo, Naohiro Goto

Abstract:

Sri Lanka is one of the largest natural rubber (NR) producers in the world, and the NR industry is a major foreign exchange earner. Among locally manufactured NR products, concentrated latex (CL) and ribbed smoked sheets (RSS) hold a significant position. Furthermore, these products are the foundation for many goods used by people all over the world (e.g., gloves, condoms, tires). Processing of CL and RSS costs a significant amount of material, energy, and workforce. Against this background, both manufacturing lines have been immensely challenged by waste, low productivity, lack of cost efficiency, rising production costs, and many environmental issues. To face these challenges, the adoption of sustainable manufacturing measures that use less energy, water, and materials and produce less waste is imperative. However, these sectors lack comprehensive studies that shed light on such measures and thoroughly discuss their improvement potential from both environmental and economic points of view. Therefore, based on a study of three CL and three RSS mills in Sri Lanka, this study deploys sustainable manufacturing techniques and tools to uncover the underlying potential to improve performance in the CL and RSS processing sectors. The study comprises three steps: 1. quantification of average material waste, economic losses, and greenhouse gas (GHG) emissions via material flow analysis (MFA), material flow cost accounting (MFCA), and life cycle assessment (LCA) in each manufacturing process; 2. identification of improvement options with the help of Pareto and what-if analyses, field interviews, and the existing literature; and 3. validation of the identified improvement options via re-execution of MFA, MFCA, and LCA. With this methodology, the economic and environmental hotspots and the degrees of improvement in both systems could be identified.
Results highlighted that each process could be improved to produce less waste, fewer monetary losses, lower manufacturing costs, and fewer GHG emissions. Conclusively, the study's methodology and findings are believed to be beneficial for assuring sustainable growth, not only in the Sri Lankan NR processing sector itself but also in the NR industry or any other industry rooted in other developing countries.

Keywords: concentrated latex, natural rubber, ribbed smoked sheets, Sri Lanka

Procedia PDF Downloads 261
4472 Teamwork on Innovation in Young Enterprises: A Qualitative Analysis

Authors: Polina Trusova

Abstract:

The majority of young enterprises are founded and run by teams and develop new, innovative products or services. While problems within the team are considered an important reason for the failure of young enterprises, effective teamwork on innovation may be a key success factor. It may require special teamwork design or members' creativity not needed during routine work. However, little is known about how young enterprises develop innovative solutions in teams, what makes their teamwork special, and what influences its effectiveness. Extending this knowledge is essential for understanding the success and failure factors of young enterprises. Previous research has focused on working on innovation or on professional teams in general. The rare studies combining these issues usually concentrate on homogeneous groups such as IT expert teams in innovation projects of big, well-established firms. The transferability of those studies' findings to the entrepreneurial context is doubtful, as there are several reasons why teamwork should differ significantly between big, well-established firms and young enterprises. First, teamwork is conducted by team members, e.g., employees. The personality of employees in young enterprises, in contrast to that of employees in established firms, has been shown to be more similar to the personality of entrepreneurs. As entrepreneurs have been found to be more open to experience and less risk averse, this may have a positive impact on their teamwork. Persons open to novelty are more likely to develop or accept a creative solution, which is especially important for teamwork on innovation. Secondly, young enterprises are often characterized by a flat hierarchy, so teamwork should generally be more participative there. It encourages each member (and not only the founder) to produce and discuss innovative ideas, increasing their variety and enabling the team to select the best idea from a larger idea pool.
Thirdly, teams in young enterprises are often multidisciplinary. This has some advantages but also increases the risk of internal conflicts, making teamwork less effective. Despite the key role of teamwork on innovation and the presented barriers to transferring existing evidence to the context of young enterprises, only a few researchers have addressed this issue. To close the existing research gap, to explore and understand how innovations are developed in teams of young enterprises, and to identify which factors influencing teamwork may be especially relevant for such teams, a qualitative study has been developed. The study, consisting of 20 semi-structured interviews with (co-)founders of young innovative enterprises in the UK and USA, started in September 2017. The interview guide comprises, but is not limited to, teamwork dimensions discussed in the literature, such as members' skill or authority differentiation. Data will be evaluated following the rules of qualitative content analysis. First results indicate some factors that may be especially relevant for teamwork in young innovative enterprises. They will enrich the scientific discussion and provide the evidence needed to test possible causality between the identified factors and teamwork effectiveness in future research on young innovative enterprises. Results and their discussion will be presented at the conference.

Keywords: innovation, qualitative study, teamwork, young enterprises

Procedia PDF Downloads 198
4471 Numerical Modeling of the Depth-Averaged Flow over a Hill

Authors: Anna Avramenko, Heikki Haario

Abstract:

This paper reports the development and application of a 2D depth-averaged model. The main goal of this contribution is to apply the depth-averaged equations to a wind park model in which the geometry is introduced into the mathematical model through mass and momentum source terms. The depth-averaged model will be used in the future to find the optimal position of wind turbines in the wind park. The k-ε and 2D LES turbulence models were considered in this article. 2D CFD simulations over a single hill were performed to check the depth-averaged model in practice.

Keywords: depth-averaged equations, numerical modeling, CFD, wind park model

Procedia PDF Downloads 603
4470 Development of Plantar Insoles Reinforcement Using Biocomposites

Authors: A. C. Vidal, D. R. Mulinari, C. F. Bandeira, S. R. Montoro

Abstract:

Because of the great stress the foot endures during movement, it is of great importance to have a shoe with a proper structure and an excellent supporting tread to prevent immediate and long-term consequences in all parts of the body. In this sense, new insole reinforcements with high impact absorption were developed in this work from a polyurethane (PU) biocomposite derived from castor oil, reinforced or not with palm fibers. These insoles were obtained from the mixture of polyol with prepolymer (diisocyanate) and were subsequently evaluated morphologically, mechanically, and by thermal analysis. The results revealed that the biocomposites showed lower flexural strength, higher impact strength, and open interconnected pores in their microstructure, but with smaller cells and a slightly higher degradation temperature compared to the marketed material, showing interesting properties for a possible application as insole reinforcement.

Keywords: composite, polyurethane insole, palm fibers, plantar insoles reinforcement

Procedia PDF Downloads 417
4469 Scalable Learning of Tree-Based Models on Sparsely Representable Data

Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou

Abstract:

Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated in the current version of scikit-learn, the most popular open-source Python machine learning library.
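The core idea can be sketched in plain Python under simplifying assumptions (a single non-negative feature stored as an index-to-value map; this is an illustration of the principle, not scikit-learn's actual implementation): for any positive split threshold, every implicitly-zero sample falls on the same side, so the class counts needed to score a candidate split can be computed by touching only the non-zero entries.

```python
def split_counts(nnz, labels, n_samples, n_pos, t):
    """Count (left_n, left_pos, right_n, right_pos) for threshold t > 0
    on a non-negative sparse feature, visiting only non-zero entries.
    nnz maps sample index -> value; absent samples hold value 0 and
    therefore always fall on the left side of the split."""
    right_n = right_pos = 0
    for i, v in nnz.items():          # O(nnz), not O(n_samples)
        if v > t:
            right_n += 1
            right_pos += labels[i]
    # Left-side counts follow from the totals without a full scan.
    return n_samples - right_n, n_pos - right_pos, right_n, right_pos

labels = [1, 0, 1, 0, 0, 1]                  # binary class labels
nnz = {0: 2.0, 2: 0.5, 5: 3.0}               # sparse column, 3 of 6 non-zero
print(split_counts(nnz, labels, 6, 3, 1.0))  # → (4, 1, 2, 2)
```

From these four counts, an impurity criterion such as the Gini index can be evaluated per candidate threshold, keeping the whole split search proportional to the number of non-zero elements.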

Keywords: big data, sparsely representable data, tree-based models, scalable learning

Procedia PDF Downloads 263
4468 Teachers’ Awareness of the Significance of Lifelong Learning: A Case Study of Secondary School Teachers of Batna - Algeria

Authors: Bahloul Amel

Abstract:

This study is an attempt to raise the awareness of stakeholders and authorities about the sensitivity of Algerian secondary school teachers of English as a Foreign Language to students' loss of the English language skills learned during formal schooling with effort and at great expense, and about the measures supposed to arrest that loss. Data were collected from secondary school teachers of EFL and analyzed quantitatively using a questionnaire containing open-ended and close-ended questions. The results advocate a consensus about the need for actions to make assessment techniques outcome-oriented. Most participants were in favor of including curricular activities involving contextualized learning, problem-solving learning, critical self-awareness, self- and peer-assisted learning, and the use of computers and the Internet so as to make learners autonomous.

Keywords: lifelong learning, EFL, contextualized learning, Algeria

Procedia PDF Downloads 348
4467 Distorted Document Images Dataset for Text Detection and Recognition

Authors: Ilia Zharikov, Philipp Nikitin, Ilia Vasiliev, Vladimir Dokholyan

Abstract:

With the increasing popularity of document analysis and recognition systems, text detection (TD) and optical character recognition (OCR) in document images have become challenging tasks. However, to the best of our knowledge, no publicly available datasets for these particular problems exist. In this paper, we introduce the Distorted Document Images dataset (DDI-100) and provide a detailed analysis of DDI-100 in its current state. To create the dataset, we collected 7000 unique document pages and extended them by applying different types of distortions and geometric transformations. In total, DDI-100 contains more than 100,000 document images together with binary text masks and text and character locations in terms of bounding boxes. We also present an analysis of several state-of-the-art TD and OCR approaches on the presented dataset. Lastly, we demonstrate the usefulness of DDI-100 for improving the accuracy and stability of the considered TD and OCR models.

Keywords: document analysis, open dataset, optical character recognition, text detection

Procedia PDF Downloads 173
4466 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes of reporting data differ depending on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form a dataset constructed for each time point that contains all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study uses Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time for exceeding the boundaries of the standard normal distribution. In the study area, the test has not been widely applied, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that are more likely to extract the best solution. For freight delivery management, schemas of genetic algorithm structure are used as a more effective technique. Accordingly, an adaptive genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for multi-objective analysis that evaluates collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
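The Grubbs data-cleaning step can be sketched as follows; the fuel readings and the tabulated critical value (for n = 8 at the 99% confidence level) are illustrative assumptions, not figures from the study.

```python
import statistics

# One-sided Grubbs statistic: the largest absolute deviation from the
# sample mean, scaled by the sample standard deviation. A reading is
# flagged when G exceeds the tabulated critical value for (n, alpha).

def grubbs_statistic(data):
    mean = statistics.mean(data)
    sd = statistics.stdev(data)   # sample (n-1) standard deviation
    return max(abs(x - mean) for x in data) / sd

fuel = [12.1, 11.8, 12.3, 11.9, 12.0, 12.2, 11.7, 18.4]  # made-up l/100 km
G = grubbs_statistic(fuel)
G_CRIT = 2.27   # assumed tabulated value for n = 8 at 99% confidence
print(G > G_CRIT)  # → True: the extreme reading is flagged as an outlier
```

In a full implementation the flagged value would be removed and the test repeated until no observation exceeds the critical value.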

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 180
4465 Students’ Opinions Related to Virtual Classrooms within the Online Distance Education Graduate Program

Authors: Secil Kaya Gulen

Abstract:

Face-to-face and virtual classrooms emerged under different conditions and in different environments, but with similar purposes, and they have different characteristics. Although virtual classrooms share some facilities with face-to-face classes, such as a program, students, and administrators, they have no walls and corridors. Therefore, students can attend courses from a distance and can control their own learning spaces. Virtual classrooms are defined as simultaneous online environments where students in different places come together at the same time under the guidance of a teacher. Distance education and virtual classes require different intellectual and managerial skills and models. Therefore, for the effective use of virtual classrooms, their virtual nature should be taken into consideration. One of the most important factors affecting the spread and effective use of virtual classrooms is the perceptions and opinions of students, as one of the main groups of participants. Student opinions and recommendations are important because they provide information about the fulfillment of expectations. This will help to improve the applications and contribute to more efficient implementations. In this context, this study determined students' general ideas and perceptions related to virtual classrooms. The advantages and disadvantages of virtual classrooms, their expected contributions to the educational system, and the expected characteristics of virtual classrooms were examined. Students of an online distance education graduate program in which all courses are offered through virtual classrooms were asked for their opinions. The Online Distance Education Graduate Program has 19 students in total. A questionnaire consisting of open-ended and multiple-choice questions was sent to these 19 students, and 12 of them answered it. The analysis of the data is presented as frequencies and percentages for each item.
SPSS was used for the multiple-choice questions and NVivo for the open-ended questions. According to the results of the analysis, participants stated that they did not receive any training on virtual classes before the courses, but they emphasized that newly enrolled students should be educated about virtual classrooms. In addition, all participants mentioned that virtual classrooms contribute to their personal development and that they want to improve their skills by gaining more experience. The participants, who mainly emphasized the advantages of virtual classrooms, expressed that the dissemination of virtual classrooms will contribute to the Turkish education system. Among the advantages of virtual classrooms, 'recordable and repeatable lessons' and 'eliminating access and transportation costs' were the most commonly cited by participants. On the other hand, they mentioned 'technological features and keyboard usage skills affect attendance' as the most common disadvantage. The participants' most frequently reported problem during virtual lectures was 'lack of technical support'. Finally, 'ease of use', 'support possibilities', 'communication level', and 'flexibility' come to the forefront among the expected features of virtual classrooms. All in all, students' opinions about virtual classrooms seem to be generally positive. Designing and managing virtual classrooms according to the prioritized features will increase student satisfaction and contribute to more effective applications.

Keywords: distance education, virtual classrooms, higher education, e-learning

Procedia PDF Downloads 269
4464 Competitiveness of African Countries through Open Quintuple Helix Model

Authors: B. G. C. Ahodode, S. Fekkaklouhail

Abstract:

Following the triple helix theory, this study aims to evaluate the effect of the innovation system on African countries' competitiveness while taking external contributions into account, given that developing countries (especially African countries) are characterized by weak innovation systems whose synergy operates more at the foreign level than at the domestic and global levels. To do this, we used correlation tests, parsimonious regression techniques, and panel estimation between 2013 and 2016. Results show that the degree of innovation synergy has a significant effect on competitiveness in Africa. Specifically, while the opening system (OPESYS) and social system (SOCSYS) contribute, in order of importance, 0.634 and 0.284 points of increase in the GCI (significant at the 1% level), the political system (POLSYS) and educational system (EDUSYS) increase it by only 0.322 and 0.169 at the 5% significance level, while the effect of the economic system (ECOSYS) on the Global Competitiveness Index is not significant.

Keywords: innovation system, innovation, competitiveness, Africa

Procedia PDF Downloads 69
4463 Video Based Ambient Smoke Detection By Detecting Directional Contrast Decrease

Authors: Omair Ghori, Anton Stadler, Stefan Wilk, Wolfgang Effelsberg

Abstract:

Fire-related incidents account for extensive loss of life and material damage. Quick and reliable detection of occurring fires therefore has significant real-world implications. Whereas a major research focus lies on the detection of outdoor fires, indoor camera-based fire detection is still an open issue. Cameras in combination with computer vision help to detect flames and smoke more quickly than conventional fire detectors. In this work, we present a computer vision-based smoke detection algorithm based on contrast changes and a multi-step classification. This work accelerates computer vision-based fire detection considerably in comparison with classical indoor fire detection.

Keywords: contrast analysis, early fire detection, video smoke detection, video surveillance

Procedia PDF Downloads 447
4462 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem

Authors: C. E. Nugraheni, L. Abednego

Abstract:

This paper is concerned with the minimization of mean tardiness and flow time in a real single machine production scheduling problem. Two variants of a genetic algorithm as a meta-heuristic, combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are used to solve instances generated with real-world data from a company. Encouraging results are reported.
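The two objectives are straightforward to evaluate for a fixed job sequence, which is what any meta- or hyper-heuristic must do repeatedly as its fitness function. The sketch below uses made-up job data (the company instances are not public):

```python
# Evaluate the two objectives on a single machine: jobs are
# (processing_time, due_date) pairs, processed in the given order,
# all released at time zero.

def mean_tardiness_and_flow_time(jobs):
    t = 0.0                            # running completion time
    tardiness = flow = 0.0
    for proc, due in jobs:
        t += proc                      # this job completes at time t
        flow += t                      # flow time = completion - release
        tardiness += max(0.0, t - due)
    n = len(jobs)
    return tardiness / n, flow / n

jobs = [(2, 1), (2, 2), (2, 6)]        # illustrative data only
print(mean_tardiness_and_flow_time(jobs))  # → (1.0, 4.0)
```

A genetic algorithm would treat the job ordering as the chromosome and this pair of values (or a weighted combination) as the fitness to minimize.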

Keywords: hyper-heuristics, evolutionary algorithms, production scheduling, meta-heuristic

Procedia PDF Downloads 381
4461 Chatbots in Education: Case of Development Using a Chatbot Development Platform

Authors: Dulani Jayasuriya

Abstract:

This study outlines the development steps of a chatbot for the administrative purposes of a large undergraduate course. The chatbot is able to handle student queries about administrative details, including assessment deadlines, course documentation, how to navigate the course, group formation, etc. The development screenshots are from a free account on the SnatchBot platform, so the approach can be adopted by the wider public. While only one connection from possible keywords to an answer is shown here, multiple connections leading to different answers based on different keywords must be developed for the actual chatbot to function. The overall flow of the chatbot, showing the connections between different interactions, is depicted at the end.
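The keyword-to-answer connections described above can be illustrated in plain Python; the triggers and answers below are hypothetical examples, and the sketch only mimics the routing logic configured on the platform, not SnatchBot's API.

```python
# Each "connection" maps a set of trigger keywords to one canned
# administrative answer; unmatched messages fall through to a default.
CONNECTIONS = [
    ({"deadline", "due"}, "Assessment 1 is due in week 6; see the course page."),
    ({"group", "team"}, "Groups of four are formed via the sign-up sheet."),
    ({"syllabus", "outline"}, "The course outline is under Course Documents."),
]
FALLBACK = "Sorry, I did not understand. Try asking about deadlines or groups."

def reply(message):
    # Normalize: lowercase and strip trailing punctuation from each word.
    words = {w.strip("?,.!") for w in message.lower().split()}
    for keywords, answer in CONNECTIONS:
        if words & keywords:           # any trigger keyword matches
            return answer
    return FALLBACK

print(reply("When is the deadline?"))
```

Building a usable bot amounts to adding one such connection per administrative topic, exactly as the screenshots in the study do through the platform's interface.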

Keywords: chatbots, education, technology, snatch bot, artificial intelligence

Procedia PDF Downloads 104
4460 Anesthesia for Spinal Stabilization Using Neuromuscular Blocking Agents in Dog: Case Report

Authors: Agata Migdalska, Joanna Berczynska, Ewa Bieniek, Jacek Sterna

Abstract:

Muscle relaxation is considered important during general anesthesia for spine stabilization. In the presented case, a peripherally acting muscle relaxant was applied during general anesthesia for spine stabilization surgery. The patient was an 11-year-old, 26 kg, male, mixed-breed dog. The spine fracture was situated at Th13-L1-L2, probably due to a car accident. The pre-anesthetic physical examination revealed no signs of underlying health issues. The dog was premedicated with midazolam 0.2 mg IM and butorphanol 2.4 mg IM. General anesthesia was induced with propofol IV. After induction, the dog was intubated with an endotracheal tube, connected to an open-ended rebreathing system, and maintained under inhalation anesthesia with isoflurane in oxygen. Rocuronium at 0.5 mg/kg was given IV. The use of the muscle relaxant was accompanied by an assessment of the degree of neuromuscular blockade with a peripheral nerve stimulator. Electrodes were attached to the skin overlying the peroneal nerve at the craniolateral tibia. Four electrical pulses were applied to the nerve over a 2-second period. When a satisfactory nerve block was detected, the dog was prepared for surgery. No further monitoring of the effectiveness of the blockade was performed during surgery. Mechanical ventilation was maintained throughout anesthesia. During surgery, the dog remained stable, and no anesthetic complications occurred. Intraoperatively, the surgeon stated that the neuromuscular blockade resulted in a better approach to the spine and easier muscle manipulation, which was helpful for visualizing the fracture and replacing bone fragments. Finally, euthanasia was performed intraoperatively as a result of an extensive myelomalacia process in the spinal cord. This prevented examination of the recovery process. Neuromuscular blocking agents act at the neuromuscular junction to provide profound muscle relaxation throughout the body.
Muscle blocking agents are neither anesthetic nor analgesic; used inappropriately, they may therefore cause paralysis in a fully conscious patient who feels pain. They cause paralysis of all skeletal muscles, including the diaphragm and intercostal muscles, when given in higher doses. Intraoperative management includes maintaining stable physiological conditions, which involves adjusting hemodynamic parameters, ensuring proper ventilation, avoiding variations in temperature, and maintaining normal blood flow to promote proper oxygen exchange. Neuromuscular blocking agents can cause many side effects, such as residual paralysis, anaphylactic or anaphylactoid reactions, delayed recovery from anesthesia, histamine release, and recurarization. Therefore, a reversal drug such as neostigmine (with glycopyrrolate) or edrophonium (with atropine) should be used in a life-threatening situation. Another useful drug is sugammadex, although its cost strongly limits its use. Muscle relaxants improve surgical conditions during spinal surgery, especially in heavily muscled individuals. They are also used to facilitate the replacement of dislocated joints, as they improve conditions during fracture reduction. It is important to emphasize that in patients with muscle weakness, neuromuscular blocking agents may result in intraoperative and early postoperative cardiovascular and respiratory complications, as well as prolonged recovery from anesthesia. This should not occur in patients with a recent spine fracture or luxation. Therefore, it is believed that neuromuscular blockers can be useful during spine stabilization procedures.

Keywords: anesthesia, dog, neuromuscular block, spine surgery

Procedia PDF Downloads 181
4459 Characterization of Crustin from Litopenaeus vannamei

Authors: Suchao Donpudsa, Anchalee Tassanakajon, Vichien Rimphanitchayakit

Abstract:

A crustin gene, LV-SWD1, previously found in the hemocyte cDNA library of Litopenaeus vannamei, contains an open reading frame of 288 bp encoding a putative protein of 96 amino acid residues. The putative signal peptide of LV-SWD1 was identified using the online SignalP 3.0 tool, with a predicted cleavage site between Ala24 and Val25, resulting in a 72-residue mature protein with a calculated molecular mass of 7.4 kDa and a predicted pI of 8.5. This crustin contains an Arg-Pro-rich region at the amino terminus and a single whey acidic protein (WAP) domain at the carboxyl terminus. In order to characterize its properties and biological activities, the recombinant crustin protein was produced in an Escherichia coli expression system. Antimicrobial assays showed that the growth of Bacillus subtilis was inhibited by this recombinant crustin, with a MIC of about 25-50 µM.

Keywords: crustin, single whey acidic protein, Litopenaeus vannamei, antimicrobial activity

Procedia PDF Downloads 244
4458 The Boundary Element Method in Excel for Teaching Vector Calculus and Simulation

Authors: Stephen Kirkup

Abstract:

This paper discusses the implementation of the boundary element method (BEM) on an Excel spreadsheet and how it can be used in teaching vector calculus and simulation. There are two separate spreadsheets, within which the Laplace equation is solved by the BEM in two dimensions (LIBEM2) and in axisymmetric three dimensions (LBEMA). The main algorithms are implemented in Excel's associated programming language, Visual Basic for Applications (VBA). The BEM only requires a boundary mesh and hence is a relatively accessible method. The BEM in the open spreadsheet environment is shown to be useful as an aid to teaching and learning. The application of the BEM implemented on a spreadsheet for educational purposes in introductory vector calculus and simulation is explored. The development of assignment work is discussed, and sample results from student work are given. The spreadsheets were found to be useful tools in developing the students' understanding of vector calculus and in simulating heat conduction.
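Both spreadsheets rest on the free-space Green's function for the Laplace equation, which is also where the vector-calculus content (gradients, normal derivatives) enters. A small Python check of the 2D kernel and its normal derivative follows; the spreadsheets themselves implement this in VBA, so this is only a language-neutral sketch of the kernels.

```python
import math

# 2D Laplace fundamental solution G(p, q) = -ln|p - q| / (2*pi),
# and its derivative in the direction of a unit normal n at q:
# dG/dn_q = ((p - q) . n) / (2*pi*|p - q|^2).

def green_2d(p, q):
    r = math.hypot(p[0] - q[0], p[1] - q[1])
    return -math.log(r) / (2.0 * math.pi)

def dgreen_dn(p, q, n):
    dx, dy = p[0] - q[0], p[1] - q[1]
    r2 = dx * dx + dy * dy
    return (dx * n[0] + dy * n[1]) / (2.0 * math.pi * r2)

# At unit separation the log term vanishes.
print(green_2d((1.0, 0.0), (0.0, 0.0)))  # → -0.0
```

A BEM code integrates these two kernels over each boundary element to assemble the influence matrices, which is why only a boundary mesh is needed.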

Keywords: boundary element method, Laplace’s equation, vector calculus, simulation, education

Procedia PDF Downloads 163
4457 Approach-Avoidance Conflict in the T-Maze: Behavioral Validation for Frontal EEG Activity Asymmetries

Authors: Eva Masson, Andrea Kübler

Abstract:

Anxiety disorders (AD) are the most prevalent psychological disorders. However, far from all affected individuals are diagnosed and receive treatment. This gap is probably due to the diagnostic criteria, which rely on symptoms (according to the DSM-5 definition) with no objective biomarker. Approach-avoidance conflict tasks are one common approach to simulating such disorders in a lab setting, with most paradigms focusing on the relationships between behavior and neurophysiology. Approach-avoidance conflict tasks typically place participants in a situation where they have to make a decision that leads to both positive and negative outcomes, thereby sending conflicting signals that trigger the Behavioral Inhibition System (BIS). Furthermore, behavioral validation of such paradigms adds credibility to the tasks: with overt conflict behavior, it is safer to assume that the task actually induced a conflict. Some of these tasks have linked asymmetrical frontal brain activity to induced conflicts and the BIS. However, there is currently no consensus on the direction of the frontal activation. The authors present here a modified version of the T-maze paradigm, a motivational conflict desktop task in which behavior is recorded simultaneously with high-density EEG (HD-EEG). Methods: In this within-subject design, the HD-EEG and behavior of 35 healthy participants were recorded. EEG data were collected with a 128-channel sponge-based system. The motivational conflict desktop task consisted of three blocks of repeated trials. Each block was designed to record a slightly different behavioral pattern, to increase the chances of eliciting conflict. These behavioral patterns were, however, similar enough to allow comparison of the number of trials categorized as 'overt conflict' between the blocks. Results: Overt conflict behavior was exhibited in all blocks, but always in under 10% of the trials, on average, in each block.
However, changing the order of the paradigms successfully introduced a 'reset' of the conflict process, thereby providing more trials for analysis. As for the EEG correlates, the authors expect a different pattern for trials categorized as conflict compared to the other trials. More specifically, we expect elevated alpha frequency power in the left frontal electrodes at around 200 ms post-cueing compared to the right (i.e., relatively higher right frontal activity), followed by an inversion around 600 ms later. Conclusion: With this comprehensive approach to a psychological mechanism, new evidence would be brought to the frontal asymmetry discussion and its relationship with the BIS. Furthermore, with the present task focusing on a very particular type of motivational approach-avoidance conflict, it would open the door to further variations of the paradigm to introduce the different kinds of conflicts involved in AD. Even though its application as a potential biomarker appears difficult because of the limited individual reliability of both the task and peak frequency in the alpha range, we hope to open the discussion on task robustness for future neuromodulation and neurofeedback applications.
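The asymmetry measure at stake can be written down compactly. The sketch below uses the conventional ln(right) - ln(left) frontal alpha-power index from this literature, with illustrative power values that are assumptions, not data from the study; because alpha power is inversely related to cortical activation, a positive index indicates relatively greater left frontal activity.

```python
import math

# Conventional frontal alpha asymmetry index:
# FAA = ln(alpha power, right frontal) - ln(alpha power, left frontal).
# Positive -> relatively greater LEFT frontal activation (alpha is
# inversely related to activation); negative -> greater right activation.

def frontal_asymmetry(alpha_left, alpha_right):
    return math.log(alpha_right) - math.log(alpha_left)

# Illustrative band-power values (arbitrary units), not study data:
print(frontal_asymmetry(4.2, 5.1) > 0)  # → True: greater left activation
```

Per-trial indices of this kind, time-locked to the cue, are what would distinguish the expected ~200 ms conflict pattern from its later inversion.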

Keywords: anxiety, approach-avoidance conflict, behavioral inhibition system, EEG

Procedia PDF Downloads 38
4456 Urban Planning Compilation Problems in China and the Corresponding Optimization Ideas under the Vision of the Hyper-Cycle Theory

Authors: Hong Dongchen, Chen Qiuxiao, Wu Shuang

Abstract:

Systematic science reveals the complex nonlinear mechanisms of behaviour in urban systems. However, in China, when current city planners face such a system, most of them still apply simple linear thinking to what is an open, complex giant system. Based on an analysis of the reasons why current urban planning has failed, this paper introduces the hyper-cycle theory, one of the foundational theories of systematic science, and proposes optimisation ideas for how urban planning compilation should change: from controlling quantities to managing relationships, from blueprint planning to progressive planning based on nonlinear characteristics, and from management control to dynamic monitoring and feedback.

Keywords: systematic science, hyper-cycle theory, urban planning, urban management

Procedia PDF Downloads 406
4455 Hand Gestures Based Emotion Identification Using Flex Sensors

Authors: S. Ali, R. Yunus, A. Arif, Y. Ayaz, M. Baber Sial, R. Asif, N. Naseer, M. Jawad Khan

Abstract:

In this study, we have proposed a gesture-to-emotion recognition method using flex sensors mounted on the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove. The data from the glove are sent to a PC over Wi-Fi. Four gestures (finger pointing, thumbs up, fist open and fist close) are performed by five subjects. Each gesture is categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand movement. Seventeen inspectors observed the emotions and hand gestures of the five subjects. The emotional state based on the inspectors' assessment is compared with the acquired movement-speed data. Overall, we achieved 77% accurate results. Therefore, the proposed design can be used for emotional-state detection applications.
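The kinematic categorization described above can be sketched as a simple rule-based classifier. The thresholds below are purely illustrative, since the abstract does not report the cut-off values used in the study:

```python
def classify_emotion(velocity, acceleration,
                     v_slow=0.2, v_fast=0.8, a_fast=1.5):
    """Map hand-movement kinematics to an emotion class.
    Thresholds (m/s, m/s^2) are illustrative, not from the study."""
    if velocity < v_slow:
        return "sad"          # slow, low-energy movement
    if velocity > v_fast and acceleration > a_fast:
        return "excited"      # fast and abrupt movement
    return "happy"            # moderate movement

print(classify_emotion(0.1, 0.05))  # slow gesture -> sad
print(classify_emotion(1.2, 2.0))   # fast, abrupt gesture -> excited
```

The reported 77% accuracy would then correspond to agreement between such an automatic categorization and the inspectors' labels.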

Keywords: emotion identification, emotion models, gesture recognition, user perception

Procedia PDF Downloads 285
4454 Comparison of On-Site Stormwater Detention Real Performance and Theoretical Simulations

Authors: Pedro P. Drumond, Priscilla M. Moura, Marcia M. L. P. Coelho

Abstract:

The purpose of an On-site Stormwater Detention (OSD) system is to detain the additional stormwater runoff caused by impervious areas, in order to keep the peak flow the same as in the pre-urbanization condition. In recent decades, these systems have been built in many cities around the world. However, their real efficiency remains unknown due to the lack of research, especially with regard to monitoring their real performance. Thus, this study aims to compare the water-level monitoring data of an OSD built in Belo Horizonte, Brazil, with the results of simulations using the theoretical methods usually adopted in OSD design. Two theoretical simulations were made: one using the Rational Method with the Modified Puls method, and another using the Soil Conservation Service (SCS) method with the Modified Puls method. The monitoring data were obtained with a water-level sensor installed inside the reservoir and connected to a data logger. The comparison of OSD performance was made for 48 rainfall events recorded from April 2015 to March 2017. The comparison of maximum water levels in the OSD showed that the results of the Rational/Puls and SCS/Puls simulations were, on average, 33% and 73% lower, respectively, than those monitored. The Rational/Puls results were significantly higher than the SCS/Puls results only in the more frequent events. In the events with average recurrence intervals of 5, 10 and 200 years, the maximum water heights were similar in both simulations. Also, the results showed that the duration of the rainfall events was close to the duration of the monitored hydrographs. The rising and recession times of the hydrographs calculated with the Rational Method represented the monitored hydrographs better than those of the SCS Method. The comparison indicates that the real discharge coefficient could be higher than the value of 0.61 adopted in the Puls simulations. New research evaluating real OSD performance should be developed. In order to verify the peak-flow damping efficiency and the value of the discharge coefficient, it is necessary to monitor the inflow and outflow of an OSD, in addition to monitoring the water level inside it.
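The level-pool routing underlying the Modified Puls simulations can be sketched as follows, assuming a simple prismatic reservoir with an orifice outlet and the discharge coefficient of 0.61 mentioned above; the geometry and inflow hydrograph are illustrative, not those of the monitored OSD.

```python
import math

def route_modified_puls(inflow, dt, area, cd, a_orifice, g=9.81):
    """Level-pool (Modified Puls) routing through an OSD with an
    orifice outlet: Q = cd * a_orifice * sqrt(2 g h), S = area * h.
    `inflow` in m^3/s, `dt` in s, `area` (plan area) in m^2.
    Returns the outflow hydrograph and the water levels."""
    h, out, levels = 0.0, [], []
    for q_in in inflow:
        q_out = cd * a_orifice * math.sqrt(2 * g * h) if h > 0 else 0.0
        # Continuity: dS/dt = inflow - outflow, with S = area * h
        h = max(0.0, h + (q_in - q_out) * dt / area)
        out.append(q_out)
        levels.append(h)
    return out, levels

# Triangular inflow hydrograph (m^3/s) at 60 s steps
inflow = [0.0, 0.05, 0.10, 0.08, 0.04, 0.01, 0.0, 0.0]
out, levels = route_modified_puls(inflow, dt=60, area=25.0,
                                  cd=0.61, a_orifice=0.01)
print(max(out) < max(inflow))  # peak flow is attenuated
```

Comparing the simulated `levels` against a monitored water-level series is exactly the kind of check the study performs; a real discharge coefficient above 0.61 would show up as simulated levels exceeding the monitored ones.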

Keywords: best management practices, on-site stormwater detention, source control, urban drainage

Procedia PDF Downloads 188
4453 Prediction of Finned Projectile Aerodynamics Using a Lattice-Boltzmann Method CFD Solution

Authors: Zaki Abiza, Miguel Chavez, David M. Holman, Ruddy Brionnaud

Abstract:

In this paper, the prediction of the aerodynamic behavior of the flow around a finned projectile will be validated using a Computational Fluid Dynamics (CFD) solution, XFlow, based on the Lattice-Boltzmann Method (LBM). XFlow is an innovative CFD software package developed by Next Limit Dynamics. It is based on a state-of-the-art Lattice-Boltzmann Method which uses a proprietary particle-based kinetic solver and an LES turbulence model coupled with a generalized law of the wall (WMLES). The Lattice-Boltzmann method discretizes the continuous Boltzmann equation, a transport equation for the particle probability distribution function. From the Boltzmann transport equation, and by means of the Chapman-Enskog expansion, the compressible Navier-Stokes equations can be recovered. However, for simulating compressible flows, this method has a Mach number limitation because of the lattice discretization. Thanks to this flexible particle-based approach, the traditional meshing process is avoided, the discretization stage is strongly accelerated, reducing engineering costs, and computations on complex geometries are affordable in a straightforward way. The projectile used in this work is the Army-Navy Basic Finned Missile (ANF) with a caliber of 0.03 m. The analysis will consist of varying the Mach number upward from M=0.5, comparing the axial force coefficient, the normal force slope coefficient and the pitch moment slope coefficient of the finned projectile obtained by XFlow with the experimental data. The slope coefficients will be obtained using finite-difference techniques in the linear range of the polar curve. The aim of such an analysis is to find the limiting Mach number value starting from which the effects of high fluid compressibility (related to the transonic flow regime) lead the XFlow simulations to differ from the experimental results. This will allow identifying the critical Mach number which limits the validity of the isothermal formulation of XFlow, beyond which a fully compressible solver implementing coupled momentum and energy equations would be required.
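The stream-and-collide cycle at the heart of any lattice-Boltzmann solver can be sketched in a few lines. This is a generic D2Q9 BGK scheme on a periodic box, not XFlow's proprietary kinetic solver, and it omits the WMLES wall model and any compressibility treatment:

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities
W = np.array([4/9] + [1/9]*4 + [1/36]*4)
C = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
              [1,1],[-1,1],[-1,-1],[1,-1]])

def equilibrium(rho, u):
    """Second-order Maxwell-Boltzmann equilibrium distribution."""
    cu = np.einsum('qd,xyd->xyq', C, u)
    usq = np.einsum('xyd,xyd->xy', u, u)
    return rho[..., None] * W * (1 + 3*cu + 4.5*cu**2 - 1.5*usq[..., None])

def step(f, tau=0.8):
    """One BGK collision followed by streaming on a periodic lattice."""
    rho = f.sum(axis=-1)                         # macroscopic density
    u = np.einsum('xyq,qd->xyd', f, C) / rho[..., None]  # momentum / density
    f += (equilibrium(rho, u) - f) / tau         # BGK relaxation (collision)
    for q in range(9):                           # streaming along each velocity
        f[..., q] = np.roll(f[..., q], C[q], axis=(0, 1))
    return f, rho, u

nx = ny = 16
rho0 = np.ones((nx, ny))
u0 = np.zeros((nx, ny, 2))
u0[..., 0] = 0.05                                # small uniform x-velocity
f = equilibrium(rho0, u0)
for _ in range(10):
    f, rho, u = step(f)
print(np.isclose(rho.sum(), nx * ny))            # mass is conserved
```

The Mach limitation discussed above comes from this expansion: the equilibrium is only second-order accurate in velocity, so the scheme is valid for low lattice Mach numbers under the isothermal assumption.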

Keywords: CFD, computational fluid dynamics, drag, finned projectile, lattice-boltzmann method, LBM, lift, mach, pitch

Procedia PDF Downloads 421
4452 Urban Design as a Tool in Disaster Resilience and Urban Hazard Mitigation: Case of Cochin, Kerala, India

Authors: Vinu Elias Jacob, Manoj Kumar Kini

Abstract:

Disasters of all types are occurring more frequently and are becoming more costly than ever, due to various man-made factors including climate change. Better use of the concepts of governance and management within disaster risk reduction is therefore of utmost importance. There is a need to explore the role of pre- and post-disaster public policies. The role of urban planning/design in shaping the opportunities of households, individuals and, collectively, settlements for achieving recovery has to be explored, as do governance strategies that can better support the integration of disaster risk reduction and management. The main aim is thereby to build the resilience of individuals and communities, and thus of the states too. Resilience is a term usually linked to the fields of disaster management and mitigation, but it has today become an integral part of the planning and design of cities. Disaster resilience broadly describes the ability of an individual or community to 'bounce back' from disaster impacts through improved mitigation, preparedness, response, and recovery. The growing population of the world has increased the inflow and use of resources, creating pressure on natural systems and inequity in the distribution of resources. This makes cities vulnerable to multiple attacks by both natural and man-made disasters. Each urban area needs elaborate studies, and strategies based on those studies, to proceed in the discussed direction. Cochin, in Kerala, is the state's fastest-growing and largest city, with a population of more than 26 lakh. The main concern addressed in this paper is making cities resilient by designing a framework of strategies based on urban design principles for an immediate response system, focusing especially on the city of Cochin, Kerala, India. The paper discusses the spatial transformations due to disasters and the role of spatial planning in the context of significant disasters. The paper also aims to develop a model, taking into consideration various factors such as land use, open spaces, transportation networks, physical and social infrastructure, building design, density and ecology, that can be implemented in any city in any context. Using the tool of urban design, guidelines are made for the smooth evacuation of people through hassle-free transport networks, protecting vulnerable areas in the city, providing adequate open spaces for shelters and gatherings, making basic amenities available to the affected population within reachable distance, etc. Strategies at the city level and the neighbourhood level have been developed, with inferences from vulnerability analysis and case studies.

Keywords: disaster management, resilience, spatial planning, spatial transformations

Procedia PDF Downloads 296
4451 eTransformation Framework for the Cognitive Systems

Authors: Ana Hol

Abstract:

Digital systems are in the cognitive wave of eTransformations and are now extensively aimed at meeting individuals' demands, both those of customers requiring services and those of service providers. It is also apparent that successful future systems will not simply open doors for traditional owners/users to offer and receive services, as Uber, for example, does today, but will require more customized and cognitively enabled infrastructures that are responsive to the system user's needs. To identify what is required for such systems, this research reviews the historical and current effects of the eTransformation process by studying: 1. eTransitions of company websites and mobile applications; 2. the emergence of new shared-economy business models such as Uber; and 3. new requirements for demand-driven, cognitive systems capable of learning and just-in-time decision making. Based on the analysis, this study proposes a Cognitive eTransformation Framework capable of guiding implementations of new responsive and user-aware systems.

Keywords: system implementations, AI supported systems, cognitive systems, eTransformation

Procedia PDF Downloads 238
4450 Active Filtration of Phosphorus in Ca-Rich Hydrated Oil Shale Ash Filters: The Effect of Organic Loading and Form of Precipitated Phosphatic Material

Authors: Päärn Paiste, Margit Kõiv, Riho Mõtlep, Kalle Kirsimäe

Abstract:

For small-scale wastewater management, treatment wetlands (TWs) can be used as a low-cost alternative to conventional treatment facilities. However, the P removal capacity of TW systems is usually problematic. P removal in TWs depends mainly on the physico-chemical and hydrological properties of the filter material. The highest P removal efficiency has been shown through Ca-phosphate precipitation (i.e. active filtration) in Ca-rich alkaline filter materials, e.g. industrial by-products such as hydrated oil shale ash (HOSA) and metallurgical slags. In this contribution we report preliminary results of a full-scale TW system using HOSA material for P removal from municipal wastewater at the Nõo site, Estonia. The main goals of this ongoing project are to evaluate: a) the long-term P removal efficiency of HOSA using real wastewater; b) the effect of a high organic loading rate; c) the effects of variable P loading on the P removal mechanism (adsorption versus direct precipitation); and d) the form and composition of the phosphate precipitates. An on-site full-scale experiment with two concurrent filter systems for the treatment of municipal wastewater was established in September 2013. The system's pretreatment steps include a septic tank (2 m²) and vertical down-flow LECA filters (3 m² each), followed by horizontal subsurface HOSA filters (effective volume 8 m³ each). The overall organic and hydraulic loading rates of both systems are the same. However, the first system is operated in a stable hydraulic loading regime and the second in a variable loading regime that imitates the wastewater production of an average household. Piezometers for water sampling and perforated sample containers for filter material sampling were incorporated inside the filter beds to allow continuous in-situ monitoring. During the 18 months of operation, the median removal efficiency (inflow to outflow) of both systems was over 99% for TP, 93% for COD and 57% for TN. However, we observed significant differences between samples collected at different points inside the filter systems. In both systems, we observed the development of preferential flow paths and zones with high and low loadings. The filters show the formation and gradual advance of a 'dead' zone along the flow path (a zone of saturated filter material characterized by ineffective removal rates), which develops more rapidly in the system working under the variable loading regime. The formation of the 'dead' zone is accompanied by the growth of organic substances on the filter material particles, which evidently inhibits P removal. Phase analysis of the used filter materials by X-ray diffraction reveals the formation of minor amounts of amorphous Ca-phosphate precipitates. This finding is supported by ATR-FTIR and SEM-EDS measurements, which also reveal Ca-phosphate and authigenic carbonate precipitation. Our first experimental results demonstrate that organic pollution and the loading regime significantly affect the performance of hydrated ash filters. The material analyses also show that P is incorporated into a carbonate-substituted hydroxyapatite phase.
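The reported efficiencies follow the standard inflow-to-outflow definition, which can be sketched as below; the concentrations used here are illustrative only, chosen to reproduce efficiency values of the reported magnitudes (>99% TP, 93% COD):

```python
def removal_efficiency(c_in, c_out):
    """Percent removal from inflow to outflow concentrations (mg/L)."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical inflow/outflow concentration pairs
print(round(removal_efficiency(8.0, 0.04), 1))    # TP-like case
print(round(removal_efficiency(400.0, 28.0), 1))  # COD-like case
```

Reporting the median over sampling events, rather than the mean, keeps the figure robust against the occasional extreme event.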

Keywords: active filtration, apatite, hydrated oil shale ash, organic pollution, phosphorus

Procedia PDF Downloads 274
4449 Experimental Study on Dehumidification Performance of Supersonic Nozzle

Authors: Esam Jassim

Abstract:

Supersonic nozzles are commonly used to purify natural gas in gas processing technology. As an innovative technology, they are employed to overcome the deficits of traditional methods, drawing on gas dynamics, thermodynamics and fluid dynamics theory. An indoor test rig was built to study the dehumidification process of a moist fluid; humid air was chosen for the study. The working fluid circulated in an open loop, which had provision for filtering, metering, and humidifying. A stainless steel supersonic separator was constructed together with the converging-diverging (C-D) nozzle system. The results show that dehumidification is enhanced as the nozzle pressure ratio (NPR) increases. This is due to the high turbulence intensity caused by the shock formation in the divergent section. Such disturbance strengthens the centrifugal force, pushing more particles toward the near-wall region. In turn, the pressure recovery factor, defined as the ratio of the outlet static pressure of the fluid to its inlet value, decreases with NPR.

Keywords: supersonic nozzle, dehumidification, particle separation, nozzle geometry

Procedia PDF Downloads 339
4448 Global Analysis in a Growth Economic Model with Perfect-Substitution Technologies

Authors: Paolo Russu

Abstract:

The purpose of the present paper is to highlight some features of an economic growth model with negative environmental externalities, which gives rise to a three-dimensional dynamic system. In particular, we show that the economy, which is based on a perfect-substitution production technology, exhibits neither indeterminacy nor a poverty trap. This implies that the equilibrium selected by the economy depends on its history (the initial values of the state variables) rather than on the expectations of economic agents. Moreover, we prove that the basins of attraction of locally stable equilibrium points may be very large, as they can extend up to the boundary of the system's phase space. The infinite-horizon optimal control problem maximizes the representative agent's instantaneous utility function, which depends on leisure and consumption.

Keywords: Hopf bifurcation, open-access natural resources, optimal control, perfect-substitution technologies, Poincaré compactification

Procedia PDF Downloads 172
4447 A Formal Verification Approach for Linux Kernel Designing

Authors: Zi Wang, Xinlei He, Jianghua Lv, Yuqing Lan

Abstract:

The Kernel, though widely used, is complicated, and errors caused by bugs are often costly. Statistically, more than half of such mistakes occur in the design phase. Thus, we introduce a modeling method, KMVM (Linux Kernel Modeling and Verification Method), based on type theory, for the proper design and correct use of the Kernel. In the model, the Kernel is separated into six levels: subsystem, dentry, file, struct, func, and base. Each level is treated as a type. The types are specified by their structure and relationships. At the same time, we use a demanding path to express the function to be implemented. The correctness of a design is verified by recursively checking the type relationships and type existence. The method has been applied to verify the OPEN operation of the VFS (virtual file system) in the Linux Kernel. We have also designed and developed a set of secure communication mechanisms in the Kernel with verification.
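As a rough illustration of recursive type-relationship checking, the sketch below types each node of a design by its level and accepts the design only if every node's children sit at the immediately lower level. The level ordering and validity rule here are our assumptions for illustration, not the authors' formal calculus:

```python
# KMVM-style levels, ordered from highest to lowest (base is atomic)
LEVELS = ["subsystem", "dentry", "file", "struct", "func", "base"]

def check(node):
    """Recursively verify that every child is one level below its parent
    and that only 'base' nodes are leaves without children."""
    level = node["level"]
    children = node.get("children", [])
    idx = LEVELS.index(level)
    if level == "base":
        return not children          # base is atomic: no children allowed
    return all(c["level"] == LEVELS[idx + 1] and check(c) for c in children)

good = {"level": "func", "children": [{"level": "base"}]}
bad = {"level": "func", "children": [{"level": "struct"}]}  # wrong direction
print(check(good), check(bad))
```

A real verifier would also check type existence against the modeled Kernel structures, not just the level ordering.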

Keywords: formal approach, type theory, Linux Kernel, software program

Procedia PDF Downloads 137
4446 Barriers to Entry: The Pitfall of Charter School Accountability

Authors: Ian Kingsbury

Abstract:

The rapid expansion of charter schools (public schools that receive government funding but do not face the same regulations as traditional public schools) over the preceding two decades has raised concerns about the potential for graft and fraud. These concerns are largely justified: incidents of financial crime and mismanagement are not unheard of, and the charter sector has become a darling of hedge fund managers. In response, several states have strengthened their charter school regulatory regimes. Imposing regulations and attempting to increase accountability seem like sensible measures, and perhaps they are necessary. However, increased regulation may come at the cost of imposing barriers to entry. Specifically, increased regulation often demands evidence of a high likelihood of fiscal solvency, which theoretically entails access to capital in the short term and may therefore systematically preclude Black or Hispanic applicants from opening charter schools. Moreover, increased regulation necessarily entails more red tape. The institutional wherewithal and the number of hours required to complete an application to open a charter school might favor those who have partnered with an education service provider, specifically a charter management organization (CMO) or education management organization (EMO). These potential barriers to entry pose a significant policy concern. Just as policymakers hope to increase the share of minority teachers and principals, they should sensibly care whether the individuals who open charter schools look like the students in those schools. Moreover, they might be concerned if successful applications in states with stringent regulations are overwhelmingly affiliated with education service providers. One of the original missions of charter schools was to serve as laboratories of innovation. 
Approving only those applications affiliated with education service providers, in effect establishing a parallel network of schools rather than a diverse marketplace of schools, undermines that mission. Data and methods: The analysis examines more than 2,000 charter school applications from 15 states. It compares the outcomes of applications from states with a strong regulatory environment (those with high scores from NACSA, the National Association of Charter School Authorizers) to applications from states with a weak regulatory environment (those with low NACSA scores). If the hypothesis is correct, applicants not affiliated with an education service provider (ESP) are more likely to be rejected in high-regulation states than those affiliated with one, and minority candidates not affiliated with an ESP are particularly likely to be rejected. Initial returns indicate that the hypothesis holds. More applications in low-NACSA-scoring Arizona come from individuals not associated with an ESP, and those individuals are as likely to be accepted as those affiliated with an ESP. On the other hand, applicants in high-NACSA-scoring Indiana and Ohio are more than 20 percentage points more likely to be accepted if they are affiliated with an ESP, and the effect is particularly pronounced for minority candidates. These findings should spur policymakers to consider the drawbacks of charter school accountability and to consider accountability regimes that do not impose barriers to entry.

Keywords: accountability, barriers to entry, charter schools, choice

Procedia PDF Downloads 159
4445 Modelling Railway Noise Over Large Areas, Assisted by GIS

Authors: Conrad Weber

Abstract:

The modelling of railway noise over large project areas can be very time-consuming, in terms of both preparing the noise models and calculation time. An open-source GIS program was utilised to assist with the modelling of operational noise levels for 675 km of railway corridor. A range of GIS algorithms was utilised to break the noise model area up into manageable calculation sizes. GIS was utilised to prepare and filter a range of noise modelling inputs, including building files, land uses and ground terrain. A spreadsheet was utilised to manage the accuracy of key input parameters, including train speeds, train types, curve corrections, bridge corrections and engine notch settings. GIS was also utilised to present the final noise modelling results. This paper explains the noise modelling process and how the spreadsheet and GIS were utilised to model this massive project accurately and efficiently.
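Breaking a long corridor into calculation tiles is the kind of step a GIS script handles well. A minimal sketch follows, with the tile size and overlap as illustrative assumptions rather than the project's values; the overlap ensures receivers near a tile edge still see all relevant noise sources:

```python
def make_tiles(xmin, ymin, xmax, ymax, size, overlap=0.0):
    """Return bounding boxes (xmin, ymin, xmax, ymax) of size x size
    tiles covering the study area, each grown by `overlap` metres."""
    tiles = []
    y = ymin
    while y < ymax:
        x = xmin
        while x < xmax:
            tiles.append((x - overlap, y - overlap,
                          min(x + size, xmax) + overlap,
                          min(y + size, ymax) + overlap))
            x += size
        y += size
    return tiles

# A 10 km x 2 km corridor section split into 2 km tiles with 500 m overlap
tiles = make_tiles(0, 0, 10_000, 2_000, size=2_000, overlap=500)
print(len(tiles))
```

Each tile's bounding box can then be used to clip the building, terrain and land-use layers before handing that subset to the noise calculation engine.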

Keywords: noise, modeling, GIS, rail

Procedia PDF Downloads 122