Search results for: task cycles
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2889

459 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks

Authors: Van Trieu, Shouhuai Xu, Yusheng Feng

Abstract:

Tracking attack trajectories is difficult when only limited information about the nature of the attack is available. It is more difficult still when attack information is collected by Intrusion Detection Systems (IDSs), because current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events; they do not show how the events relate to each other or which event may have caused another to happen. It is therefore important to investigate new methods that can track attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect which attack events are most likely to cause other events in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools such as IDSs, and would cost expert human analysts significant time, if they could be discovered at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends. In fact, more than 85% of causal pairs have an average time difference between the causal and effect events, in both computed and observed data, of within 5 minutes. This result can be used as a preventive measure against future attacks. Although the forecast window is short, from 0.24 seconds to 5 minutes, it is long enough to be used to design a prevention protocol that blocks those attacks.
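The abstract's pairing of attack events by time difference can be illustrated with a minimal sketch. The function below is not the paper's conditional-independence framework; it is a simpler co-occurrence heuristic (the 5-minute window, the support threshold, and the event names are all illustrative) that flags ordered event pairs whose occurrences repeatedly fall within the reported time window:

```python
from collections import defaultdict

def candidate_causal_pairs(events, window=300.0, min_support=3):
    """Flag ordered pairs (a, b) where type-b events repeatedly occur
    within `window` seconds after type-a events.

    events: list of (timestamp_seconds, event_type), sorted by time.
    Returns {(a, b): mean_lag_seconds} for pairs with >= min_support links.
    """
    lags = defaultdict(list)
    for i, (t_a, a) in enumerate(events):
        for t_b, b in events[i + 1:]:
            if t_b - t_a > window:
                break  # events are sorted, so later ones are farther away
            if b != a:
                lags[(a, b)].append(t_b - t_a)
    return {pair: sum(ls) / len(ls)
            for pair, ls in lags.items() if len(ls) >= min_support}

# Hypothetical trace: port scans repeatedly followed by exploit attempts.
trace = sorted([(t, "scan") for t in (0, 100, 200, 300)] +
               [(t + 30, "exploit") for t in (0, 100, 200, 300)] +
               [(5000, "benign")])
pairs = candidate_causal_pairs(trace)
print(pairs)
```

A real analysis along the paper's lines would additionally run conditional independence tests on the extracted features to prune spurious or symmetric pairs; this sketch only shows the windowed-pairing step.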

Keywords: causality, multilevel graph, cyber-attacks, prediction

Procedia PDF Downloads 156
458 Learning, Teaching and Assessing Students’ ESP Skills via Exe and Hot Potatoes Software Programs

Authors: Naira Poghosyan

Abstract:

In the knowledge society, the content of studies, the methods used and the requirements for an educator's professionalism regularly undergo change. It follows that in the knowledge society the aim of education is not only to educate professionals for a certain field but also to help students become aware of cultural values, form mutual human relationships, collaborate, be open, adapt to new situations, creatively express their ideas, and accept responsibility and challenge. From this viewpoint, the development of communicative language competence requires a thoroughly coordinated approach to ensure proper comprehension and memorization of subject-specific words, starting from high school level. On the other hand, ESP (English for Specific Purposes) teachers and practitioners are increasingly faced with the task of developing and exploiting new ways of assessing their learners' literacy while learning and teaching ESP. The presentation will highlight the latest achievements in this field. The author will present practical methodological issues and principles associated with learning, teaching and assessing learners' ESP skills using two software programs, eXe 2.0 and Hot Potatoes 6. On the one hand, the author will display the advantages of the two programs as self-learning and self-assessment interactive tools in the course of academic study and professional development of CLIL learners; on the other hand, she will comprehensively shed light on methodological aspects of working out appropriate ways of selecting, introducing and consolidating subject-specific materials via eXe 2.0 and Hot Potatoes 6. The author will then distinguish ESP courses by the general nature of the learners' specialty, identifying three large categories: EST (English for Science and Technology), EBE (English for Business and Economics) and ESS (English for the Social Sciences). 
The cornerstone of the presentation will be the introduction of the subject entitled "The Methodology of Teaching ESP in Non-Linguistic Institutions", where a unique case of teaching ESP in Architecture and Construction via eXe 2.0 and Hot Potatoes 6 will be introduced, exemplifying how introduction, consolidation and assessment can be used as a basis for feedback to ESP learners in a particular professional field.

Keywords: ESP competences, ESP skill assessment/ self-assessment tool, eXe 2.0 / HotPotatoes software program, ESP teaching strategies and techniques

Procedia PDF Downloads 378
457 The SHIFT of Consumer Behavior from Fast Fashion to Slow Fashion: A Review and Research Agenda

Authors: Priya Nangia, Sanchita Bansal

Abstract:

As fashion cycles become more rapid, some segments of the fashion industry have adopted increasingly unsustainable production processes to keep up with demand and enhance profit margins. The growing threat to environmental and social wellbeing posed by unethical fast fashion practices, and the need to integrate the targets of the SDGs into this industry, necessitate a shift in the fashion industry's unsustainable nature, which can only be accomplished in the long run if consumers support sustainable fashion by purchasing it. Fast fashion is defined as low-cost, trendy apparel that takes inspiration from the catwalk or celebrity culture and rapidly transforms it into garments at high-street stores to meet consumer demand. Given the importance of identity formation to many consumers, the desire to be "fashionable" often outweighs the desire to be ethical or sustainable. This paradox exemplifies the tension between the human drive to consume and the will to do so in moderation. Previous research suggests that there is an attitude-behavior gap when it comes to determining consumer purchasing behavior, but to the best of our knowledge, no study has analysed how to encourage customers to shift from fast to slow fashion. Against this backdrop, the aim of this study is twofold: first, to identify and examine the factors that impact consumers' decisions to engage in sustainable fashion, and second, to develop a comprehensive framework that conceptualizes sustainable consumer behavior and encourages researchers and practitioners to foster it. This study used a systematic approach to collect data and analyse the literature. The approach included three key steps: review planning, review execution, and findings reporting. The authors identified the keywords "sustainable consumption" and "sustainable fashion" and retrieved studies from the Web of Science (WoS) (126 records) and the Scopus database (449 records). 
To make the study more specific, the authors refined the subject area to management, business, and economics in the second step, retrieving 265 records. In the third step, the authors removed duplicate records and manually reviewed the articles to examine their relevance to the research issue. The final 96 research articles were used to develop this study's systematic scheme. The findings indicate that societal norms, demographics, positive emotions, self-efficacy, and awareness all have an effect on customers' decisions to purchase sustainable apparel. The authors propose a framework, denoted by the acronym SHIFT, in which consumers are more likely to engage in sustainable behaviors when the message or context leverages the following factors: (S) social influence, (H) habit formation, (I) individual self, (F) feelings, emotions, and cognition, and (T) tangibility. Furthermore, the authors identify five broad challenges to encouraging sustainable consumer behavior and use them to develop novel propositions. Finally, the authors discuss how the SHIFT framework can be used in practice to drive sustainable consumer behaviors. This research sought to define the boundaries of existing research while also providing new perspectives on future research, with the goal of being useful for the development and discovery of new fields of study, thereby expanding knowledge.

Keywords: consumer behavior, fast fashion, sustainable consumption, sustainable fashion, systematic literature review

Procedia PDF Downloads 90
456 Map UI Design of IoT Application Based on Passenger Evacuation Behaviors in Underground Station

Authors: Meng-Cong Zheng

Abstract:

When a public space faces an emergency, quickly establishing spatial cognition and reaching emergency shelter in a closed underground space is an urgent task. This study takes Taipei Station as the research base and aims to apply an Internet of Things (IoT) application to underground evacuation mobility design. The first experiment identified passengers' evacuation behaviors and spatial cognition in underground spaces through wayfinding tasks and thinking aloud, then defined the design conditions of the User Interface (UI) and proposed the UI design. The second experiment evaluated the UI design against passengers' evacuation behaviors using the same wayfinding and think-aloud tasks as the first experiment. The first experiment found that the design condition the subjects were most concerned about was the "map": they hoped to learn their position relative to other landmarks from the map and to view the overall route. "Position" needs to be labeled accurately so users can determine their location in the underground space. Each step of the escape instructions should be presented clearly in the "navigation bar", and the "message bar" should announce the next or final target exit. In the second experiment with the UI design, we found that a "spatial map" distinguishing walking from non-walking areas with shades of color is useful. The addition of 2.5D maps to the UI design increased users' perception of the space. Amending the color of the corner diagram in the "escape route" also reduced confusion between that symbol and other diagrams. The larger volumes of toilets and elevators can help users judge their relative location among the "hardware facilities". The fire extinguisher icon should be highlighted. "Fire point tips" indicating fire with a graphical fireball can convey precise information to evacuees. 
However, "Compass and return to present location" are less used in underground space.

Keywords: evacuation behaviors, IoT application, map UI design, underground station

Procedia PDF Downloads 207
455 Degradation of Commercial Polychlorinated Biphenyl Mixture by Naturally Occurring Facultative Microorganisms via Anaerobic Dechlorination and Aerobic Oxidation

Authors: P. M. G. Pathiraja, P. Egodawatta, A. Goonetilleke, V. S. J. Te'o

Abstract:

The production and use of polychlorinated biphenyls (PCBs), a group of synthetic halogenated hydrocarbons, have been restricted worldwide due to their toxicity, and PCBs are categorized as one of the twelve priority persistent organic pollutants (POPs) by the Stockholm Convention. The low reactivity and high chemical stability of PCBs have made them highly persistent in the environment, and bio-concentration and bio-magnification along the food chain contribute to multiple health impacts in humans and animals. Remediating environments contaminated with PCBs has been a challenging task for decades. The use of microorganisms for the remediation of PCB-contaminated soils and sediments has been widely investigated due to their potential to break down these complex contaminants with minimal environmental impact. To achieve effective bioremediation of PCB-contaminated environments, microbes were sourced from environmental samples and tested for their ability to hydrolyze PCBs under different conditions. The PCB degradation efficiencies of four naturally occurring facultative bacterial cultures, isolated through selective enrichment under aerobic and anaerobic conditions, were compared simultaneously in minimal salt medium using 50 mg/L Aroclor 1260, a commonly used commercial PCB mixture, as the sole source of carbon. The results of a six-week study demonstrated that all the tested facultative Achromobacter, Ochrobactrum, Lysinibacillus and Pseudomonas strains are capable of degrading PCBs under both anaerobic and aerobic conditions, while helping to solubilize the hydrophobic PCBs in the aqueous minimal medium. Overall, the results suggest that some facultative bacteria are effective in degrading PCBs under anaerobic conditions through reductive dechlorination and under aerobic conditions through oxidation. 
Therefore, the use of suitable facultative microorganisms under combined anaerobic-aerobic conditions, and combinations of such strains capable of both solubilizing and breaking down PCBs, has high potential for achieving higher PCB removal rates.

Keywords: bioremediation, combined anaerobic-aerobic degradation, facultative microorganisms, polychlorinated biphenyls

Procedia PDF Downloads 241
454 3D Codes for Unsteady Interaction Problems of Continuous Mechanics in Euler Variables

Authors: M. Abuziarov

Abstract:

The designed software complex is intended for the numerical simulation of fast dynamic processes of interaction between heterogeneous media subject to significant deformation. The main challenges in solving such problems are associated with the construction of the numerical meshes. Currently, there are two basic approaches. One uses a Lagrangian or Lagrangian-Eulerian grid tied to the boundaries of the media; the other uses a fixed Eulerian mesh whose boundary cells are cut by the boundaries of the media, requiring the calculation of these cut volumes. Both approaches require complex grid generators and significant time for preparing the code's data for simulation. In these codes, both problems are solved using two grids: a regular fixed grid, and mobile local Eulerian-Lagrangian grids (the ALE approach) that follow the contact and free boundaries, the surfaces of shock waves and phase transitions, and other possible features of the solutions, with mutual interpolation of the integrated parameters. For modeling liquids, gases and deformable solids alike, a Godunov scheme of increased accuracy is used in Lagrangian-Eulerian variables, the same for the Euler equations and for the Euler-Cauchy equations describing the deformation of the solid. The increased accuracy of the scheme is achieved by using a 3D space-time dependent solution of the discontinuity problem (a 3D space-time dependent Riemann problem solver). The same solution is used to calculate the interaction at the liquid-solid surface (the fluid-structure interaction problem). The codes do not require complex 3D mesh generators: the user supplies only the surfaces of the objects to be calculated, as STL files created with engineering graphics software, which greatly simplifies preparing the task and makes the codes convenient for the designer to use directly at the design stage. 
Results of test solutions and applications related to the generation and propagation of detonation and shock waves and the loading of structures are presented.
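The Godunov idea described above, solving a local Riemann problem at every cell interface, is easiest to see in one dimension. The sketch below is not the paper's 3D increased-accuracy solver; it is the classic first-order Godunov update for linear advection, where the interface Riemann problem has an exact upwind solution:

```python
import numpy as np

def godunov_advection(u, a, dx, dt, steps):
    """First-order Godunov scheme for u_t + a u_x = 0 (a > 0), periodic domain.
    The flux at each interface is the exact solution of the local Riemann
    problem: for a > 0 the left (upwind) state is transported across."""
    u = u.copy()
    for _ in range(steps):
        flux = a * np.roll(u, 1)                     # F_{i-1/2} = a * u_{i-1}
        u += (dt / dx) * (flux - np.roll(flux, -1))  # conservative update
    return u

# With CFL number a*dt/dx = 1 the scheme transports the profile exactly:
u0 = np.zeros(10)
u0[2] = 1.0
u1 = godunov_advection(u0, a=1.0, dx=0.1, dt=0.1, steps=1)
print(u1)  # the unit pulse has moved one cell to the right
```

The paper's solver generalizes this interface-flux idea to the full 3D Euler and Euler-Cauchy systems, where the Riemann problem no longer has such a trivial solution.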

Keywords: fluid structure interaction, Riemann's solver, Euler variables, 3D codes

Procedia PDF Downloads 439
453 Detection of Autism Spectrum Disorders in Children Aged 4-6 Years by Municipal Maternal and Child Health Physicians: An Educational Intervention Study

Authors: M. Van 'T Hof, R. V. Pasma, J. T. Bailly, H. W. Hoek, W. A. Ester

Abstract:

Background: The transition into primary school can be challenging for children with an autism spectrum disorder (ASD). Due to the new demands made on children in this period, their limitations in social functioning and school achievement may become apparent more quickly. Detection of possible ASD signals mainly takes place by parents, teachers and during obligatory municipal maternal and child health centre visits. Physicians at municipal maternal and child health centres have limited education and limited instruments for detecting ASD; further education on detecting ASD is needed to optimally equip these physicians for the task. Most research aims to increase the early detection of ASD in children aged 0-3 years and shows positive results. However, there is a lack of research on educational interventions for the detection of ASD in children aged 4-6 years by municipal maternal and child health physicians. Aim: The aim of this study is to explore the effect of the online educational intervention 'Detection of ASD in children aged 4-6 years' for municipal maternal and child health physicians. This educational intervention was developed within the Reach-Aut Academic Centre for Autism ('Transitions in Education') and will be available throughout the Netherlands. Methods: Ninety-two participants will follow the educational intervention, which consists of three one-and-a-half-hour sessions offered through an online interactive classroom. The focus and content of the course were developed in collaboration with three groups of stakeholders: autism scientists, clinical practitioners (municipal maternal and child health physicians and ASD experts) and parents of children with ASD. The primary outcome measure is knowledge about ASD: signals, early detection, communication with parents and referrals. 
The secondary outcome measures are the number of ASD-related referrals, attitudes towards the mentally ill (CAMI), perceived competency in ASD knowledge and detection skills, and satisfaction with the educational intervention. Results and Conclusion: The study started in January 2016 and data collection will end in mid-2017.

Keywords: ASD, child, detection, educational intervention, physicians

Procedia PDF Downloads 293
452 Developing the Principal Change Leadership Non-Technical Competencies Scale: An Exploratory Factor Analysis

Authors: Tai Mei Kin, Omar Abdull Kareem

Abstract:

In light of globalization, educational reform has become a top priority for many countries. However, the task of leading change effectively requires a multidimensional set of competencies. Over the past two decades, the technical competencies of principal change leadership have been extensively analysed and discussed. Comparatively little research has been conducted in the Malaysian education context on non-technical competencies, popularly known as emotional intelligence, which are equally crucial for the success of change. This article provides a validation of the Principal Change Leadership Non-Technical Competencies (PCLnTC) Scale, a tool that practitioners can easily use to assess school principals' level of the change leadership non-technical competencies that facilitate change and maximize change effectiveness. The overall coherence of the PCLnTC model was constructed by incorporating three theories: a) change leadership theory, whereby leading change is the fundamental role of a leader; b) competency theory, in which leadership can be taught and learned; and c) the concept of emotional intelligence, whereby it can be developed, fostered and taught. An exploratory factor analysis (EFA) was used to determine the underlying factor structure of the PCLnTC model. Before conducting the EFA, five pilot-test steps were taken to ensure the validity and reliability of the instrument: a) review by academic colleagues; b) verification and comments from a panel; c) evaluation of questionnaire format, syntax, design and completion time; d) evaluation of item clarity; and e) assessment of internal consistency reliability. A total of 335 teachers from 12 High Performing Secondary Schools in Malaysia completed the survey. The PCLnTCS, with a six-point Likert-type scale, was subjected to Principal Components Analysis. The analysis yielded a three-factor solution, namely a) Interpersonal Sensitivity, b) Flexibility and c) Motivation, explaining a total of 74.326 per cent of the variance. 
Based on the results, implications for instrument revisions are discussed and specifications for future confirmatory factor analysis are delineated.
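The factor-extraction step reported above (principal components with a retained-factor decision) can be sketched in a few lines. This is not the authors' analysis or data: the 9-item, 3-trait dataset below is synthetic, and the eigenvalue-greater-than-one (Kaiser) retention rule is one common convention for choosing the number of factors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Likert-style responses: 335 respondents, 9 items driven by 3
# latent traits (each trait loads on 3 items), so a 3-factor structure
# should be recoverable, loosely mirroring the abstract's setup.
latent = rng.normal(size=(335, 3))
loadings = np.kron(np.eye(3), np.ones((1, 3)))   # trait k drives items 3k..3k+2
X = latent @ loadings + 0.5 * rng.normal(size=(335, 9))

corr = np.corrcoef(X, rowvar=False)              # PCA on the correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = eigvals / eigvals.sum()

n_factors = int(np.sum(eigvals > 1.0))           # Kaiser criterion
print(n_factors, round(explained[:n_factors].sum(), 3))
```

A confirmatory factor analysis, as the authors propose for future work, would instead fix this three-factor structure in advance and test its fit to fresh data.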

Keywords: exploratory factor analysis, principal change leadership non-technical competencies (PCLnTC), interpersonal sensitivity, flexibility, motivation

Procedia PDF Downloads 425
451 Simulation of Scaled Model of Tall Multistory Structure: Raft Foundation for Experimental and Numerical Dynamic Studies

Authors: Omar Qaftan

Abstract:

Earthquakes can cause tremendous loss of human life and severe damage to many civil engineering structures, especially tall buildings. Predicting the response of a multistory structure subjected to earthquake loading is a complex task that requires study by both physical and numerical modelling. In many circumstances, scale models on a shaking table may be a more economical option than comparable full-scale tests. A shaking table apparatus is a powerful tool that offers the possibility of understanding the actual behaviour of structural systems under earthquake loading. A set of scaling relations is required to predict the behaviour of the full-scale structure from the model, and selecting the scale factors is the most important step in simulating the prototype with a scaled model. In this paper, the principles of the scale-modelling procedure are explained in detail, and the simulation of a scaled multi-storey concrete structure for dynamic studies is investigated. A complete dynamic simulation analysis is investigated experimentally and numerically at a scale factor of 1/50. The frequency-domain response and lateral displacement of both the numerical and experimental scaled models are determined. The procedure accounts for the actual dynamic behaviour of both the full-size prototype structure and the scaled model, and is adapted to determine the effects of the tall multi-storey structure on a raft foundation. Four generated accelerograms complying with EC8 were used as inputs for the time-history motions. The experimental results, expressed in terms of displacements and accelerations, are compared with those obtained from a conventional fixed-base numerical model. The four time histories were applied to both the experimental and numerical models, and the experimental model showed acceptable accuracy compared with the numerical model output. 
Therefore, this modelling methodology is valid and qualified for different shaking-table experiments.
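The scaling relations that link a 1/50 model to its prototype follow from dimensional analysis. The sketch below is not taken from the paper; it shows one common gravity-consistent similitude scheme in which acceleration is kept at the full-scale value (a_r = 1, since gravity cannot be scaled), so time scales as the square root of length:

```python
import math

def similitude_factors(length_ratio):
    """Model-to-prototype scale factors for a gravity-consistent scaled
    model with acceleration unscaled (a_r = 1).
    length_ratio: model length / prototype length, e.g. 1/50."""
    lr = length_ratio
    return {
        "length": lr,
        "acceleration": 1.0,
        "time": math.sqrt(lr),          # from a_r = l_r / t_r**2 = 1
        "frequency": 1.0 / math.sqrt(lr),
        "velocity": math.sqrt(lr),      # v_r = l_r / t_r
        "displacement": lr,
    }

f = similitude_factors(1 / 50)
# A 10 s prototype accelerogram lasts 10 * f["time"] ≈ 1.41 s on the table,
# and model natural frequencies are ≈ 7.07 times the prototype's.
print(f["time"], f["frequency"])
```

Schemes that also scale material stiffness or add artificial mass lead to different factor sets; the a_r = 1 choice here is only one of several used in shake-table practice.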

Keywords: structure, raft, soil, interaction

Procedia PDF Downloads 136
450 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the research field of nutritional biology, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for the management of pathologies in the food sector (for example, intolerances and allergies, management of cholesterol metabolism, diabetic pathologies, arterial hypertension, and obesity, breathing and sleep problems). In this regard, this research work created a system capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centred design (ISO 9241-210); it therefore keeps pace with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists of drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to the nutritional data, together with a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules, based on time series and visual analytics techniques, that allow the complete picture of the situation and the evolution of the diet assigned for specific pathologies to be evaluated.

Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm

Procedia PDF Downloads 23
449 An Eco-Systemic Typology of Fashion Resale Business Models in Denmark

Authors: Mette Dalgaard Nielsen

Abstract:

The paper provides an eco-systemic typology of fashion resale business models in Denmark while pointing to possibilities to learn from its wisdom at a time when a fundamental break with the dominant linear fashion paradigm has become inevitable. As we transgress planetary boundaries and can no longer continue the unsustainable path of over-exploiting the Earth's resources, the global fashion industry faces a tremendous need for change. One of the preferred answers to the fashion industry's sustainability crises lies in the circular economy, which aims to maximize the utilization of resources by keeping garments in use for longer. Thus, in the context of fashion, resale business models that allow pre-owned garments to change hands and be reused in continuous cycles are considered to be among the most efficient forms of circularity. Methodologies: The paper is based on empirical data from an ongoing project and a series of qualitative pilot studies conducted on the Danish resale market over a two-year period from Fall 2021 to Fall 2023. The methodological framework comprises (n)ethnography and fieldwork in selected resale environments, as well as semi-structured interviews and a workshop with eight business partners from the Danish fashion and textiles industry. By focusing on the real-world circulation of pre-owned garments enabled by the identified resale business models, the research lets go of simplistic hypotheses in favour of dynamic, vibrant and non-linear processes. As such, the paper contributes to the emerging research field of circular economy and fashion, which is in critical need of moving from unverified concepts and theories to empirical evidence. Findings: Based on the empirical data and anchored in the business partners, the paper analyses and presents five distinct resale business models with different product, service and design characteristics. 
These are 1) branded resale, 2) trade-in resale, 3) peer-2-peer resale, 4) resale boutiques and consignment shops and 5) resale shelf/square meter stores and flea markets. Together, the five business models represent a plurality of resale-promoting business model design elements that have been found to contribute to the circulation of pre-owned garments in various ways for different garments, users and businesses in Denmark. Hence, the provided typology points to the necessity of prioritizing several rather than single resale business model designs, services and initiatives for the resale market to help reconfigure the linear fashion model and create a circular-ish future. Conclusions: The article represents a twofold research ambition by 1) presenting an original, up-to-date eco-systemic typology of resale business models in Denmark and 2) using the typology and its eco-systemic traits as a tool to understand different business model design elements and possibilities to help fashion grow out of its linear growth model. By basing the typology on eco-systemic mechanisms and actual exemplars of resale business models, it becomes possible to envision the contours of a genuine alternative to business as usual that ultimately helps bend the linear fashion model towards circularity.

Keywords: circular business models, circular economy, fashion, resale, strategic design, sustainability

Procedia PDF Downloads 59
448 Memory Retrieval and Implicit Prosody during Reading: Anaphora Resolution by L1 and L2 Speakers of English

Authors: Duong Thuy Nguyen, Giulia Bencini

Abstract:

The present study examined structural and prosodic factors in the computation of antecedent-reflexive relationships and sentence comprehension by native English speakers (L1) and Vietnamese-English bilinguals (L2). Participants read sentences presented on a computer screen in one of three presentation formats aimed at manipulating prosodic parsing: word-by-word (RSVP), phrase-segment (self-paced), or whole-sentence (self-paced), then completed a grammaticality rating and a comprehension task (following Pratt & Fernandez, 2016). The design crossed three factors: syntactic structure (simple; complex), grammaticality (target-match; target-mismatch) and presentation format. An example item is provided in (1): (1) The actress that (Mary/John) interviewed at the awards ceremony (about two years ago/organized outside the theater) described (herself/himself) as an extreme workaholic. Results showed that overall, both L1 and L2 speakers made use of a good-enough processing strategy at the expense of more detailed syntactic analysis. L1 and L2 speakers' comprehension and grammaticality judgements were negatively affected by the most prosodically disruptive condition (word-by-word). However, the two groups differed in their performance in the other two reading conditions. For L1 speakers, the whole-sentence and phrase-segment formats were both facilitative in the grammaticality rating and comprehension tasks; for L2 speakers, compared with the whole-sentence condition, the phrase-segment paradigm did not significantly improve accuracy or comprehension. These findings are consistent with those of Pratt & Fernandez (2016), who found a similar pattern of results in the processing of subject-verb agreement relations using the same experimental paradigm and prosodic manipulation with L1 English and L2 English-Spanish speakers. 
The results provide further support for a Good-Enough cue model of sentence processing that integrates cue-based retrieval and implicit prosodic parsing (Pratt & Fernandez, 2016) and highlights similarities and differences between L1 and L2 sentence processing and comprehension.

Keywords: anaphora resolution, bilingualism, implicit prosody, sentence processing

Procedia PDF Downloads 152
447 Ecological and Cartographic Study of the Cork Oak of the Forest of Mahouna, North-Eastern Algeria

Authors: Amina Beldjazia, Djamel Alatou, Khaled Missaoui

Abstract:

The forest of Mahouna is part of the Tell Atlas mountain range in north-eastern Algeria and is characterized by significant biodiversity. Managing this resource requires a thorough understanding of the current state of the vegetation (inventories) and the degradation factors, together with ongoing monitoring of the various long-term ecological changes. Digital mapping is a very effective route to in-depth knowledge of natural resources. Producing a vegetation map from satellite images and aerial photographs with a geographic information system (GIS) yields valuable results on the vegetation of the massif from both a scientific viewpoint (the development of a database of the different formations on the site and their ecological conditions) and an economic one (GIS facilitates the task of managing the forest's various resources and its diversity). The methodology comprises three stages: the first involves an analysis of climate data (1988 to 2013); the second consists of field surveys (soil and phytoecological) conducted during June and July 2013 (10 readings); the third is based on the development of different thematic and synthetic maps using GIS software (ENVI 4.6 and ArcMap 10). The results show that cork oak covers an area of 1147 ha. Depending on the environmental conditions, it grows on sandstone and is distributed across three vegetation layers: the thermo-Mediterranean (40 ha in the north-east), the meso-Mediterranean (1061 ha) and the supra-Mediterranean (46 ha). The map shows the current actual state of the cork oak forest of the Mahouna massif: it is an old forest (>150 years) in which regeneration is absent because of several factors (fires, overgrazing, leaching, erosion, etc.). The cork oak occurs as dense forest with laburnum and heather as the dominant species. 
It may also occur as open forest dominated by scrub species: Daphne gnidium, Erica arborea, Calycotome spinosa, Phillyrea angustifolia, Lavandula stoechas, Cistus salvifolius.

Keywords: biodiversity, environment, Mahouna, cork oak

Procedia PDF Downloads 443
446 Urban Ethical Fashion Networks of Design, Production and Retail in Taiwan

Authors: WenYing Claire Shih, Konstantinos Agrafiotis

Abstract:

The circular economy has become one of the seven fundamental pillars of Taiwan’s economic development, as promulgated by the government. The model of the circular economy, with its fundamental premise of waste elimination, can transform the textile and clothing sectors from major polluting industries into much cleaner alternatives for a better quality of life for all citizens. In a related vein, the notion of the creative economy, and more specifically the fashion industry, can prompt similar results in terms of job and wealth creation. The combined forces of the circular and creative economies and their beneficial output have resulted in the configuration of ethical urban networks, which potentially may lead to sources of competitive advantage. All actors involved in the configuration of this urban ethical fashion network, from public authorities to private enterprise, can bring about positive changes in the urban setting. Preliminary results from action research show that this configuration is attainable in terms of circularity: by reducing fabric waste produced by local textile mills, and through innovative methods of design, production, and retail around urban spaces, the network has managed to generate a stream of jobs and financial revenues for all participants. The municipal authorities, as the facilitating platform, have been of paramount importance in this public-private partnership. In an exploratory pilot study of the network’s circular production and consumption of fashion products, we observed a positive disposition among participants. Once the network becomes fully functional by attracting more participating firms from the textile and clothing sectors, it can benefit Taiwan’s soft power in the region and simultaneously raise citizens’ awareness of circular methods of fashion production, consumption, and disposal, which can improve urban lifestyles and may open export horizons for the firms.

Keywords: the circular economy, the creative economy, ethical urban networks, action research

Procedia PDF Downloads 136
445 A Preliminary Kinematic Comparison of Vive and Vicon Systems for the Accurate Tracking of Lumbar Motion

Authors: Yaghoubi N., Moore Z., Van Der Veen S. M., Pidcoe P. E., Thomas J. S., Dexheimer B.

Abstract:

Optoelectronic 3D motion capture systems, such as the Vicon kinematic system, are widely utilized in biomedical research to track joint motion. These systems are considered powerful and accurate measurement tools with <2 mm average error. However, these systems are costly and may be difficult to implement and utilize in a clinical setting. 3D virtual reality (VR) is gaining popularity as an affordable and accessible tool to investigate motor control and perception in a controlled, immersive environment. The HTC Vive VR system includes puck-style trackers that seamlessly integrate into its VR environments. These affordable, wireless, lightweight trackers may be more feasible for clinical kinematic data collection. However, the accuracy of HTC Vive Trackers (3.0), when compared to optoelectronic 3D motion capture systems, remains unclear. In this preliminary study, we compared the HTC Vive Tracker system to a Vicon kinematic system in a simulated lumbar flexion task. A 6-DOF robot arm (SCORBOT ER VII, Eshed Robotec/RoboGroup, Rosh Ha’Ayin, Israel) completed various reaching movements to mimic increasing levels of hip flexion (15°, 30°, 45°). Light reflective markers, along with one HTC Vive Tracker (3.0), were placed on the rigid segment separating the elbow and shoulder of the robot. We compared position measures simultaneously collected from both systems. Our preliminary analysis shows no significant differences between the Vicon motion capture system and the HTC Vive tracker in the Z axis, regardless of hip flexion. In the X axis, we found no significant differences between the two systems at 15 degrees of hip flexion but minimal differences at 30 and 45 degrees, ranging from .047 cm ± .02 SE (p = .03) at 30 degrees hip flexion to .194 cm ± .024 SE (p < .0001) at 45 degrees of hip flexion. In the Y axis, we found a minimal difference for 15 degrees of hip flexion only (.743 cm ± .275 SE; p = .007). 
This preliminary analysis shows that the HTC Vive Tracker may be an appropriate, affordable option for gross motor motion capture when the Vicon system is not available, such as in clinical settings. Further research is needed to compare these two motion capture systems in different body poses and for different body segments.
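Agreement between two simultaneously sampled position traces can be summarized as per-axis bias (mean signed difference) and RMSE. The sketch below is illustrative only: it assumes the traces are already time-aligned and expressed in a common reference frame, and is not the study's actual analysis pipeline.

```python
import numpy as np

def per_axis_agreement(vicon_xyz, vive_xyz):
    """Compare two synchronized N x 3 position traces (cm).

    Returns per-axis mean signed difference (bias) and RMSE between
    the two systems. Assumes the traces are time-aligned and expressed
    in a common coordinate frame.
    """
    vicon = np.asarray(vicon_xyz, dtype=float)
    vive = np.asarray(vive_xyz, dtype=float)
    diff = vive - vicon                       # signed error per sample, per axis
    mean_diff = diff.mean(axis=0)             # bias along X, Y, Z
    rmse = np.sqrt((diff ** 2).mean(axis=0))  # spread along X, Y, Z
    return mean_diff, rmse

# toy traces: identical except a constant 0.05 cm offset on X
t = np.linspace(0, 1, 100)
vicon = np.column_stack([np.sin(t), np.cos(t), t])
vive = vicon + np.array([0.05, 0.0, 0.0])
bias, rmse = per_axis_agreement(vicon, vive)
print(bias)  # X bias ~0.05 cm; Y and Z ~0
```

In practice a spatial registration step (e.g. a rigid-body fit between the two coordinate systems) would precede such a comparison.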

Keywords: lumbar, Vive tracker, Vicon system, 3D motion, ROM

Procedia PDF Downloads 101
444 An Efficient Hardware/Software Workflow for Multi-Cores Simulink Applications

Authors: Asma Rebaya, Kaouther Gasmi, Imen Amari, Salem Hasnaoui

Abstract:

Over the last years, applications such as telecommunications, signal processing, and digital communication with advanced features (multi-antenna, equalization, etc.) have witnessed a rapid evolution accompanied by an increase in user requirements in terms of latency, computational power, and so on. To satisfy these requirements, hardware/software systems are a common solution, where the hardware is composed of multiple cores and the software is represented by a model of computation, for instance a synchronous dataflow (SDF) graph. Moreover, most embedded system designers use Simulink for modeling. The issue is how to simplify C code generation, for a multi-core platform, from an application modeled in Simulink. To overcome this problem, we propose a workflow that automatically transforms the Simulink model into an SDF graph and provides an efficient schedule that optimizes the number of cores and minimizes latency. The workflow starts from a Simulink application and a hardware architecture described in the IP-XACT language. Based on the synchronous and hierarchical behavior of both models, the Simulink block diagram is automatically transformed into an SDF graph. Once this step is achieved, the scheduler computes the optimal number of cores by minimizing the maximum density of the whole application. A core is then chosen to execute each graph task in a specific order, and compatible C code is generated. To implement this proposal, we extend Preesm, a rapid prototyping tool, to take the Simulink model as input and to support the optimal schedule. We then compared our results with those of Preesm, using a simple illustrative application. The comparison shows that our results strictly dominate Preesm's in terms of number of cores and latency: where Preesm needs m processors and latency L, our workflow needs fewer processors and a latency L' < L.
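For readers unfamiliar with SDF scheduling: before any schedule can be built, the repetition vector must be obtained from the balance equations r[u]·prod = r[v]·cons on every edge. The sketch below illustrates that generic step only; it is not the paper's scheduler, and the actor names and rates are invented.

```python
from collections import defaultdict
from fractions import Fraction
from math import lcm

def repetition_vector(actors, edges):
    """Solve the SDF balance equations r[u]*prod == r[v]*cons for every edge.

    edges: list of (src, dst, prod_rate, cons_rate).
    Returns the smallest positive integer firing counts per iteration,
    or raises ValueError if the graph is rate-inconsistent.
    """
    rate = {}
    adj = defaultdict(list)
    for u, v, p, c in edges:
        adj[u].append((v, Fraction(p, c)))  # r[v] = r[u] * p / c
        adj[v].append((u, Fraction(c, p)))  # and the reverse relation
    for start in actors:
        if start in rate:
            continue
        rate[start] = Fraction(1)
        stack = [start]
        while stack:
            u = stack.pop()
            for v, f in adj[u]:
                want = rate[u] * f
                if v not in rate:
                    rate[v] = want
                    stack.append(v)
                elif rate[v] != want:
                    raise ValueError("inconsistent SDF graph")
    # scale fractional rates to the smallest integer solution
    scale = lcm(*(r.denominator for r in rate.values()))
    return {a: int(rate[a] * scale) for a in actors}

# A produces 2 tokens, B consumes 3; B produces 1, C consumes 2:
print(repetition_vector(["A", "B", "C"],
                        [("A", "B", 2, 3), ("B", "C", 1, 2)]))
# → {'A': 3, 'B': 2, 'C': 1}
```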

Keywords: hardware/software system, latency, modeling, multi-cores platform, scheduler, SDF graph, Simulink model, workflow

Procedia PDF Downloads 269
443 Apple in the Big Tech Oligopoly: An Analysis of Disruptive Innovation Trends and Their Influence on the Capacity of Conserving a Positive Social Impact as Primary Purpose

Authors: E. Loffi Borghese

Abstract:

In this comprehensive study, we delve into the intricate dynamics of the big tech oligopoly, focusing on Apple as a case study. The core objective is to scrutinize the evolving relationship between a firm's commitment to positive social impact as its primary purpose and its resilience in the face of disruptive innovations within the big tech market. Our exploration begins with a theoretical framework emphasizing the significance of distinguishing between corporate social responsibility and social impact as a primary purpose. Drawing on insights from Drumwright and from Bartkus and Glassman, we underscore the transformative potential when a firm aligns its core business with a social mission, transcending mere side activities. Examining successful firms such as Apple, we adopt Sinek's perspective on inspirational leadership and the "golden circle." This framework sheds light on why some organizations, like Apple, succeed in making positive social impact their primary purpose. Apple's early-stage life cycle is dissected, revealing a profound commitment to challenging the status quo and promoting simpler alternatives that resonate with its users' lives. The study then navigates through industry life cycles, drawing on Klepper's stages and Christensen's disruptive innovations. Apple's dominance in the big tech oligopoly is contrasted with companies like Harley-Davidson and Polaroid, illustrating the consequences of failing to adapt to disruptive innovations. The study employs a qualitative approach, leveraging sources such as the ECB, Forbes, Our World in Data, and scientific articles. A secondary data analysis probes Apple's market evolution within the big tech oligopoly, emphasizing the shifts in market context and innovation trends that demand strategic adaptations. The subsequent sections scrutinize Apple's present innovation strategies, highlighting its diversified product portfolio and intensified focus on big data.
We examine the implications of these shifts on Apple's capacity to maintain positive social impact as its primary purpose, pondering potential consequences on its brand perception. The study culminates in a reflection on the broader implications of the big tech oligopoly's dominance. It contemplates the diminishing competitiveness in the market and the potential sidelining of positive social impact as a competitive advantage. The expansion of tech firms into diverse sectors raises concerns about negative societal impacts, prompting a call for increased regulatory attention and awareness. In conclusion, this research serves as a catalyst for heightened awareness and discussion on the intricate interplay between firms' social impact goals, disruptive innovations, and the broader societal implications within the evolving landscape of the big tech oligopoly. Despite limitations, this study aims to stimulate further research, urging a conscious and responsible approach to shaping the future economic system.

Keywords: innovation trends, market dynamics, social impact, tech oligopoly

Procedia PDF Downloads 74
442 An Argument for Agile, Lean, and Hybrid Project Management in Museum Conservation Practice: A Qualitative Evaluation of the Morris Collection Conservation Project at the Sainsbury Centre for Visual Arts

Authors: Maria Ledinskaya

Abstract:

This paper is part case study and part literature review. It seeks to introduce Agile, Lean, and Hybrid project management concepts from business, software development, and manufacturing fields to museum conservation by looking at their practical application on a recent conservation project at the Sainsbury Centre for Visual Arts. The author outlines the advantages of leaner and more agile conservation practices in today’s faster, less certain, and more budget-conscious museum climate where traditional project structures are no longer as relevant or effective. The Morris Collection Conservation Project was carried out in 2019-2021 in Norwich, UK, and concerned the remedial conservation of around 150 Abstract Constructivist artworks bequeathed to the Sainsbury Centre by private collectors Michael and Joyce Morris. It was a medium-sized conservation project of moderate complexity, planned and delivered in an environment with multiple known unknowns – unresearched collection, unknown conditions and materials, unconfirmed budget. The project was later impacted by the COVID-19 pandemic, introducing indeterminate lockdowns, budget cuts, staff changes, and the need to accommodate social distancing and remote communications. The author, then a staff conservator at the Sainsbury Centre who acted as project manager on the Morris Project, presents an incremental, iterative, and value-based approach to managing a conservation project in an uncertain environment. The paper examines the project from the point of view of Traditional, Agile, Lean, and Hybrid project management. The author argues that most academic writing on project management in conservation has focussed on a Traditional plan-driven approach – also known as Waterfall project management – which has significant drawbacks in today’s museum environment due to its over-reliance on prediction-based planning and its low tolerance to change. 
In the last 20 years, alternative Agile, Lean and Hybrid approaches to project management have been widely adopted in software development, manufacturing, and other industries, although their recognition in the museum sector has been slow. Using examples from the Morris Project, the author introduces key principles and tools of Agile, Lean, and Hybrid project management and presents a series of arguments on the effectiveness of these alternative methodologies in museum conservation, including the ethical and practical challenges to their implementation. These project management approaches are discussed in the context of consequentialist, relativist, and utilitarian developments in contemporary conservation ethics. Although not intentionally planned as such, the Morris Project had a number of Agile and Lean features which were instrumental to its successful delivery. These key features are identified as distributed decision-making, a co-located cross-disciplinary team, servant leadership, focus on value-added work, flexible planning done in shorter sprint cycles, light documentation, and emphasis on reducing procedural, financial, and logistical waste. Overall, the author’s findings point in favour of a hybrid model, which combines traditional and alternative project processes and tools to suit the specific needs of the project.

Keywords: agile project management, conservation, hybrid project management, lean project management, waterfall project management

Procedia PDF Downloads 71
441 Demonstration of Land Use Changes Simulation Using Urban Climate Model

Authors: Barbara Vojvodikova, Katerina Jupova, Iva Ticha

Abstract:

Cities have always adapted their internal structure to the needs of society over their historical evolution (for example, protective city walls from the classical era lost their defensive function, became unnecessary, were demolished, and gave space to new features such as roads, museums, or parks). Today it is necessary to modify the internal structure of the city in order to minimize the impact of climate change on the environment of the population. This article discusses results obtained with the Urban Climate model owned by VITO, applied as part of the European Union's Horizon 2020 project Pan-European Urban Climate Services, Climate-fit.city (grant agreement No 730004). The model was applied to changes in land use and land cover in cities in relation to urban heat islands (UHI). The task was to evaluate possible land use change scenarios in connection with city requirements and ideas. Two pilot areas in the Czech Republic were selected: Ostrava and Hodonín. The paper demonstrates the application of the model to various possible future development scenarios and assesses their suitability or unsuitability depending on the resulting temperature increase. Cities preparing to reconstruct public space are interested in eliminating, as early as the assignment phase, proposals that would lead to increased temperature stress; if they have an evaluation showing that a certain type of design is unsuitable, they can exclude it in the proposal phase. Therefore, especially when applying the model at the local level, at 1 m spatial resolution, it was necessary to show which types of proposal would create a significant heat island if implemented; such proposals are considered unsuitable. The model shows that a building itself can create shaded places and thus contribute to reducing the UHI.
If the protection of existing greenery is approached sensitively, new construction may not pose a significant problem; more massive interventions that reduce existing greenery, however, create new heat islands.

Keywords: climate model, heat islands, Hodonin, land use changes, Ostrava

Procedia PDF Downloads 143
440 Effects of Partial Sleep Deprivation on Prefrontal Cognitive Functions in Adolescents

Authors: Nurcihan Kiris

Abstract:

Restricted sleep is common in young adults and adolescents, yet few objective studies have clarified the effects of sleep deprivation on cognitive performance. In particular, the effect of sleep deprivation on cognitive functions associated with the frontal lobe, such as attention, executive functions, and working memory, is not well known. The aim of this study is to investigate experimentally the effect of partial sleep deprivation in adolescents on frontal-lobe cognitive tasks covering working memory, strategic thinking, simple attention, continuous attention, executive functions, and cognitive flexibility. Subjects were recruited from volunteer students of Cukurova University. Eighteen adolescents underwent four consecutive nights of monitored sleep restriction (6–6.5 hr/night) and four nights of sleep extension (10–10.5 hr/night), in counterbalanced order and separated by a washout period. Following each sleep period, cognitive performance was assessed at a fixed morning time using a computerized neuropsychological battery of frontal lobe function tasks, a timed test providing both accuracy and reaction time outcome measures. Of the cognitive tasks, only spatial working memory performance was significantly lower in the restricted sleep condition than in the extended sleep condition. There was no significant difference in the tasks evaluating simple attention, continuous attention, executive functions, and cognitive flexibility. It is thought that spatial working memory and strategic thinking skills in particular may be susceptible to sleep deprivation in adolescents. Conversely, adolescents are predicted to perform optimally under ideal sleep conditions, especially in circumstances requiring short-term storage of visual information, processing of stored information, and strategic thinking.
The findings may also point to possible negative functional effects of partial sleep deprivation on the processing of academic, social, and emotional inputs in adolescents. Acknowledgment: This research was supported by the Cukurova University Scientific Research Projects Unit.

Keywords: attention, cognitive functions, sleep deprivation, working memory

Procedia PDF Downloads 156
439 In-Situ Sludge Minimization Using Integrated Moving Bed Biofilm Reactor for Industrial Wastewater Treatment

Authors: Vijay Sodhi, Charanjit Singh, Neelam Sodhi, Puneet P. S. Cheema, Reena Sharma, Mithilesh K. Jha

Abstract:

The management and secure disposal of the biosludge generated by the widely commercialized conventional activated sludge (CAS) treatment has become a potential environmental issue, and a sustainable technological upgrade of CAS for sludge yield minimization has recently gained the serious attention of the scientific community. A number of recently reported studies effectively addressed remedial technological advancements, but these were almost exclusively limited to municipal wastewater. Moreover, a critical review of the literature identifies side-stream sludge minimization as a complex task to maintain. In this work, therefore, a hybrid moving bed biofilm reactor (MBBR) configuration (named the AMOMOX process) is demonstrated for in-situ minimization of the excess biosludge generated from high-organic-strength tannery wastewater. AMOMOX stands for anoxic MBBR (AM), aerobic MBBR (OM), and oxic CAS (OX): a combined arrangement of an anoxic MBBR and an aerobic MBBR coupled with an oxic CAS. The AMOMOX system was run in parallel with an identical CAS reactor, and both configurations were fed with the same influent to judge real-time operational changes. For the AMOMOX process, strict maintenance of the operational strategies resulted in about 95% removal of NH4-N and SCOD from the tannery wastewater. Here, the nourishment of filamentous microbiota and the purposeful promotion of cell lysis effectively lowered the observed sludge yield (Yobs) to 0.51 kg VSS/kg COD. As a result, the volatile solids scarcity apparent in the AMOMOX system achieved up to a 47% reduction of the excess biosludge. These findings were further corroborated by FE-SEM imaging and thermogravimetric analysis. The identification of the microbial strains inhabiting the system under its extended SRT (23-26 days), however, remains a matter for further research.

Keywords: tannery wastewater, moving bed biofilm reactor, sludge yield, sludge minimization, solids retention time

Procedia PDF Downloads 71
438 Impact of COVID-19 on Antenatal Care Provision at Public Hospitals in Ethiopia: A Mixed Method Study

Authors: Zemenu Yohannes

Abstract:

Introduction: The COVID-19 pandemic overstretched the weak health systems of developing countries, including Ethiopia. This study aims to assess and explore the effect of COVID-19 on antenatal care (ANC) provision. Methods: A concurrent mixed-methods study was applied: an interrupted time series design for the quantitative component, and in-depth interviews for the qualitative component, exploring maternity care providers' perceptions of ANC provision during COVID-19. We used routinely collected monthly data from the health management information system (HMIS) of fifteen hospitals in the Sidama region, Ethiopia, covering March 2019 to February 2020 (12 months, before COVID-19) and March to August 2020 (6 months, during COVID-19). Data were imported into STATA V.17 for analysis. The mean monthly incidence rate ratio (IRR) of ANC provision was calculated using Poisson regression with a 95% confidence interval. The qualitative data were analysed using thematic analysis, and findings from the quantitative and qualitative elements were integrated using a contiguous approach. Results: The rate of ANC provision decreased significantly in the first six months of COVID-19. Three main themes were identified: barriers to ANC provision, an inadequate COVID-19 prevention approach, and delays in providing ANC. Conclusion and recommendation: The pandemic affected ANC provision in the study area. The health bureau and stakeholders should take a novel and sustainable approach to prepare for future pandemics. The health bureau and hospital administrators should establish a task force, relying on financial self-reliance, to close gaps in medical supply shortages during future pandemics. Pregnant women should receive their care promptly from maternity care providers. To foster contact and avoid discrimination in future pandemics, hospital administrators should set up a platform for community members and maternity care providers.
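The IRR reported here is, in essence, the ratio of mean monthly visit rates during versus before the pandemic. The calculation can be sketched as below, with a Wald 95% CI on the log scale (equivalent to a two-level Poisson model); the counts are illustrative only, not the study's data, which were analysed with Poisson regression in STATA.

```python
import math

def incidence_rate_ratio(count_before, months_before, count_during, months_during):
    """IRR of monthly visit counts during vs. before an interruption,
    with a Wald 95% CI computed on the log scale under a Poisson model."""
    rate_before = count_before / months_before
    rate_during = count_during / months_during
    irr = rate_during / rate_before
    se_log = math.sqrt(1 / count_before + 1 / count_during)  # Poisson SE of log IRR
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, (lo, hi)

# illustrative counts only: 12 pre-pandemic months vs. first 6 COVID-19 months
irr, ci = incidence_rate_ratio(count_before=12000, months_before=12,
                               count_during=4500, months_during=6)
print(f"IRR = {irr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
# → IRR = 0.75, 95% CI = (0.72, 0.78)
```

A full interrupted time series analysis would additionally model the pre-existing trend and seasonality, which this two-group comparison omits.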

Keywords: ANC provision, COVID-19, mixed methods study, Ethiopia

Procedia PDF Downloads 74
437 Investigating the Application of Composting for Phosphorous Recovery from Alum Precipitated and Ferric Precipitated Sludge

Authors: Saba Vahedi, Qiuyan Yuan

Abstract:

A vast majority of small municipalities and First Nations communities in Manitoba operate facultative or aerated lagoons for wastewater treatment, and most of them use ferric chloride (FeCl3) or alum (usually in the form of Al2(SO4)3·18H2O) as a coagulant for phosphorus removal. The insoluble particles that form during coagulation result in a massive volume of sludge, which is typically left in the lagoons; phosphorus, a valuable nutrient, is therefore lost in the process. This project investigates the recovery of phosphorus from the sludge produced during phosphorus removal in wastewater lagoons by means of a controlled composting process. Objective: The main objective is to compost the alum-precipitated sludge produced during phosphorus removal in Manitoba wastewater treatment lagoons, with the ultimate goal of a product that meets the characteristics of Class A biosolids in Canada. A number of parameters will be evaluated, including the bioavailability of nutrients in the composted sludge and its toxicity; in particular, the bioavailability of phosphorus in the final compost product will be assessed by using the compost as a source of P in comparison with a commercial fertilizer (monoammonium phosphate, MAP). Experimental setup: Three batches of compost piles were run using the alum sludge and the ferric sludge. The alum phosphate sludge was collected from an innovative phosphorus removal system at the RM of Taché. The collected sludge was sent to the ALS laboratory for analysis of the C/N ratio, TP, TN, TC, total Al, moisture content, pH, and metal concentrations. Wood chips, used as the bulking agent, were collected at the RM of Taché landfill. The sludge in the three piles was mixed with 3x dry wood chips, and the mixture was turned manually every week; temperature, moisture content, and pH were monitored twice a week.
The temperature of the mixtures remained above 55 °C for two weeks, and each pile was kept for ten weeks to mature. The final products have been applied to two different plants to investigate the bioavailability of P in the compost product as well as its toxicity. The two plant species, canola and switchgrass, were selected based on their sensitivity, growth time, and compatibility with the Manitoba climate. The pots are weighed and watered every day to replenish moisture lost by evapotranspiration. A control experiment is also conducted using topsoil and chemical fertilizer (MAP). The experiment is carried out in a growth room maintained at a day/night temperature regime of 25/15 °C, a relative humidity of 60%, and a photoperiod of 16 h. A total of three cropping (seeding to harvest) cycles need to be completed, each 50 d in duration. Harvested biomass must be weighed and oven-dried for 72 h at 60 °C. In the first growth cycle, the canola and switchgrass grown in the alum sludge compost were harvested at day 50, oven-dried, chopped, finely ground in a mill grinder (<0.2 mm), and digested using the wet oxidation method, in which plant tissue samples are digested with H2SO4 (99.7%) and H2O2 (30%) in an acid block digester. The digested plant samples will then be analyzed to measure total phosphorus.

Keywords: wastewater treatment, phosphorus removal, composting alum sludge, bioavailability of phosphorus

Procedia PDF Downloads 71
436 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has generally become a challenging task nowadays, and the processing of brain MRI images is one of the more difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction, and analysis steps. The approach centers on a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, of the kind commonly used for early prediction of brain tumors, here using advanced image processing and probabilistic neural network methods. Advanced noise removal functions and image processing methods such as automatic segmentation and morphological operations are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. First, the tumor is detected and its area contoured with a colored circle by the computer-aided diagnostics program. The tumor is then segmented, and morphological operations are applied to increase the visibility of the tumor area; meanwhile, the tumor area and important shape-based features are calculated. Finally, using the probabilistic neural network method and advanced classification steps, the tumor area and the type of the tumor are obtained. A future aim of this study is to detect the severity of lesions across classes of brain tumor, through advanced multi-class classification and neural network stages, and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images are used to train the diagnostics system, and 100 out-of-sample images are used to test and check the overall results.
The preliminary results demonstrate high classification accuracy for the neural network structure. These results also motivate us to extend this framework to detect and localize tumors in other organs.
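The segmentation-plus-morphology stage described above follows a standard pattern: threshold the image, clean the binary mask with morphological opening and closing, and keep the largest connected component. A minimal Python/SciPy sketch of that generic pattern on a synthetic image is given below; the study itself works in MATLAB, and the threshold rule used here is an assumption, not the paper's method.

```python
import numpy as np
from scipy import ndimage

def segment_bright_region(img, threshold=None):
    """Threshold an image, clean the mask with morphological
    opening/closing, and keep the largest connected component.
    The default threshold (mean + 2*std) is an illustrative choice."""
    if threshold is None:
        threshold = img.mean() + 2 * img.std()
    mask = img > threshold
    mask = ndimage.binary_opening(mask)   # drop speckle noise
    mask = ndimage.binary_closing(mask)   # fill small holes
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.zeros_like(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)  # largest blob only

# synthetic 64x64 "slice": dim noisy background, one bright 10x10 region
img = np.random.default_rng(0).normal(0.2, 0.02, (64, 64))
img[20:30, 25:35] = 1.0
mask = segment_bright_region(img)
print(mask.sum())  # roughly the bright region's area (~100 px;
                   # the opening may trim the corners)
```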

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 418
435 Improving Cleanability by Changing Fish Processing Equipment Design

Authors: Lars A. L. Giske, Ola J. Mork, Emil Bjoerlykhaug

Abstract:

The design of fish processing equipment greatly impacts how easy the equipment is to clean. This is a critical issue, as cleaning of fish processing equipment is costly and time-consuming, in addition to being very important for product quality; in the worst case, poorly cleaned equipment could lead to contaminated product from which consumers could fall ill. This paper elucidates how equipment design changes can ease the work of the cleaners and save money for fish processing facilities, by looking at a case of product design improvements. "Design for cleaning" is the new trend in the industry: equipment in which ease of cleaning has been prioritized gains a competitive advantage over equipment in which it has not, and it is accordingly an important research area for equipment manufacturers. SeaSide AS continuously improves the design of its products in order to gain such an advantage. The focus of this paper is conveyors for internal logistics, and a product called the "electro stunner" is studied with regard to design for cleaning. Ideas for new products or product improvements are sketched out, often together with SeaSide's customers, then 3D-modelled, discussed, revised, built, and delivered. Feedback from the customers is taken into consideration, and the product design is revised once again. This loop was repeated multiple times and led to new product designs, which sometimes also caused the manufacturing processes to change (as in going from bolted to welded connections). Customers report that the concrete changes applied to SeaSide's products have resulted in overall more easily cleaned equipment.
These changes include, but are not limited to: welded connections (as opposed to bolted connections), gaps between contact faces, opening up structures to allow cleaning "inside" equipment, and generally avoiding areas in which humidity and water may gather and build up. This matters because there will always be bacteria in the water, and they will grow wherever an area never dries. The work of creating more cleanable designs is ongoing and will "never" be finished, as new designs and new equipment bring their own challenges.

Keywords: cleaning, design, equipment, fish processing, innovation

Procedia PDF Downloads 237
434 H2/He and H2O/He Separation Experiments with Zeolite Membranes for Nuclear Fusion Applications

Authors: Rodrigo Antunes, Olga Borisevich, David Demange

Abstract:

In future nuclear fusion reactors, tritium self-sufficiency will be ensured by tritium (3H) production via reactions between the fusion neutrons and lithium. To favor tritium breeding, a neutron multiplier must also be used. Both the tritium breeder and the neutron multiplier will be placed in the so-called Breeding Blanket (BB). For the European Helium-Cooled Pebble Bed (HCPB) BB concept, tritium production and neutron multiplication will be ensured by neutron bombardment of Li4SiO4 and Be pebbles, respectively. The produced tritium is extracted from the pebbles by purging them with large flows of He (~10^4 Nm3h-1), doped with small amounts of H2 (~0.1 vol%) to promote tritium extraction via isotopic exchange (producing HT). Due to the presence of oxygen in the pebbles, the production of tritiated water is unavoidable. Therefore, the purge gas downstream of the BB will be composed of Q2/Q2O/He (Q = 1H, 2H, 3H), with Q2/Q2O down to ppm levels, and must be further processed for tritium recovery. A two-stage continuous approach, in which zeolite membranes (ZMs) are followed by a catalytic membrane reactor (CMR), has recently been proposed to fulfil this task. Tritium recovery from Q2/Q2O/He is ensured by the CMR, which, to be efficient, requires a reduction of the gas flow coming from the BB and a pre-concentration of Q2 and Q2O. For this reason, and to keep this stage at reasonable dimensions, ZMs are required upstream to reduce the He flows as much as possible and to concentrate the Q2/Q2O species. Experimental activities have therefore been carried out at the Tritium Laboratory Karlsruhe (TLK) to test the separation performance of different zeolite membranes for H2/H2O/He. First experiments have been performed with binary mixtures of H2/He and H2O/He using commercial MFI-ZSM5 and NaA zeolite-type membranes.
Only the MFI-ZSM5 membrane demonstrated selectivity towards H2, with a separation factor of around 1.5 and H2 permeances of around 0.72 µmol m-2 s-1 Pa-1, largely independent of feed concentration in the range 0.1-10 vol% H2/He. The experiments with H2O/He demonstrated that the separation factor towards H2O is highly dependent on the feed concentration and temperature. For instance, at 30°C the separation factor with NaA is below 2 at 0.2 vol% H2O/He but around 1000 at 5 vol% H2O/He. Overall, the two membranes demonstrated complementary results at equivalent temperatures: at low feed concentrations (≤ 1 vol% H2O/He) MFI-ZSM5 separates better than NaA, whereas the latter has higher separation factors at higher inlet water content (≥ 5 vol% H2O/He). In this contribution, the results obtained with both MFI-ZSM5 and NaA membranes for H2/He and H2O/He mixtures at different concentrations and temperatures are compared and discussed.
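To make the two figures of merit quoted above concrete, the following is a minimal sketch of how a binary separation factor and a permeance are computed from measured compositions and fluxes. All numerical values here are hypothetical illustrations, not data from the TLK experiments.

```python
def separation_factor(y_perm_a, y_perm_b, x_feed_a, x_feed_b):
    """Binary separation factor alpha(A/B): the A/B mole-fraction ratio
    on the permeate side divided by the same ratio on the feed side."""
    return (y_perm_a / y_perm_b) / (x_feed_a / x_feed_b)

def permeance(molar_flux, delta_p):
    """Permeance in mol m^-2 s^-1 Pa^-1: molar flux through the membrane
    divided by the transmembrane partial-pressure difference."""
    return molar_flux / delta_p

# Hypothetical feed of 5 vol% H2O in He, with a permeate strongly
# enriched in H2O (mole fractions chosen for illustration only):
alpha = separation_factor(0.98, 0.02, 0.05, 0.95)
print(alpha)  # 931.0, i.e. the same order as the NaA value quoted above
```

A separation factor near 1 (as for H2 through MFI-ZSM5) means the permeate composition barely differs from the feed, which is why the pre-concentration stage matters.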

Keywords: nuclear fusion, gas separation, tritium processes, zeolite membranes

Procedia PDF Downloads 288
433 New Insights for Soft Skills Development in Vietnamese Business Schools: Defining Essential Soft Skills for Maximizing Graduates’ Career Success

Authors: Hang T. T. Truong, Ronald S. Laura, Kylie Shaw

Abstract:

Within Vietnam's system of higher education, its schools of business play a vital role in supporting the country’s economic objectives. However, the crucial contribution of soft skills to success within the business sector has to date not been adequately recognized by its business schools. The development of the business school curriculum in Vietnam has consequently not been able to 'catch up', so to say, with students' burgeoning need for a comprehensive soft skills program designed to meet the national and global business objectives of their potential employers. The first task of the present paper is to present the results of our survey in Vietnam, which makes explicit the extent to which major Vietnamese industrial employers value the potential role that soft skill competencies can play in maximizing business success. Our final task is to determine which soft skills employers discern as best serving to maximize the economic interests of Vietnam within the global marketplace. Semi-structured telephone interviews were conducted with 15 representative head employers of Vietnam's reputedly largest and most successful business enterprises across the country. The findings of the study indicate that all respondents highly value the increasing importance of soft skills in business success. Our analysis of the respondent data reveals that 19 essential soft skills are deemed by employers to be integral to workplace efficacy and should thus be integrated into the formal business curriculum. We are confident that our study represents the first comprehensive and specific survey yet undertaken within the business sector in Vietnam that accesses and analyses the opinions of representative employers from major companies across the country regarding the growing importance of these 19 soft skills for maximizing overall business success.
Our research findings also reveal that integrating the soft skills we have identified into business school curricula nationwide is of paramount importance for advancing the national and global economic interests of Vietnam.

Keywords: business curriculum, business graduates, employers’ perception, soft skills

Procedia PDF Downloads 320
432 Increased Stability of Rubber-Modified Asphalt Mixtures to Swelling, Expansion and Rebound Effect during Post-Compaction

Authors: Fernando Martinez Soto, Gaetano Di Mino

Abstract:

The application of rubber in bituminous mixtures requires attention and care during mixing and compaction. Rubber modifies the properties of the mixture because it reacts within the internal structure of the bitumen at high temperatures, changing the performance of the mixture (the interaction process of solvents with the binder-rubber aggregate). The main change is an increase in the viscosity and elasticity of the binder due to the larger sizes of the rubber particles in the dry process; however, this positive effect is counteracted by short mixing times, compared to the wet technology, and by the transport processes, curing time and post-compaction of the mixtures. As a result, negative effects such as swelling of the rubber particles, a rebound effect in the specimens and thermal changes due to differential expansion of the structure inside the mixtures can alter the mechanical properties of the rubberized blends. Based on the dry technology, different asphalt-rubber binders using devulcanized or natural rubber (truck and bus tread rubber) have served to demonstrate these effects, and how to mitigate them, in two dense gap-graded rubber-modified asphalt concrete mixes (RUMAC), with the aim of enhancing the stability, workability and durability of samples compacted by the Superpave gyratory compactor method. This paper describes the procedures developed in the Department of Civil Engineering of the University of Palermo from September 2016 to March 2017 for characterizing the post-compaction and mix stability of one conventional mixture (hot mix asphalt without rubber) and two gap-graded rubberized asphalt mixes, graded for rail sub-ballast layers with a nominal aggregate size of Ø22.4 mm according to the European standard.
Thus, the main purpose of this laboratory research is the application of ambient ground rubber from scrap tires, processed at ambient temperature (20°C), inside hot bituminous mixtures (160-220°C) as a substitute for 1.5%, 2% and 3% by weight of the total aggregates (3.2%, 4.2% and 6.2%, respectively, by volume of the limestone aggregates, of bulk density 2.81 g/cm³), considered as an aggregate rather than as part of the asphalt binder. The reference bituminous mixture was designed with 4% binder and ±3% air voids, manufactured with a conventional B50/70 bitumen at mixing and compaction temperatures of 160°C and 145°C to guarantee the workability of the mixes. The rubber proportions proposed are #60-40% for the mixtures with 1.5% and 2% rubber and #20-80% for the mixture with 3% rubber (for example, 60% of Ø0.4-2 mm and 40% of Ø2-4 mm particles). The temperature of the asphalt cement is between 160-180°C for mixing and 145-160°C for compaction, according to the optimal viscosity values obtained with a Brookfield viscometer and 'ring and ball' and penetration tests. The crumb rubber particles act as a rubber aggregate within the mixture, with sizes between 0.4 mm and 2 mm in the first fraction and 2-4 mm in the second. Ambient ground rubber with a specific gravity of 1.154 g/cm³ is used; the rubber is free of loose fabric, wire and other contaminants. Optimal results were found for real beams and cylindrical specimens of each HMA mixture, reducing the swelling effect. The different factors that affect the interaction process, such as temperature, rubber particle sizes, and the number of cycles and pressures of compaction, are explained.
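Two pieces of arithmetic underlie the mix design above: the percent air voids of a compacted specimen (the ±3% target), and the conversion of a weight-based rubber dosage to a volume share using the two quoted densities. The sketch below illustrates both; the air-void formula is the standard one, while the weight-to-volume conversion assumes one plausible basis (rubber volume over total solid volume), and the abstract's quoted figures may rest on a different basis, so the numbers need not match exactly. The specific-gravity inputs to `air_voids` are hypothetical.

```python
def air_voids(gmm, gmb):
    """Percent air voids Va: relative difference between the theoretical
    maximum specific gravity of the loose mix (Gmm) and the bulk specific
    gravity of the compacted specimen (Gmb)."""
    return 100.0 * (gmm - gmb) / gmm

def rubber_volume_pct(w_pct, rho_rubber=1.154, rho_agg=2.81):
    """Approximate volume share of rubber when w_pct % by weight of the
    aggregates is replaced, using the rubber and limestone densities
    quoted above (g/cm^3)."""
    v_rubber = w_pct / rho_rubber          # volume per 100 mass units
    v_agg = (100.0 - w_pct) / rho_agg      # volume of remaining aggregate
    return 100.0 * v_rubber / (v_rubber + v_agg)

print(round(air_voids(2.500, 2.425), 1))   # 3.0 -> within the ±3% target
print(round(rubber_volume_pct(1.5), 1))    # low single digits by volume
```

Because rubber is roughly 2.4 times less dense than the limestone it replaces, even a small weight substitution occupies a disproportionately large volume of the mix, which is why the swelling and rebound effects matter.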

Keywords: crumb-rubber, gyratory compactor, rebounding effect, superpave mix-design, swelling, sub-ballast railway

Procedia PDF Downloads 243
431 Combining Multiscale Patterns of Weather and Sea States into a Machine Learning Classifier for Mid-Term Prediction of Extreme Rainfall in North-Western Mediterranean Sea

Authors: Pinel Sebastien, Bourrin François, De Madron Du Rieu Xavier, Ludwig Wolfgang, Arnau Pedro

Abstract:

Heavy precipitation constitutes a major meteorological threat in the western Mediterranean. Research has investigated the relationship between the states of the Mediterranean Sea and the atmosphere and precipitation over short temporal windows. At larger temporal scales, however, the precursor signals of heavy rainfall in the sea and atmosphere have drawn little attention. Moreover, despite ongoing improvements in numerical weather prediction, the medium-term forecasting of rainfall events remains a difficult task. Here, we aim to investigate the influence of early-spring environmental parameters on the following autumn's heavy precipitation. Hence, we develop a machine learning model to predict extreme autumnal rainfall with a 6-month lead time over the Spanish Catalan coastal area, based on i) the sea pattern (main current, LPC, and Sea Surface Temperature, SST) at the mesoscale, ii) four European weather teleconnection patterns (NAO, WeMo, SCAND, MO) at the synoptic scale, and iii) the hydrological regime of the main local river (the Rhône River). The accuracy of the developed classifier is evaluated via statistical analysis based on classification accuracy, logarithmic loss and the confusion matrix, by comparison with rainfall estimates from rain gauges and satellite observations (CHIRPS-2.0). Sensitivity tests are carried out by changing the model configuration, such as the sea SST, sea LPC, river regime and synoptic atmospheric configuration. The sensitivity analysis suggests a negligible influence of the hydrological regime, unlike the SST, LPC and specific teleconnection weather patterns. Finally, this study illustrates how public datasets can be integrated into a machine learning model for heavy rainfall prediction and may be of interest to local policymakers for management purposes.
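The classifier metrics named above, classification accuracy and the confusion matrix, can be computed from paired label sequences as in the minimal sketch below. The labels and predictions here are hypothetical placeholders (1 = extreme autumn rainfall, 0 = not), not results from the study.

```python
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the observed labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def confusion_matrix(y_true, y_pred, labels):
    """Rows are true labels, columns are predicted labels."""
    counts = Counter(zip(y_true, y_pred))
    return [[counts[(t, p)] for p in labels] for t in labels]

# Hypothetical binary outcomes: 1 = extreme autumn rainfall, 0 = not
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

print(accuracy(y_true, y_pred))                   # 0.75
print(confusion_matrix(y_true, y_pred, [0, 1]))   # [[3, 1], [1, 3]]
```

The off-diagonal cells of the confusion matrix separate missed extreme events from false alarms, a distinction that plain accuracy hides and that matters for hazard management.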

Keywords: extreme hazards, sensitivity analysis, heavy rainfall, machine learning, sea-atmosphere modeling, precipitation forecasting

Procedia PDF Downloads 135
430 Conjunctive Management of Surface and Groundwater Resources under Uncertainty: A Retrospective Optimization Approach

Authors: Julius M. Ndambuki, Gislar E. Kifanyi, Samuel N. Odai, Charles Gyamfi

Abstract:

Conjunctive management of surface and groundwater resources is a challenging task due to the spatially and temporally variable nature of the hydrology and hydrogeology of water storage systems. Surface water-groundwater hydrogeology is highly uncertain; it is therefore imperative that this uncertainty be explicitly accounted for when managing water resources. Various methodologies have been developed and applied by researchers in an attempt to account for this uncertainty. For example, simulation-optimization models are often used for conjunctive water resources management. However, direct application of such an approach, in which all realizations are considered at each iteration of the optimization process, leads to a very expensive optimization in terms of computational time, particularly when the number of realizations is large. The aim of this paper, therefore, is to introduce and apply an efficient approach, referred to as Retrospective Optimization Approximation (ROA), that can be used to optimize the conjunctive use of surface water and groundwater over multiple hydrogeological model simulations. This work is based on a stochastic simulation-optimization framework using the recently emerged sample average approximation (SAA) technique, a sampling-based method implemented within the ROA approach. The ROA approach solves and evaluates a sequence of generated optimization sub-problems with an increasing number of realizations (sample size). A response matrix technique was used to link the simulation model with the optimization procedure, and the k-means clustering sampling technique was used to map the realizations. The methodology is demonstrated through application to a hypothetical example, in which the generated optimization sub-problems were solved and analysed using the 'Active-Set' core optimizer implemented in the MATLAB 2014a environment.
Through the k-means clustering sampling technique, the ROA Active-Set procedure was able to arrive at a (nearly) converged maximum expected total optimal conjunctive water withdrawal rate within relatively few iterations (6 to 7). The results indicate that the ROA approach is a promising technique for optimizing conjunctive surface water and groundwater withdrawal rates under hydrogeological uncertainty.
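The ROA scheme described above can be sketched as a loop that solves a sequence of sample-average sub-problems with a growing number of realizations, warm-starting each solve at the previous optimum. The toy objective and grid-search inner solver below are illustrative stand-ins for the paper's groundwater response matrix and MATLAB Active-Set optimizer; the realizations are synthetic uniform draws, not hydrogeological data.

```python
import random

def solve_subproblem(realizations, x0):
    """Toy inner solver: pick the withdrawal rate x that maximizes the
    sample-average benefit over the given realizations. A 1-D grid
    search around the warm start x0 stands in for the Active-Set
    optimizer used in the paper."""
    def sample_avg(x):
        # min(x, r): you can only withdraw what the aquifer realization
        # yields; the quadratic term penalizes aggressive pumping.
        return sum(min(x, r) - 0.01 * x**2 for r in realizations) / len(realizations)
    grid = [x0 + 0.1 * k for k in range(-50, 51)]
    return max(grid, key=sample_avg)

random.seed(0)
pool = [random.uniform(5.0, 15.0) for _ in range(256)]  # uncertain yields

x, n = 5.0, 4
while n <= 256:                       # ROA: increasing sample sizes
    x = solve_subproblem(pool[:n], x)  # warm-start at previous optimum
    n *= 2                             # doubled here for illustration

print(round(x, 1))
```

Because early sub-problems use few realizations, they are cheap, and their solutions steer later, larger (and more expensive) sub-problems, which is the source of the computational savings the abstract reports.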

Keywords: conjunctive water management, retrospective optimization approximation approach, sample average approximation, uncertainty

Procedia PDF Downloads 231