Search results for: real anthropometric database
2930 Verification of Space System Dynamics Using the MATLAB Identification Toolbox in Space Qualification Test
Authors: Yuri V. Kim
Abstract:
This article presents a new approach to the Functional Testing of Space Systems (SS). It can be considered a generic test applicable to a wide class of SS that, from the point of view of system dynamics and control, may be described by ordinary differential equations. The suggested methodology is based on a semi-natural experimental laboratory stand that does not require complicated, precise and expensive technological control-verification equipment. However, it allows the system to be tested as a whole, fully assembled unit during Assembling, Integration and Testing (AIT) activities, involving system hardware (HW) and software (SW). The test physically activates the system inputs (sensors) and outputs (actuators) and requires recording their outputs in real time. The data are then transferred to a laboratory PC, where they are post-processed with the MATLAB/Simulink Identification Toolbox. This allows the system dynamics to be estimated experimentally, in the form of identified system differential equations, and compared with the expected mathematical model previously verified by mathematical simulation during the design process.
Keywords: system dynamics, space system ground tests and space qualification, system dynamics identification, satellite attitude control, assembling, integration and testing
Procedia PDF Downloads 164
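A minimal sketch of the post-processing step described above, assuming a simple discrete-time ARX model fitted by least squares to recorded sensor/actuator logs; this is an illustrative Python stand-in for the MATLAB/Simulink Identification Toolbox workflow, and the plant coefficients and noise level are made up:

```python
import numpy as np

def identify_arx(u, y, na=2, nb=2):
    """Least-squares estimate of an ARX model
    y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb]."""
    n = max(na, nb)
    phi = np.array([[-y[k - i] for i in range(1, na + 1)] +
                    [u[k - j] for j in range(1, nb + 1)]
                    for k in range(n, len(y))])
    theta, *_ = np.linalg.lstsq(phi, y[n:], rcond=None)
    return theta[:na], theta[na:]          # a-coefficients, b-coefficients

# Synthetic test: recover a known 2nd-order discrete plant from noisy records
rng = np.random.default_rng(0)
u = rng.standard_normal(2000)              # recorded actuator excitation
y = np.zeros_like(u)
for k in range(2, len(u)):                 # "true" plant, unknown to the test
    y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.5 * u[k - 1] + 0.25 * u[k - 2]
y += 0.01 * rng.standard_normal(len(y))    # sensor noise
a_hat, b_hat = identify_arx(u, y)
print("a:", a_hat, "b:", b_hat)            # expect roughly [-1.5, 0.7] and [0.5, 0.25]
```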
2929 The Use of Digital Stories in the Development of Critical Literacy
Authors: Victoria Zenotz
Abstract:
For Fairclough (1989), critical literacy is a tool to enable readers and writers to build up meaning in discourse. More recently, other authors (Leu et al., 2004) have included the new technology context in their definition of literacy. In their view, being literate nowadays means to "successfully use and adapt to the rapidly changing information and communication technologies and contexts that continuously emerge in our world and influence all areas of our personal and professional lives" (Leu et al., 2004: 1570). In this presentation, the concept of critical literacy will be related to the creation of digital stories. In the first part of the presentation, concepts such as literacy and critical literacy are examined. We consider that real social practices will help learners improve their literacy level. Accordingly, we show some research, which was conducted at a secondary school in the north of Spain (2013-2014), to illustrate how the "writing" of digital stories may contribute to the development of critical literacy. The use of several instruments allowed the collection of data at the different stages of the creative process, including watching and commenting on models for digital stories, planning a storyboard, creating and selecting images, adding voices and background sounds, and editing and sharing the final product. The results offer some valuable insights into learners' literacy progress.
Keywords: literacy, computer assisted language learning, ESL
Procedia PDF Downloads 402
2928 Spatial Disparity in Education and Medical Facilities: A Case Study of Barddhaman District, West Bengal, India
Authors: Amit Bhattacharyya
Abstract:
The economic scenario of any region does not show the real picture for the measurement of overall development. Therefore, economic development must be accompanied by social development to be able to make an assessment of the level of development. The spatial variation with respect to social development has been discussed, taking into account the quality of functioning of a social system in a specific area. In this paper, an attempt has been made to study the spatial distribution of social infrastructural facilities and to analyze the magnitude of regional disparities at the inter-block level in Barddhaman district. It starts with a detailed account of the selection process of social infrastructure indicators and describes the methodology employed in the empirical analysis. Analyzing the block-level data, the paper tries to identify the disparity among the blocks in the levels of social development. The results have been subsequently explained using both statistical analysis and geospatial techniques. The paper reveals that social development is not proceeding at the same rate in every part of the district. Health facilities and educational facilities are concentrated at selected points, so overall development activities come to be concentrated in a few centres, and the disparity is seen across the blocks.
Keywords: disparity, inter-block, social development, spatial variation
Procedia PDF Downloads 170
2927 Dynamic Correlations and Portfolio Optimization between Islamic and Conventional Equity Indexes: A Vine Copula-Based Approach
Authors: Imen Dhaou
Abstract:
This study examines conditional Value at Risk by applying the GJR-EVT-Copula model and finds the optimal portfolio for eight Dow Jones Islamic-conventional pairs. Our methodology consists of modeling the data by a bivariate GJR-GARCH model, from which we extract the filtered residuals, and then applying the Peak over Threshold (POT) model to fit the residual tails in order to model the marginal distributions. After that, we use pair-copulas to find the optimal portfolio risk dependence structure. Finally, with Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results show the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we found that the optimal investment focuses on Islamic-conventional US market index pairs because of their high investment proportion, whereas all other index pairs have low investment proportions. These results deliver some real repercussions for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management and the diversification advantages of these markets.
Keywords: CVaR, Dow Jones Islamic index, GJR-GARCH-EVT-pair copula, portfolio optimization
Procedia PDF Downloads 258
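As a rough numerical illustration of the final Monte Carlo step, the sketch below computes VaR and CVaR for an equally weighted two-index portfolio; it substitutes a plain Gaussian simulation for the GJR-GARCH-EVT-copula machinery, and the volatilities, correlation and drift values are assumptions, not results from the study:

```python
import numpy as np

def var_cvar(returns, alpha=0.95):
    """Value at Risk and Conditional VaR (expected shortfall), as positive losses."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)          # loss exceeded with probability 1 - alpha
    cvar = losses[losses >= var].mean()       # average loss beyond the VaR level
    return var, cvar

# Toy Monte Carlo for one equally weighted Islamic/conventional index pair
rng = np.random.default_rng(42)
vol_i, vol_c, rho = 0.012, 0.015, 0.6         # assumed daily volatilities and correlation
cov = np.array([[vol_i**2, rho * vol_i * vol_c],
                [rho * vol_i * vol_c, vol_c**2]])
sims = rng.multivariate_normal([0.0003, 0.0004], cov, size=100_000)
portfolio = sims @ np.array([0.5, 0.5])       # equally weighted pair
var95, cvar95 = var_cvar(portfolio)
print(f"95% VaR: {var95:.4%}   95% CVaR: {cvar95:.4%}")
```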
2926 Polarity Classification of Social Media Comments in Turkish
Authors: Migena Ceyhan, Zeynep Orhan, Dimitrios Karras
Abstract:
People in modern societies are continuously sharing their experiences, emotions, and thoughts in different areas of life. The information reaches almost everyone in real time and can have an important impact in shaping people's way of living. This phenomenon is very well recognized and advantageously used by market representatives trying to earn the most from this medium. Given the abundance of information, people and organizations are looking for efficient tools that filter the countless data into important information, ready to analyze. This paper is a modest contribution in this field, describing the process of automatically classifying social media comments in the Turkish language into positive or negative. Once data is gathered and preprocessed, feature sets of selected single words or groups of words are built according to the characteristics of the language used in the texts. These features are later used to train and test a system with different machine learning algorithms (Naïve Bayes, Sequential Minimal Optimization, J48, and Bayesian Linear Regression). The resultant high accuracies can be important feedback for decision-makers to improve their business strategies accordingly.
Keywords: feature selection, machine learning, natural language processing, sentiment analysis, social media reviews
Procedia PDF Downloads 148
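A minimal sketch of the word/word-group feature extraction and classification pipeline described above, using scikit-learn's Naïve Bayes on a toy Turkish corpus; the example comments, labels and n-gram settings are illustrative assumptions:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny labelled corpus; a real study would use thousands of preprocessed comments.
comments = ["harika bir ürün, çok memnun kaldım",        # "great product, very satisfied"
            "kargo çok hızlı geldi, teşekkürler",         # "shipping arrived very fast, thanks"
            "berbat bir deneyim, asla tavsiye etmem",     # "terrible experience, would never recommend"
            "ürün bozuk çıktı, para iadesi istiyorum"]    # "product arrived broken, I want a refund"
labels = ["positive", "positive", "negative", "negative"]

# Single words and word pairs as features, then a Naïve Bayes classifier
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(comments, labels)
print(model.predict(["berbat bir ürün, tavsiye etmem"]))  # -> ['negative'] on this toy corpus
```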
2925 Thermal Characterisation of Multi-Coated Lightweight Brake Rotors for Passenger Cars
Authors: Ankit Khurana
Abstract:
Sufficient heat storage capacity, or the ability to dissipate heat, is the most decisive parameter for the effective and efficient functioning of friction-based brake disc systems. The primary aim of the research was to analyse the effect of multiple coatings on lightweight disc rotor surfaces, which not only reduces vehicle mass but also augments heat transfer. This research is intended to give the automotive community a clearer view of the thermal aspects of a braking system. The results of the project indicate that, with the advent of modern coating technologies, a brake system's thermal curtailments can be removed and, together with forced convection, heat transfer processes can see a drastic improvement, leading to an increased lifetime of the brake rotor. Other advantages of modifying the surface of a lightweight rotor substrate are a reduced overall vehicle weight, a decreased risk of thermal brake failure (brake fade and fluid vaporization), longer component life, as well as lower noise and vibration. A mathematical model was constructed in MATLAB encompassing the various thermal characteristics of the proposed coatings and substrate materials, required to approximate the heat flux values in free and forced convection environments, resembling a real braking event that could then be modelled as a full-scale version of the alloy brake rotor part in ABAQUS. The finite element model of the brake rotor was built in a constrained environment such that the nodal temperatures between the contact surfaces of the coatings and substrate (wrought aluminum alloy) resemble an amalgamated solid brake rotor element. The initial results were obtained for a Plasma Electrolytic Oxidized (PEO) substrate, wherein the aluminum alloy gets a hard ceramic oxide layer grown on its transitional phase. The rotor was modelled and then evaluated in real time for a constant-'g' braking event (based upon the mathematical heat flux input and convective surroundings), which reflected the necessity to deposit a conducting (sacrificial) coat above the PEO layer in order to inhibit premature thermal degradation of the barrier coating. A Taguchi study was then used to bring out the critical factors that may influence the maximum operating temperature of a multi-coated brake disc by simulating brake tests: a) an Alpine descent lasting 50 seconds; b) an Autobahn stop lasting 3.53 seconds; c) six high-speed repeated stops in accordance with FMVSS 135, lasting 46.25 seconds. Thermal barrier coating thickness and vane heat transfer coefficient were the two most influential factors and, owing to their design and manufacturing constraints, a final optimized model was obtained which survived the six-high-speed-stop test as per the FMVSS 135 specifications. The simulation data highlighted the merits of preferring wrought aluminum alloy 7068 over grey cast iron and aluminum metal matrix composite, in coherence with the multiple coating depositions.
Keywords: lightweight brakes, surface modification, simulated braking, PEO, aluminum
Procedia PDF Downloads 411
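For orientation, the sketch below shows the kind of lumped-parameter energy balance that underlies a single-stop rotor temperature estimate (friction power in, convection out); every number is an assumed placeholder rather than a value from the study's MATLAB/ABAQUS models:

```python
# Lumped-capacitance estimate of rotor temperature during one hard stop.
# Friction power flows in, convection carries heat out; all values are assumed.
m_rotor, cp = 6.0, 900.0            # kg, J/(kg K): lightweight aluminum rotor
h, area = 80.0, 0.12                # W/(m^2 K), m^2: vane convection (assumed)
T_amb = 25.0                        # deg C
m_car, v0, decel = 1600.0, 27.8, 0.8 * 9.81     # kg, m/s (≈100 km/h), ~0.8 g stop
share = 0.7 / 2                     # fraction of braking energy into one front rotor
t_stop = v0 / decel

dt, T = 0.01, T_amb
for step in range(int(t_stop / dt)):
    v = v0 - decel * step * dt
    q_in = share * m_car * decel * v            # instantaneous friction power, W
    q_out = h * area * (T - T_amb)              # convective loss, W
    T += (q_in - q_out) * dt / (m_rotor * cp)   # lumped thermal-mass update
print(f"rotor temperature after a {t_stop:.2f} s stop: about {T:.0f} °C")
```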
2924 Visualized Flow Patterns around and inside a Two-Sided Wind-Catcher in the Presence of Upstream Structures
Authors: M. Afshin, A. Sohankar, M. Dehghan Manshadi, M. R. Daneshgar, G. R. Dehghan Kamaragi
Abstract:
In this paper, the influence of an upstream structure on the flow pattern within and around a wind-catcher is experimentally investigated by smoke flow visualization techniques. Wind-catchers are an important part of natural ventilation in residential buildings and public places such as shopping centers, libraries, etc. Wind-catchers might also be used in areas of high urban density; hence their potential to provide natural ventilation in this case depends on the presence of upstream objects. In this study, the two-sided wind-catcher model was based on a real wind-catcher observed in the city of Yazd, Iran. The present study focuses on the flow patterns inside and outside the isolated two-sided wind-catcher, and on a two-sided wind-catcher in the presence of an upstream structure. The results show that the presence of an upstream structure influences the force and direction of the airflow pattern. Placing a high upstream object reverses the airflow direction inside the wind-catcher.
Keywords: natural ventilation, smoke flow visualization, two-sided wind-catcher, flow patterns
Procedia PDF Downloads 578
2923 Routing Protocol in Ship Dynamic Positioning Based on WSN Clustering Data Fusion System
Authors: Zhou Mo, Dennis Chow
Abstract:
In the dynamic positioning system (DPS) for vessels, reliable information transmission between nodes basically relies on wireless protocols. From the perspective of cluster-based routing protocols for wireless sensor networks, a data fusion technique based on a sleep scheduling mechanism and remaining energy at the network layer is proposed. It applies the sleep scheduling mechanism to the routing protocols and considers the remaining energy and location information of nodes when selecting cluster heads. The problem of uneven distribution of nodes among clusters is solved by an equilibrium mechanism. At the same time, a classified forwarding mechanism as well as a redelivery policy are adopted to avoid congestion in the transmission of huge amounts of data, reduce the delay in data delivery and enhance real-time response. In this paper, a simulation test is conducted on the improved routing protocol, which turns out to reduce the energy consumption of nodes and increase the efficiency of data delivery.
Keywords: DPS for vessel, wireless sensor network, data fusion, routing protocols
Procedia PDF Downloads 527
2922 Model-Based Automotive Partitioning and Mapping for Embedded Multicore Systems
Authors: Robert Höttger, Lukas Krawczyk, Burkhard Igel
Abstract:
This paper introduces novel approaches to partitioning and mapping in terms of model-based embedded multicore system engineering and further discusses benefits, industrial relevance and features in common with existing approaches. In order to assess and evaluate results, both approaches have been applied to a real industrial application as well as to various prototypical demonstrative applications that have been developed and implemented for different purposes. Evaluations show that such applications improve significantly with respect to performance, energy efficiency, meeting timing constraints and maintainability by using the AMALTHEA platform and the implemented approaches. Furthermore, the model-based design provides an open, expandable, platform-independent and scalable exchange format between OEMs, suppliers and developers at different levels. Our proposed mechanisms provide meaningful multicore system utilization, since load balancing by means of partitioning and mapping is effectively performed with regard to the modeled systems, including hardware, software, operating system, scheduling, constraints, configuration and more data.
Keywords: partitioning, mapping, distributed systems, scheduling, embedded multicore systems, model-based, system analysis
Procedia PDF Downloads 623
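A toy sketch of the load-balancing idea behind mapping (not the AMALTHEA tooling itself): runnables with assumed worst-case execution times are greedily assigned to the currently least-loaded core:

```python
import heapq

def map_tasks_to_cores(runtimes, n_cores):
    """Greedy longest-task-first mapping: each task goes to the least-loaded core."""
    cores = [(0.0, core_id, []) for core_id in range(n_cores)]
    heapq.heapify(cores)
    for task, runtime in sorted(runtimes.items(), key=lambda kv: -kv[1]):
        load, core_id, assigned = heapq.heappop(cores)
        assigned.append(task)
        heapq.heappush(cores, (load + runtime, core_id, assigned))
    return sorted(cores, key=lambda c: c[1])

# Runnables with assumed worst-case execution times in microseconds
runnables = {"engine_ctrl": 420, "abs_monitor": 310, "can_rx": 150,
             "diagnostics": 90, "logging": 60, "airbag_check": 200}
for load, core, tasks in map_tasks_to_cores(runnables, n_cores=2):
    print(f"core {core}: load {load:.0f} us, tasks {tasks}")
```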
2921 Deep-Learning Coupled with Pragmatic Categorization Method to Classify the Urban Environment of the Developing World
Authors: Qianwei Cheng, A. K. M. Mahbubur Rahman, Anis Sarker, Abu Bakar Siddik Nayem, Ovi Paul, Amin Ahsan Ali, M. Ashraful Amin, Ryosuke Shibasaki, Moinul Zaber
Abstract:
Thomas Friedman, in his famous book, argued that the world in this 21st century is flat and will continue to become flatter. This is attributed to rapid globalization and the interdependence of humanity, which have engendered a tremendous in-flow of human migration towards urban spaces. In order to keep the urban environment sustainable, policy makers need to plan based on extensive analysis of the urban environment. With the advent of high-definition satellite images, high-resolution data, computational methods such as deep neural network analysis, and hardware capable of high-speed analysis, urban planning is seeing a paradigm shift. Legacy data on urban environments are now being complemented with high-volume, high-frequency data. However, the first step in understanding urban space lies in a useful categorization of the space that is usable for data collection, analysis, and visualization. In this paper, we propose a pragmatic categorization method that is readily usable for machine analysis and show the applicability of the methodology in a developing-world setting. Categorization to plan sustainable urban spaces should encompass the buildings and their surroundings. However, the state of the art is mostly dominated by classification of building structures, building types, etc., and largely represents the developed world. Hence, these methods and models are not sufficient for developing countries such as Bangladesh, where the surrounding environment is crucial for the categorization. Moreover, these categorizations propose small-scale classifications, which give limited information, have poor scalability and are slow to compute in real time. Our proposed method is divided into two steps: categorization and automation. We categorize the urban area in terms of informal and formal spaces and take the surrounding environment into account. A 50 km × 50 km Google Earth image of Dhaka, Bangladesh was visually annotated and categorized by an expert, and consequently a map was drawn. The categorization is based broadly on two dimensions: the state of urbanization and the architectural form of the urban environment. Consequently, the urban space is divided into four categories: 1) highly informal area; 2) moderately informal area; 3) moderately formal area; and 4) highly formal area. In total, sixteen sub-categories were identified. For semantic segmentation and automatic categorization, Google's DeepLabV3+ model was used. The model uses atrous convolution operations to analyze different layers of texture and shape. This allows us to enlarge the field of view of the filters to incorporate larger context. Imagery encompassing 70% of the urban space was used to train the model, and the remaining 30% was used for testing and validation. The model is able to segment with 75% accuracy and 60% mean Intersection over Union (mIoU). In this paper, we propose a pragmatic categorization method that is readily applicable for automatic use in both developing and developed world contexts. The method can be augmented for real-time socio-economic comparative analysis among cities. It can be an essential tool for policy makers to plan future sustainable urban spaces.
Keywords: semantic segmentation, urban environment, deep learning, urban building, classification
Procedia PDF Downloads 193
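The reported 60% mIoU is the standard segmentation metric; a small sketch of how it is computed from a predicted and a reference label map (the four classes and the toy data below are purely illustrative):

```python
import numpy as np

def mean_iou(pred, target, n_classes):
    """Mean Intersection over Union between a predicted and a reference label map."""
    ious = []
    for c in range(n_classes):
        pred_c, target_c = pred == c, target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union > 0:                              # skip classes absent from both maps
            ious.append(np.logical_and(pred_c, target_c).sum() / union)
    return float(np.mean(ious))

# Toy 4-class maps standing in for the "highly informal" ... "highly formal" classes
rng = np.random.default_rng(1)
reference = rng.integers(0, 4, size=(64, 64))
prediction = reference.copy()
noisy = rng.random(reference.shape) < 0.25         # corrupt about a quarter of the pixels
prediction[noisy] = rng.integers(0, 4, size=int(noisy.sum()))
print(f"mIoU: {mean_iou(prediction, reference, n_classes=4):.2f}")
```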
2920 Proprioceptive Neuromuscular Facilitation Exercises of Upper Extremities Assessment Using Microsoft Kinect Sensor and Color Marker in a Virtual Reality Environment
Authors: M. Owlia, M. H. Azarsa, M. Khabbazan, A. Mirbagheri
Abstract:
Proprioceptive neuromuscular facilitation (PNF) exercises are a series of stretching techniques that are commonly used in rehabilitation and exercise therapy. Assessment of these exercises for correct maneuvering requires extensive experience in this field and cannot be done by patients themselves. In this paper, we developed software that uses a Microsoft Kinect sensor, a spherical color marker, and real-time image processing methods to evaluate a patient's performance in generating true patterns of movement. The software also provides the patient with visual feedback by showing his/her avatar in a virtual reality environment along with the correct path of the moving hand, wrist and marker. Primary results during PNF exercise therapy of a patient in a room environment show the ability of the system to identify any deviation of the maneuvering path and direction of the hand from the one performed by an expert physician.
Keywords: image processing, Microsoft Kinect, proprioceptive neuromuscular facilitation, upper extremities assessment, virtual reality
Procedia PDF Downloads 275
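A minimal sketch of the color-marker tracking step, using OpenCV with a generic RGB camera in place of the Kinect SDK; the HSV range for the marker color is an assumption:

```python
import cv2
import numpy as np

def track_marker(frame_bgr, lower_hsv=(40, 80, 80), upper_hsv=(80, 255, 255)):
    """Pixel centroid of a colored spherical marker (green range assumed here),
    or None if the marker is not visible in the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv, np.uint8), np.array(upper_hsv, np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

# Any RGB camera stream works for the sketch; the Kinect color/depth streams would be
# read through the Kinect SDK, and the deviation from the prescribed PNF path is then
# the distance between the tracked centroid and the reference trajectory.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("marker centroid:", track_marker(frame))
cap.release()
```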
2919 Explanation and Temporality in International Relations
Authors: Alasdair Stanton
Abstract:
What makes for a good explanation? Twenty years after Wendt's important treatment of constitution and causation, non-causal explanations (sometimes referred to as 'understanding' or 'descriptive inference') have become, if not mainstream, at least accepted within International Relations. This article proceeds in two parts: firstly, it examines closely Wendt's constitutional claims and, while agreeing that there is a difference between the causal and the constitutional, rejects the view that constitutional explanations lack temporality. In fact, this author concludes that a constitutional argument is only possible if it relies upon a more foundational, causal argument. Secondly, through theoretical analysis of the constitutional argument, this research seeks to delineate temporal and non-temporal ways of explaining within International Relations. This article concludes that while constitutional explanations, like other logical arguments including comparative and counter-factual ones, are not truly non-causal explanations, they are not bound as tightly to the 'real world' as temporal arguments such as cause-effect, process tracing, or even interpretivist accounts. However, like mathematical models, non-temporal arguments should aim for empirical testability as well as internal consistency. This work aims to give clear theoretical grounding to those authors using non-temporal arguments, but also to encourage them, and their positivist critics, to engage in thoroughgoing empirical tests.
Keywords: causal explanation, constitutional understanding, empirical, temporality
Procedia PDF Downloads 197
2918 Robotic Assistance in Nursing Care: Survey on Challenges and Scenarios
Authors: Pascal Gliesche, Kathrin Seibert, Christian Kowalski, Dominik Domhoff, Max Pfingsthorn, Karin Wolf-Ostermann, Andreas Hein
Abstract:
Robotic assistance in nursing care is an increasingly important area of research and development. Facing a shortage of labor and an increasing number of people in need of care, the German Nursing Care Innovation Center (Pflegeinnovationszentrum, PIZ) aims to address these challenges from the side of technology. Little is known about nurses' experiences with existing robotic assistance systems. Especially nurses' perspectives on starting points for the development of robotic solutions that target recurring burdensome tasks in everyday nursing care are of interest. This paper presents findings focusing on robotics resulting from an explanatory mixed-methods study on nurses' experiences with and their expectations for innovative technologies in nursing care in stationary and ambulant care facilities and hospitals in Germany. Based on the findings, eight scenarios for robotic assistance are identified based on the real needs of practitioners. An initial system addressing a single use case is described to show perspectives for the use of robots in nursing care.
Keywords: robotics and automation, engineering management, engineering in medicine and biology, medical services, public health-care
Procedia PDF Downloads 156
2917 Intelligent Staff Scheduling: Optimizing the Solver with Tabu Search
Authors: Yu-Ping Chiu, Dung-Ying Lin
Abstract:
Traditional staff scheduling methods, relying on employee experience, often lead to inefficiencies and resource waste. The challenges of transferring scheduling expertise and adapting to changing labor regulations further complicate this process. Manual approaches become increasingly impractical as companies accumulate complex scheduling rules over time. This study proposes an algorithmic optimization approach to address these issues, aiming to expedite scheduling while ensuring strict compliance with labor regulations and company policies. The method focuses on generating optimal schedules that minimize weighted company objectives within a compressed timeframe. Recognizing the limitations of conventional commercial software in modeling and solving complex real-world scheduling problems efficiently, this research employs Tabu Search with both long-term and short-term memory structures. The study will present numerical results and managerial insights to demonstrate the effectiveness of this approach in achieving intelligent and efficient staff scheduling.
Keywords: intelligent memory structures, mixed integer programming, meta-heuristics, staff scheduling problem, tabu search
Procedia PDF Downloads 28
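A compact sketch of a tabu search loop with a short-term memory (tabu list) and a toy shift-assignment objective; the neighborhood, cost function and parameters are illustrative assumptions rather than the study's actual model:

```python
import random

def tabu_search(initial, neighbors, cost, n_iter=500, tabu_len=20):
    """Generic tabu search: evaluate neighbor moves, forbid recently used moves
    (short-term memory), keep the best schedule seen overall."""
    current = best = initial
    tabu = []
    for _ in range(n_iter):
        candidates = [(cost(s), move, s) for move, s in neighbors(current)
                      if move not in tabu or cost(s) < cost(best)]   # aspiration rule
        if not candidates:
            break
        c, move, current = min(candidates, key=lambda x: x[0])
        tabu = (tabu + [move])[-tabu_len:]
        if c < cost(best):
            best = current
    return best

# Toy problem: assign 7 shifts to 3 employees while balancing workload
shifts, staff = range(7), ["A", "B", "C"]
def cost(assign):                        # stand-in for weighted company objectives
    loads = [assign.count(p) for p in staff]
    return max(loads) - min(loads)
def neighbors(assign):
    for shift in shifts:
        for person in staff:
            if person != assign[shift]:
                s = list(assign); s[shift] = person
                yield (shift, person), tuple(s)

random.seed(0)
start = tuple(random.choice(staff) for _ in shifts)
print("best assignment:", tabu_search(start, neighbors, cost))
```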
2916 Fused Salt Electrolysis of Rare-Earth Materials from the Domestic Ore and Preparation of Rare-Earth Hydrogen Storage Alloys
Authors: Jeong-Hyun Yoo, Hanjung Kwon, Sung-Wook Cho
Abstract:
Fused salt electrolysis was studied to produce high-purity rare-earth metals from domestic rare-earth ore. The target metals of the fused salt electrolysis were Mm (misch metal), La, Ce, Nd, etc. Fused salt electrolysis was performed with supporting salts such as chlorides and fluorides at various temperatures and currents. The metals made by fused salt electrolysis were analyzed to identify their phase and composition using XRD and ICP. As a result, the acquired rare-earth metals were of high purity, with more than 99% purity. In addition, VIM (vacuum induction melting) was studied to make kilogram-scale rare-earth alloys for use in secondary batteries and hydrogen storage. In order to identify physicochemical properties such as phase, impurity gases, alloy composition and hydrogen storage, the alloys were investigated. The battery characteristics were also analyzed through various tests on the real production line of a battery company.
Keywords: domestic rare-earth ore, fused salt electrolysis, rare-earth materials, hydrogen storage alloy, secondary battery
Procedia PDF Downloads 535
2915 Actual Fracture Length Determination Using a Technique for Shale Fracturing Data Analysis in Real Time
Authors: M. Wigwe, M. Y Soloman, E. Pirayesh, R. Eghorieta, N. Stegent
Abstract:
The moving reference point (MRP) technique has been used in the analysis of the first three stages of two fracturing jobs. The results obtained verify the proposition that a hydraulic fracture in shale grows in spurts rather than in a continuous pattern, as originally interpreted by the Nolte-Smith technique. Rather than a continuous Mode I fracture followed by Mode II, III or IV fractures, these fracture modes could alternate throughout the pumping period. It is also shown that the Nolte-Smith time parameter plot can be very helpful in identifying the presence of natural fractures that have been intersected by the hydraulic fracture. In addition, with the aid of a fracture length-time plot generated from any fracture simulation that matches the data, the distance from the wellbore to the natural fractures, which also translates to the actual fracture length for the stage, can be determined. An algorithm for this technique is developed. This procedure was used for the first 9 minutes of the simulated frac job data. It was observed that after 7 minutes, the actual fracture length is about 150 ft, instead of the 250 ft predicted by the simulator output. This difference grows larger as the analysis proceeds.
Keywords: shale, fracturing, reservoir, simulation, frac-length, moving-reference-point
Procedia PDF Downloads 758
2914 Structure of Turbulence Flow in the Wire-Wrapped Fuel Assemblies of BREST-OD-300
Authors: Dmitry V. Fomichev, Vladimir I. Solonin
Abstract:
In this paper, an experimental and numerical study of the hydrodynamic characteristics of the air coolant flow in a test wire-wrapped assembly is presented. The test assembly has 37 rods, which are geometrically similar to the real fuel pins of the BREST-OD-300 fuel assemblies. An open-loop air test facility installed at the "Nuclear Power Plants and Installations" department of BMSTU was used to obtain the experimental data. The obtained distribution of static pressure along the height of the near-wall region of the test assembly, as well as the velocity and temperature distributions of the coolant flow in the test sections, can give us new knowledge about the mechanism of formation of the turbulent flow structure in wire-wrapped fuel assemblies. Numerical simulations of the turbulent flow have been accomplished using ANSYS Fluent 14.5. Different turbulence models have been considered, such as the standard and RNG k-ε models and the k-ω SST model. The results of the numerical simulations based on the considered turbulence models give the best agreement with the experimental data and help us to carry out a thorough analysis of the flow characteristics.
Keywords: wire-spaced fuel assembly, turbulent flow structure, computational fluid dynamics
Procedia PDF Downloads 460
2913 Integrating HOTS Activities with GeoGebra in Pre-Service Teachers' Preparation
Authors: Wajeeh Daher, Nimer Baya'a
Abstract:
High Order Thinking Skills (HOTS) are suggested today as essential for the cognitive development of students and for preparing them with real-life skills. Teachers are encouraged to use HOTS activities in the classroom to help their students develop higher order skills and deep thinking. It is therefore essential to prepare pre-service teachers to write and use HOTS activities for their students. This paper describes a model for integrating HOTS activities with GeoGebra in pre-service teachers' preparation. The model describes four aspects of HOTS activities and of working with them: activity components, the preparation procedure, strategies and processes used in writing a HOTS activity, and types of HOTS activities. In addition, the paper describes the pre-service teachers' difficulties in preparing and working with HOTS activities, as well as their perceptions regarding the use of these activities and GeoGebra in the mathematics classroom. The paper also describes the contribution of a HOTS activity to pupils' learning of mathematics, where this HOTS activity was prepared and taught by one pre-service teacher.
Keywords: high order thinking skills, HOTS activities, pre-service teachers, professional development
Procedia PDF Downloads 350
2912 Assessment of Natural Flood Management Potential of Sheffield Lakeland to Flood Risks Using GIS: A Case Study of Selected Farms on the Upper Don Catchment
Authors: Samuel Olajide Babawale, Jonathan Bridge
Abstract:
Natural Flood Management (NFM) is promoted as part of sustainable flood management (SFM) in response to climate change adaptation. Stakeholder engagement is central to this approach, and current trends are progressively moving towards a collaborative learning approach in which stakeholder participation is perceived as one of the indicators of sustainable development. Within this methodology, participation embraces a diversity of knowledge and values underpinned by a philosophy of empowerment, equity, trust, and learning. To identify barriers to NFM uptake, there is a need for a new understanding of how stakeholder participation could be enhanced to benefit individual and community resilience within SFM. This is crucial in light of climate change threats and scientific reliability concerns. In contributing to this new understanding, this research evaluated the proposed interventions at six (6) UK NFM sites in a catchment known as the Sheffield Lakeland Partnership Area with reference to the Environment Agency Working with Natural Processes (WWNP) potentials/opportunities. Three of the opportunities, namely Run-off Attenuation Potential of 1%, Run-off Attenuation Potential of 3.3% and Riparian Woodland Potential, were modeled. In all the models, the interventions, though proposed or already in place, are not in agreement with the data presented by the EA WWNP. Findings show some institutional weaknesses, which are seen to inhibit the development of adequate flood management solutions locally, with damaging implications for vulnerable communities. The gap in communication from practitioners poses a challenge to the implementation of real flood-mitigating measures that align with the lead agency's nationally accepted measures, which are identified as not feasible by the farm management officers within this context. Findings highlight a dominant top-down approach to management with very minimal indication of local interactions. Current WWNP opportunities have been termed unrealistic by the people directly involved in the daily management of the farms, with less emphasis on prevention and mitigation. The targeted approach suggested by the EA WWNP is set against adaptive flood management and community development. The study explores dimensions of participation using the self-reliance and self-help approach to develop a methodology that facilitates reflection on currently institutionalized practices and the need to reshape spaces of interaction to enable empowered and meaningful participation. Stakeholder engagement and resilience planning underpin this research. The findings of the study suggest that different agencies have different perspectives on "community participation". They also show that communities in the case study area appear to be least influential, denied a real chance of discussing their situations and influencing the decisions. This is against the background that the communities are in the most productive regions, contributing massively to national food supplies. The results are discussed with respect to practical implications for addressing interagency partnerships and conducting grassroots collaborations that empower local communities and seek solutions to sustainable development challenges.
This study takes a critical look at the challenges and progress made locally by the United Kingdom in sustainable flood risk management and adaptation to climate change, towards achieving the global 2030 Agenda for Sustainable Development.
Keywords: natural flood management, sustainable flood management, sustainable development, working with natural processes, environment agency, run-off attenuation potential, climate change
Procedia PDF Downloads 75
2911 A Low-Cost Air Quality Monitoring Internet of Things Platform
Authors: Christos Spandonidis, Stefanos Tsantilas, Elias Sedikos, Nektarios Galiatsatos, Fotios Giannopoulos, Panagiotis Papadopoulos, Nikolaos Demagos, Dimitrios Reppas, Christos Giordamlis
Abstract:
In the present paper, a low-cost, compact and modular Internet of Things (IoT) platform for air quality monitoring in urban areas is presented. This platform comprises dedicated low-cost, low-power hardware and the associated embedded software that enable measurement of particle (PM2.5 and PM10), NO, CO, CO2 and O3 concentrations in the air, along with temperature and relative humidity. This integrated platform acts as part of a greater air pollution data collecting wireless network that is able to monitor the air quality in various regions and neighborhoods of an urban area by providing sensor measurements at a high rate that reaches up to one sample per second. It is therefore suitable for Big Data analysis applications such as air quality forecasting, weather forecasting and traffic prediction. The first real-world test for the developed platform took place in Thessaloniki, Greece, where 16 devices were installed in various buildings in the city. In the near future, many more of these devices are going to be installed in the greater Thessaloniki area, giving a detailed air quality map of the city.
Keywords: distributed sensor system, environmental monitoring, Internet of Things, smart cities
Procedia PDF Downloads 148
2910 The Influence of Social Media on Gym Memberships in the UAE
Authors: Mohammad Obeidat
Abstract:
In recent years, social media has revolutionized the way businesses market their products and services. Platforms such as Instagram, Facebook, YouTube, and TikTok have become powerful tools for reaching large audiences and engaging with consumers in real time. These platforms allow businesses to create visually appealing content, interact with customers, and leverage user-generated content to enhance brand visibility and credibility. Recent statistics indicate that businesses that actively participate in social media marketing see improvements in brand visibility, customer engagement, and revenue generation. For example, several studies reveal that 70% of business-to-consumer marketers have gained customers through Facebook. This study aims to contribute to the academic literature on social media marketing and consumer behavior, specifically within the context of the fitness industry in the UAE. The findings will provide valuable insights for gym and fitness center managers, marketers, and social media strategists looking to enhance their engagement with potential customers. By understanding the impact of social media on purchasing decisions, businesses can tailor their marketing efforts to better meet consumer expectations and drive membership growth.
Keywords: social media, consumer behavior, digital native, influencer
Procedia PDF Downloads 53
2909 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet
Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel
Abstract:
Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution for sustainable sanitation with the development of an innovative toilet system, called the Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. This technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. This study has determined the factors contributing most to the environmental footprint of the NMT system. However, as the sensitivity analysis has identified certain operating parameters as critical for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOx emissions, amount of ash and transportation of fertilizer. The analysis has provided the distributions and the confidence intervals of the selected impact categories, and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of the NMT system. Last but not least, this study also yields essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
Keywords: sanitation systems, nano-membrane toilet, LCA, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network
Procedia PDF Downloads 228
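A bare-bones sketch of the Monte Carlo idea: sample uncertain inventory inputs from assumed distributions, push each sample through a (here heavily simplified, linear) characterisation model, and report the resulting distribution as a mean and confidence interval; none of the distributions or factors below come from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000                                        # Monte Carlo draws

# Assumed input distributions for a few illustrative inventory parameters
electricity_kwh = rng.normal(0.8, 0.1, n)         # electricity produced per functional unit
nox_g = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)    # NOx emissions, g
ash_kg = rng.triangular(0.05, 0.08, 0.12, n)                  # ash to be transported, kg
transport_km = rng.uniform(5, 25, n)                          # fertilizer transport distance

# Heavily simplified linear characterisation model; the factors are placeholders
impact = (-0.5 * electricity_kwh                  # credit for recovered energy
          + 0.9 * nox_g
          + 0.2 * ash_kg * transport_km / 10)

lo, hi = np.percentile(impact, [2.5, 97.5])
print(f"impact score: mean {impact.mean():.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```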
2908 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus
Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert
Abstract:
Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase to the planning and execution phases and on to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings don't have a BIM model. Creating a compatible BIM for existing buildings is very challenging. It requires special equipment for data capturing and effort to convert these data into a BIM model. The main difficulties for such projects are to define the data needed, the level of development (LOD), and the methodology to be adopted. In addition to managing information for an existing building, studying the impact of the built environment is a challenging topic. So, integrating the existing terrain that surrounds buildings into the digital model is essential to be able to run several simulations, such as flood simulation, energy simulation, etc. Making a replication of the physical model and updating its information in real time to make its Digital Twin (DT) is very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points based on a reference surface (e.g., mean sea level, geoid, or ellipsoid). In addition, information related to the type of pavement materials, the types and heights of vegetation, and damaged surfaces can be integrated. Our aim in this study is to define the methodology to be used in order to provide a 3D BIM model for the site and the existing buildings, based on the case study of the "Ecole Spéciale des Travaux Publiques (ESTP Paris)" school of engineering campus. The property is located on a hilly site of 5 hectares and is composed of more than 20 buildings with a total area of 32,000 square meters and a height between 50 and 68 meters. In this work, the campus precise levelling grid according to the NGF-IGN69 altimetric system and the grid control points are computed according to the RGF93 (Réseau Géodésique Français) – Lambert 93 French system with different methods: (i) land topographic surveying using a robotic total station; (ii) a GNSS (Global Navigation Satellite System) levelling grid in NRTK (Network Real Time Kinematic) mode; (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters such as boundary limits, the number of floors, the georeferencing of the floors, the georeferencing of the 4 base corners of each building, etc. Once the entry data are identified, the digital model of each building is created. The DTM is also modeled. The process of altimetric determination is complex and requires effort in order to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC) and ReViT (RVT) will be generated. Checking the interoperability between BIM models is very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.
Keywords: building information modeling, digital terrain model, existing buildings, interoperability
Procedia PDF Downloads 115
2907 Estimating Algae Concentration Based on Deep Learning from Satellite Observation in Korea
Authors: Heewon Jeong, Seongpyo Kim, Joon Ha Kim
Abstract:
Over the last few decades, the coastal regions of Korea have experienced red tide algal blooms, which are harmful and toxic to both humans and marine organisms. These blooms have been accelerated by eutrophication due to human activities, certain oceanic processes, and climate change. Previous studies have tried to monitor and predict ocean algae concentrations with bio-optical algorithms applied to satellite color images. However, the accurate estimation of algal blooms remains challenging because of the complexity of coastal waters. Therefore, this study suggests a new method to identify the concentration of red tide algal blooms from images of the Geostationary Ocean Color Imager (GOCI), which represent the water environment of the seas around Korea. The method employed GOCI images, which capture the water-leaving radiances centered at 443 nm, 490 nm and 660 nm, as well as observed weather data (i.e., humidity, temperature and atmospheric pressure), as the database used to exploit the optical characteristics of algae and train the deep learning algorithm. A convolutional neural network (CNN) was used to extract the significant features from the images, and an artificial neural network (ANN) was then used to estimate the concentration of algae from the extracted features. For training the deep learning model, a backpropagation learning strategy was developed. The established methods were tested and compared with the performance of the GOCI data processing system (GDPS), which is based on standard image processing and optical algorithms. The model had better performance in estimating algae concentration than the GDPS, which cannot estimate concentrations greater than 5 mg/m³. Thus, the deep learning model was trained successfully to assess algae concentration in spite of the complexity of the water environment. Furthermore, the results of this system and methodology can be used to improve the performance of remote sensing. Acknowledgement: This work was supported by the 'Climate Technology Development and Application' research project (#K07731) through a grant provided by GIST in 2017.
Keywords: deep learning, algae concentration, remote sensing, satellite
Procedia PDF Downloads 186
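A minimal PyTorch sketch of the CNN-feature-extraction plus ANN-regression idea described above, run on a synthetic batch; the layer sizes, patch size and weather inputs are illustrative assumptions, not the authors' architecture:

```python
import torch
import torch.nn as nn

class AlgaeRegressor(nn.Module):
    """CNN feature extractor over multi-band image patches followed by a small
    fully connected network that also takes scalar weather inputs."""
    def __init__(self, n_bands=3, n_weather=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.head = nn.Sequential(
            nn.Linear(32 + n_weather, 64), nn.ReLU(),
            nn.Linear(64, 1))                     # predicted concentration, mg/m^3

    def forward(self, patch, weather):
        return self.head(torch.cat([self.features(patch), weather], dim=1))

# Synthetic batch: 8 patches of 3 radiance bands (443/490/660 nm) plus weather data
model = AlgaeRegressor()
patches = torch.randn(8, 3, 32, 32)
weather = torch.randn(8, 3)                       # humidity, temperature, pressure
loss = nn.functional.mse_loss(model(patches, weather).squeeze(1),
                              torch.rand(8) * 10)
loss.backward()                                   # one backpropagation training step
print("training loss on synthetic batch:", float(loss))
```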
2906 Emotion Oriented Students' Opinioned Topic Detection for Course Reviews in Massive Open Online Course
Authors: Zhi Liu, Xian Peng, Monika Domanska, Lingyun Kang, Sannyuya Liu
Abstract:
Massive open education has become increasingly popular among learners worldwide. An increasing number of course reviews are being generated on Massive Open Online Course (MOOC) platforms, which offer an interactive feedback channel for learners to express opinions and feelings about their learning. These reviews typically contain subjective emotion and topic information towards the courses. However, it is time-consuming to detect these opinions manually. In this paper, we propose an emotion-oriented topic detection model to automatically detect the aspects on which students express opinions in course reviews. The known overall emotion orientation and the emotional words in each review are used to guide the joint probabilistic modeling of emotions and aspects in reviews. Through experiments on real-life review data, it is verified that the course-emotion-aspect distribution can be calculated to capture the most significant opinioned topics in each course unit. This proposed technique helps in conducting intelligent learning analytics for teachers to improve pedagogy and for developers to improve user experience.
Keywords: Massive Open Online Course (MOOC), course reviews, topic model, emotion recognition, topical aspects
Procedia PDF Downloads 264
2905 Extension of Moral Agency to Artificial Agents
Authors: Sofia Quaglia, Carmine Di Martino, Brendan Tierney
Abstract:
Artificial Intelligence (A.I.) pervades various aspects of modern life, from the machine learning algorithms predicting stocks on Wall Street to the killing of belligerents and innocents alike on the battlefield. Moreover, the end goal is to create autonomous A.I.; this means that humans will be absent from the decision-making process. The question comes naturally: when an A.I. does something wrong, when its behavior is harmful to the community and its actions go against the law, who is to be held responsible? This research's subject matter in A.I. and robot ethics focuses mainly on robot rights, and its ultimate objective is to answer the questions: (i) What is the function of rights? (ii) Who is a right holder, what is personhood and what are the requirements needed to be a moral agent (and therefore accountable for responsibility)? (iii) Can an A.I. be a moral agent? (ontological requirements) and finally (iv) ought it to be one? (ethical implications). In order to answer these questions, this research project was carried out via a collaboration between the School of Computer Science at the Technical University of Dublin, which oversaw the technical aspects of this work, and the Department of Philosophy at the University of Milan, which supervised the philosophical framework and argumentation of the project. Firstly, it was found that all rights are positive and based on consensus; they change over time based on circumstances. Their function is to protect the social fabric and avoid dangerous situations. The same goes for the requirements considered necessary to be a moral agent: those are not absolute; in fact, they are constantly redesigned. Hence, the next logical step was to identify which requirements are regarded as fundamental in real-world judicial systems, comparing them to those used in philosophy. Autonomy, free will, intentionality, consciousness and responsibility were identified as the requirements for being considered a moral agent. The work went on to build a symmetrical system between personhood and A.I. to enable the emergence of the ontological differences between the two. Each requirement is introduced, explained through the most relevant theories of contemporary philosophy, and observed in its manifestation in A.I. Finally, after completing the philosophical and technical analysis, conclusions were drawn. As underlined in the research questions, there are two issues regarding the assignment of moral agency to artificial agents: the first is whether all the ontological requirements are present, and the second is whether, present or not, an A.I. ought to be considered an artificial moral agent. From an ontological point of view, it is very hard to prove that an A.I. could be autonomous, free, intentional, conscious, and responsible. The philosophical accounts are often very theoretical and inconclusive, making it difficult to fully detect these requirements at an experimental level of demonstration. However, from an ethical point of view, it makes sense to consider some A.I. as artificial moral agents, hence responsible for their own actions. When artificial agents are considered responsible, already existing norms in our judicial system can be applied, such as removing them from society and re-educating them, in order to re-introduce them to society. This is in line with how the highest-profile correctional facilities ought to work. Noticeably, this is a provisional conclusion, and research must continue further.
Nevertheless, the strength of the presented argument lies in its immediate applicability to real-world scenarios. To refer to the aforementioned incidents involving the killing of innocents, when this thesis is applied, it is possible to hold an A.I. accountable and responsible for its actions. This implies removing it from society by virtue of its unusability, re-programming it and, only when it is properly functioning, re-introducing it successfully.
Keywords: artificial agency, correctional system, ethics, natural agency, responsibility
Procedia PDF Downloads 190
2904 Evaluating the Effects of Gas Injection on Enhanced Gas-Condensate Recovery and Reservoir Pressure Maintenance
Authors: F. S. Alavi, D. Mowla, F. Esmaeilzadeh
Abstract:
In this paper, the Eclipse 300 simulator was used to perform compositional modeling of the gas injection process for enhanced condensate recovery of a real gas condensate well in the south of Iran, here referred to as SA4. Some experimental data were used to tune the Peng-Robinson equation of state for this case. Different scenarios of gas injection at the current reservoir pressure and at the abandonment reservoir pressure were considered with different gas compositions. Methane, carbon dioxide, nitrogen and two other gases with specified compositions were considered as potential injection gases. According to the obtained results, nitrogen leads to the highest pressure maintenance in the reservoir, but methane results in the highest condensate recovery among the selected injection gases. At low injection rates, the condensate recovery percentage is strongly affected by the gas injection rate, but this dependency vanishes at high injection rates. Condensate recovery is higher in all cases of injection at the current reservoir pressure than at the abandonment pressure. Using a constant injection rate, increasing the production well bottom-hole pressure increases the condensate recovery percentage and the time to gas breakthrough.
Keywords: gas-condensate reservoir, case-study, compositional modelling, enhanced condensate recovery, gas injection
Procedia PDF Downloads 200
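As a small illustration of the equation-of-state building block behind such compositional models, the sketch below solves the Peng-Robinson cubic for the compressibility factor of a single pure component (methane, with textbook critical constants); the full multicomponent, tuned EOS work is handled inside Eclipse 300, so this is only a conceptual aid:

```python
import numpy as np

R = 8.314  # J/(mol K)

def pr_z_factor(T, P, Tc, Pc, omega):
    """Compressibility factor from the Peng-Robinson EOS for a pure component
    (vapor root). T, Tc in K; P, Pc in Pa."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T)**2, b * P / (R * T)
    # Cubic in Z: Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1 - B), A - 3 * B**2 - 2 * B, -(A * B - B**2 - B**3)]
    roots = np.roots(coeffs)
    real = roots[np.abs(roots.imag) < 1e-8].real
    return real.max()                      # largest real root = vapor phase

# Methane at assumed reservoir-like conditions (Tc = 190.6 K, Pc = 4.599 MPa, omega = 0.011)
print("Z(CH4, 373 K, 30 MPa) ≈", round(pr_z_factor(373.15, 30e6, 190.6, 4.599e6, 0.011), 3))
```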
2903 Using Computational Fluid Dynamics to Model and Design a Preventative Application for Strong Wind
Authors: Ming-Hwi Yao, Su-Szu Yang
Abstract:
Typhoons are one of the major types of disasters that affect Taiwan each year and that cause severe damage to agriculture. Indeed, the damage exacted during a typical typhoon season can be up to $1 billion, and is responsible for nearly 75% of yearly agricultural losses. However, there is no consensus on how to reduce the damage caused by the strong winds and heavy precipitation engendered by typhoons. One suggestion is the use of windbreak nets, which are a low-cost and easy-to-use disaster mitigation strategy for crop production. In the present study, we conducted an evaluation to determine the optimal conditions of a windbreak net by using a computational fluid dynamics (CFD) model. This model may be used as a reference for crop protection. The results showed that CFD simulation validated windbreak nets of different mesh sizes and heights in the experimental area; thus, CFD is an efficient tool for evaluating the effectiveness of windbreak nets. Specifically, the effective wind protection length and height were found to be 6 and 1.3 times the length and height of the windbreak net, respectively. During a real typhoon, maximum wind gusts of 18 m s-1 can be reduced to 4 m s-1 by using a windbreak net that has a 70% blocking rate. In short, windbreak nets are significantly effective in protecting typhoon-affected areas.
Keywords: computational fluid dynamics, disaster, typhoon, windbreak net
Procedia PDF Downloads 193
2902 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies it to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as static thresholds and other deviation-based detection techniques. The experimental results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. The proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations.
Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
Procedia PDF Downloads 204
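A toy sketch of the ensemble idea: two simple deviation-based detectors (a global z-score and deviation from an exponentially weighted moving average) vote on each point of a synthetic latency series; the thresholds, voting rule and data are assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def ensemble_anomalies(series, z_thresh=3.0, ewma_alpha=0.1, ewma_thresh=3.0, votes=2):
    """Flag indices where at least `votes` detectors agree: (1) a z-score detector
    and (2) a deviation-from-EWMA detector."""
    x = np.asarray(series, dtype=float)
    z_flags = np.abs(x - x.mean()) / (x.std() + 1e-9) > z_thresh

    ewma, resid = x[0], []
    for v in x:
        resid.append(v - ewma)
        ewma = ewma_alpha * v + (1 - ewma_alpha) * ewma
    resid = np.asarray(resid)
    e_flags = np.abs(resid) > ewma_thresh * (resid.std() + 1e-9)

    return np.where(z_flags.astype(int) + e_flags.astype(int) >= votes)[0]

# Synthetic response-time metric with an injected latency spike at index 500
rng = np.random.default_rng(3)
latency = rng.normal(120, 5, 1000)      # ms
latency[500] += 80                      # unexpected performance variance
print("anomalous indices:", ensemble_anomalies(latency))
```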
2901 Reinforcement Learning for Quality-Oriented Production Process Parameter Optimization Based on Predictive Models
Authors: Akshay Paranjape, Nils Plettenberg, Robert Schmitt
Abstract:
Producing faulty products can be costly for manufacturing companies and wastes resources. To reduce scrap rates in manufacturing, process parameters can be optimized using machine learning. Thus far, research has mainly focused on optimizing specific processes using traditional algorithms. To develop a framework that enables real-time optimization based on a predictive model for an arbitrary production process, this study explores the application of reinforcement learning (RL) in this field. Based on a thorough review of the literature on RL and process parameter optimization, a model based on maximum a posteriori policy optimization that can handle both numerical and categorical parameters is proposed. A case study compares the model to state-of-the-art traditional algorithms and shows that RL can find optima of similar quality while requiring significantly less time. These results are confirmed in a large-scale validation study on data sets from both production and other fields. Finally, multiple ways to improve the model are discussed.
Keywords: reinforcement learning, production process optimization, evolutionary algorithms, policy optimization, actor critic approach
Procedia PDF Downloads 101
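To make the setup of the reinforcement-learning abstract above concrete, the sketch below wraps a stand-in predictive quality model in a minimal environment and optimizes the process parameters with a simple cross-entropy search, used here only as a placeholder for the maximum a posteriori policy optimization agent; the surrogate model, parameter bounds and optimum are invented for illustration:

```python
import numpy as np

class ProcessEnv:
    """Single-step environment: the agent proposes process parameters and the
    reward is the negative scrap rate predicted by a surrogate quality model."""
    def __init__(self, quality_model, bounds):
        self.quality_model = quality_model
        self.bounds = np.asarray(bounds, dtype=float)

    def step(self, action):
        params = np.clip(action, self.bounds[:, 0], self.bounds[:, 1])
        return params, -self.quality_model(params)     # lower scrap rate = higher reward

# Stand-in predictive model: scrap rate is lowest at temperature 210, pressure 55
def surrogate(p):
    temp, pressure = p
    return 0.02 + 0.001 * (temp - 210.0) ** 2 + 0.004 * (pressure - 55.0) ** 2

env = ProcessEnv(surrogate, bounds=[(180, 240), (30, 80)])

# Cross-entropy search used here as a simple stand-in for the MPO agent
rng = np.random.default_rng(0)
mu, sigma = np.array([190.0, 70.0]), np.array([15.0, 12.0])
for _ in range(30):
    samples = rng.normal(mu, sigma, size=(64, 2))
    rewards = np.array([env.step(s)[1] for s in samples])
    elite = samples[np.argsort(rewards)[-8:]]           # keep the best proposals
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-3
print("recommended parameters (temperature, pressure):", np.round(mu, 2))
```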