Search results for: coherent covering location problem
7416 Comparison of Finite Difference Schemes for Numerical Study of Ripa Model
Authors: Sidrah Ahmed
Abstract:
River and lake flows are modeled mathematically by the shallow water equations, which are depth-averaged Reynolds-averaged Navier-Stokes equations under the Boussinesq approximation. Temperature stratification dynamics influence water quality and mixing characteristics, mainly owing to atmospheric conditions including air temperature, wind velocity, and radiative forcing. Experimental observations are commonly taken along vertical scales and are not sufficient to estimate the small-scale turbulence effects that temperature variations induce in shallow flows. Wind shear stress over the water surface also influences flow patterns, heat fluxes, and the thermodynamics of water bodies. Hence it is crucial to couple temperature gradients with the shallow water model to estimate atmospheric effects on flow patterns. The Ripa system was introduced to study ocean currents as a variant of the shallow water equations with the addition of temperature variations within the flow. The Ripa model is a hyperbolic system of partial differential equations because all the eigenvalues of the system's Jacobian matrix are real and distinct; the time steps of a numerical scheme are estimated from these eigenvalues. The solution to the Riemann problem of the Ripa model is composed of shocks, contact discontinuities, and rarefaction waves. Solving the Ripa model with Riemann initial data using central schemes is difficult due to the eigenstructure of the system. This work presents a comparison of four finite difference schemes for the numerical solution of the Riemann problem for the Ripa model: the Lax-Friedrichs scheme, the Lax-Wendroff scheme, the MacCormack scheme, and a higher-order finite difference scheme with the WENO method. The numerical flux functions in both dimensions are approximated according to these methods, and temporal accuracy is achieved by employing a TVD Runge-Kutta method. Numerical tests are presented to examine the accuracy and robustness of the applied methods. It is revealed that the Lax-Friedrichs scheme produces oscillatory results, while the Lax-Wendroff and higher-order difference schemes produce considerably better results.
Keywords: finite difference schemes, Riemann problem, shallow water equations, temperature gradients
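As a concrete illustration of the first of the compared schemes, the following minimal Python sketch applies the Lax-Friedrichs method to the one-dimensional Ripa system (h, hu, hθ) with dam-break-type Riemann initial data; the grid size, CFL number, and initial states are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

g = 9.81                                    # gravitational acceleration
N, cfl, t_end = 400, 0.45, 0.1
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]

# Riemann (dam-break type) initial data for depth h, velocity u, temperature theta
h  = np.where(x < 0.5, 2.0, 1.0)
u  = np.zeros(N)
th = np.where(x < 0.5, 1.5, 1.0)
U = np.vstack([h, h * u, h * th])           # conserved variables [h, hu, h*theta]

def flux(U):
    h, hu, hth = U
    u, th = hu / h, hth / h
    return np.vstack([hu, hu * u + 0.5 * g * th * h**2, hu * th])

t = 0.0
while t < t_end:
    h, hu, hth = U
    c = np.abs(hu / h) + np.sqrt(g * hth)   # wave speed |u| + sqrt(g*h*theta)
    dt = min(cfl * dx / c.max(), t_end - t)
    F = flux(U)
    # Lax-Friedrichs update on interior cells
    U_new = U.copy()
    U_new[:, 1:-1] = 0.5 * (U[:, :-2] + U[:, 2:]) \
                     - 0.5 * dt / dx * (F[:, 2:] - F[:, :-2])
    U_new[:, 0], U_new[:, -1] = U_new[:, 1], U_new[:, -2]   # outflow boundaries
    U = U_new
    t += dt

print("depth range after the dam break:", U[0].min(), U[0].max())
```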
Procedia PDF Downloads 208
7415 Micro-Texture Effect on Fracture Location in Carbon Steel during Forming
Authors: Sarra Khelifi, Youcef Guerabli, Ahcene Boumaiza
Abstract:
Advances in techniques for measuring individual crystallographic orientations have made it possible to investigate the role of local crystallography during the plastic deformation of materials. In this study, the change in crystallographic orientation distribution during deformation by deep drawing of carbon steel has been investigated in order to understand its role in the propagation and arrest of cracks. The results show that the change of grain orientation from the initial recrystallization texture component {111}<112> to the deformation orientation {111}<110> promotes the initiation and propagation of cracks in the region of small {111}<112> grains. Moreover, the misorientation profile and local orientation are analyzed in detail to discuss the change from {111}<112> to {111}<110>. The deformation of grains with the {111}<110> orientation is discussed in terms of crack arrest in carbon steel during drawing. The SEM-EBSD technique was used to reveal the change of orientation, and XRD was performed to characterize the global evolution of texture in the deformed samples.
Keywords: fracture, heterogeneity, misorientation profile, stored energy
Procedia PDF Downloads 204
7414 Unfolding Global Biodiversity Patterns of Marine Planktonic Diatom Communities across the World's Oceans
Authors: Shruti Malviya, Chris Bowler
Abstract:
Analysis of microbial eukaryotic diversity is fundamental to understanding ecosystem structure, biology, and ecology. Diatoms (Stramenopiles, Bacillariophyceae) are one of the most diverse and ecologically prominent groups of phytoplankton. This study was performed to enhance the understanding of the global biodiversity patterns and structure of planktonic diatom communities across the world's oceans. We used the metabarcoding data set generated from the biological samples and associated environmental data collected during the Tara Oceans (2009-2013) global circumnavigation covering all major oceanic provinces. A total of ~18 million diatom V9-18S rDNA tags from 126 sampling stations, constituting 631 size-fractionated plankton communities, were generated. Using ~250,000 unique diatom metabarcodes, the global diatom distribution and diversity across size classes, genera, and ecological niches were assessed. Notably, our analysis revealed: (i) a new estimate of the total number of planktonic diatom species, (ii) considerable unknown diversity and exceptionally high diversity in the open ocean, and (iii) complex diversity patterns across oceanic provinces. Also, the co-occurrence of several ribotypes in locations separated by great geographic distances (equatorial stations) demonstrated a widespread but not ubiquitous distribution. This work provides a comprehensive perspective on diatom distribution and diversity in the world's oceans and elaborates the interconnections between associated theories and underlying drivers. It shows how metabarcoding approaches can provide a framework to investigate environmental diversity at a global scale, which is deemed an essential step in answering various ecological research questions. Consequently, this work also provides a reference point to explore how microbial communities will respond to environmental conditions.
Keywords: diatoms, Tara Oceans, biodiversity, metabarcoding
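To make the diversity analysis concrete, a minimal sketch of how a per-station Shannon diversity index could be computed from ribotype (metabarcode) read counts is shown below; the toy counts are invented for illustration and are not Tara Oceans data.

```python
import numpy as np

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) computed from raw read counts."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]            # ignore absent ribotypes
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

# Toy example: read counts of diatom ribotypes at two hypothetical stations
station_a = [1200, 800, 450, 90, 10]
station_b = [2400, 30, 20, 5, 1]
print("H' station A:", round(shannon_index(station_a), 3))
print("H' station B:", round(shannon_index(station_b), 3))
```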
Procedia PDF Downloads 157
7413 Evaluation of Broiler Parent Breeds under Libyan Conditions
Authors: Salem A. Abdalla Bozrayda, Abulgasem M. Hubara
Abstract:
The use of commercial poultry breeds in Libya may result in large economic losses because genotypes selected in temperate climates may respond differently to other climatic conditions and management. Therefore, three commercial breeds (Hypeco, Avian, and Shaver) were evaluated in two regions. The data were obtained from weekly records of three parental flocks for each breed in the Ghout El-sultan and Tawargha regions. Feed Hen Housed (FHH), Hen Housed Egg Production (HHEP), and mortality % were the studied traits. The statistical model included location, year, month, age, and breed. Hypeco produced more HHEP (68.6) with less FHH (22.9 kg) but with higher mortality (8.5%) than the Avian and Shaver breeds. The breeds exhibited different responses to the different months in Libya. In conclusion, the differences exhibited between the breeds in the studied traits indicate that genotype x environment interaction must be considered when selecting a breed to perform under Libyan conditions.
Keywords: Hypeco Avian Shaver, feed hen housed, hen housed egg production, mortality, Libya
Procedia PDF Downloads 294
7412 User Experience Evaluation on the Usage of Commuter Line Train Ticket Vending Machine
Authors: Faishal Muhammad, Erlinda Muslim, Nadia Faradilla, Sayidul Fikri
Abstract:
To deal with the increasing demand for mass transportation, PT. Kereta Commuter Jabodetabek (KCJ) implements the Commuter Vending Machine (C-VIM) as a solution. Against that background, the C-VIM is implemented as a substitute for the conventional ticket windows, with the purposes of making the transaction process more efficient and of introducing self-service technology to commuter line users. However, this implementation causes problems and long queues when users are not accustomed to using the machine. The objective of this research is to evaluate the user experience of the commuter vending machine. The goal is to analyze the existing user experience problems and to achieve a better user experience design. The evaluation was done by giving task scenarios according to the features offered by the machine: daily insured ticket sales, ticket refund, and multi-trip card top-up. Twenty people, separated into two groups of respondents (experienced and inexperienced users), were involved in this research, each group consisting of 5 males and 5 females, in order to establish whether there is a significant difference between both groups in the measurements. The user experience was measured by both quantitative and qualitative methods. The quantitative measurement includes user performance metrics such as task success, time on task, error, efficiency, and learnability. The qualitative measurement includes the System Usability Scale questionnaire (SUS), the Questionnaire for User Interface Satisfaction (QUIS), and retrospective think-aloud (RTA). The usability performance metrics show that 4 out of 5 indicators are significantly different between the two groups, which shows that the inexperienced group had problems when using the C-VIM. The conventional ticket windows also show better usability performance metrics compared to the C-VIM. From the data processing, the experienced group gave a SUS score of 62, with an acceptability scale of 'marginal low', a grade scale of 'D', and an adjective rating of 'good', while the inexperienced group gave a SUS score of 51, with an acceptability scale of 'marginal low', a grade scale of 'F', and an adjective rating of 'ok'. This shows that both groups gave a low score on the System Usability Scale. The QUIS score of the experienced group is 69.18 and that of the inexperienced group is 64.20; an average QUIS score below 70 indicates a problem with the user interface. RTA was done to obtain user experience issues with the C-VIM through interview protocols. The issues obtained were then sorted using the Pareto concept and diagram. The solution proposed in this research is an interface redesign using an activity relationship chart. This method resulted in a better interface with an average SUS score of 72.25, an acceptability scale of 'acceptable', a grade scale of 'B', and an adjective rating of 'excellent'. The time-on-task indicator of the performance metrics also shows significantly better times with the new interface design. The results of this study show that the C-VIM does not yet have good performance and user experience.
Keywords: activity relationship chart, commuter line vending machine, system usability scale, usability performance metrics, user experience evaluation
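For reference, the SUS scores reported above follow the standard System Usability Scale formula (odd items contribute response − 1, even items contribute 5 − response, and the sum is multiplied by 2.5); the short sketch below, with made-up questionnaire answers, shows that calculation.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)   # odd vs. even items
    return total * 2.5

# Hypothetical answers from one respondent
print(sus_score([4, 2, 4, 2, 3, 2, 4, 2, 4, 3]))      # -> 70.0
```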
Procedia PDF Downloads 265
7411 How Is a Machine-Translated Literary Text Organized in Coherence? An Analysis Based upon Theme-Rheme Structure
Abstract:
With the ultimate goal to automatically generate translated texts with high quality, machine translation has made tremendous improvements. However, its translations of literary works are still plagued with problems in coherence, esp. the translation between distant language pairs. One of the causes of the problems is probably the lack of linguistic knowledge to be incorporated into the training of machine translation systems. In order to enable readers to better understand the problems of machine translation in coherence, to seek out the potential knowledge to be incorporated, and thus to improve the quality of machine translation products, this study applies Theme-Rheme structure to examine how a machine-translated literary text is organized and developed in terms of coherence. Theme-Rheme structure in Systemic Functional Linguistics is a useful tool for analysis of textual coherence. Theme is the departure point of a clause and Rheme is the rest of the clause. In a text, as Themes and Rhemes may be connected with each other in meaning, they form thematic and rhematic progressions throughout the text. Based on this structure, we can look into how a text is organized and developed in terms of coherence. Methodologically, we chose Chinese and English as the language pair to be studied. Specifically, we built a comparable corpus with two modes of English translations, viz. machine translation (MT) and human translation (HT) of one Chinese literary source text. The translated texts were annotated with Themes, Rhemes and their progressions throughout the texts. The annotated texts were analyzed from two respects, the different types of Themes functioning differently in achieving coherence, and the different types of thematic and rhematic progressions functioning differently in constructing texts. By analyzing and contrasting the two modes of translations, it is found that compared with the HT, 1) the MT features “pseudo-coherence”, with lots of ill-connected fragments of information using “and”; 2) the MT system produces a static and less interconnected text that reads like a list; these two points, in turn, lead to the less coherent organization and development of the MT than that of the HT; 3) novel to traditional and previous studies, Rhemes do contribute to textual connection and coherence though less than Themes do and thus are worthy of notice in further studies. Hence, the findings suggest that Theme-Rheme structure be applied to measuring and assessing the coherence of machine translation, to being incorporated into the training of the machine translation system, and Rheme be taken into account when studying the textual coherence of both MT and HT.Keywords: coherence, corpus-based, literary translation, machine translation, Theme-Rheme structure
Procedia PDF Downloads 211
7410 Return to Work after a Mental Health Problem: Analysis of Two Different Management Models
Authors: Lucie Cote, Sonia McFadden
Abstract:
Mental health problems in the workplace are currently one of the main causes of absences. Research work has highlighted the importance of a collaborative process involving the stakeholders in the return-to-work process and has established the best management practices to ensure a successful return-to-work. However, very few studies have specifically explored the combination of various management models and determined whether they could satisfy the needs of the stakeholders. The objective of this study is to analyze two models for managing the return to work: the ‘medical-administrative’ and the ‘support of the worker’ in order to understand the actions and actors involved in these models. The study also aims to explore whether these models meet the needs of the actors involved in the management of the return to work. A qualitative case study was conducted in a Canadian federal organization. An abundant internal documentation and semi-directed interviews with six managers, six workers and four human resources professionals involved in the management of records of employees returning to work after a mental health problem resulted in a complete picture of the return to work management practices used in this organization. The triangulation of this data facilitated the examination of the benefits and limitations of each approach. The results suggest that the actions of management for employee return to work from both models of management ‘support of the worker’ and ‘medical-administrative’ are compatible and can meet the needs of the actors involved in the return to work. More research is needed to develop a structured model integrating best practices of the two approaches to ensure the success of the return to work.Keywords: return to work, mental health, management models, organizations
Procedia PDF Downloads 215
7409 Optimal Design of Reference Node Placement for Wireless Indoor Positioning Systems in Multi-Floor Building
Authors: Kittipob Kondee, Chutima Prommak
Abstract:
In this paper, we propose an optimization technique that can be used to optimize the placement of reference nodes and improve location determination performance in multi-floor buildings. The proposed technique is based on the Simulated Annealing (SA) algorithm and is called MSMR-M. The performance study in this work is based on simulation. We compare other node-placement techniques found in the literature with the optimal node-placement solutions obtained from our optimization. The results show that using the optimal node placement obtained by our proposed technique can improve the positioning error distances by up to 20% compared with those of the other techniques. The proposed technique can provide an average error distance within 1.42 meters.
Keywords: indoor positioning system, optimization system design, multi-floor building, wireless sensor networks
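The abstract does not give the details of MSMR-M, so the sketch below is only a generic simulated annealing loop for placing reference nodes on a single floor; the objective used here (mean distance from evaluation points to their nearest reference node, a rough proxy for positioning error) and all parameters are assumptions for illustration, not the authors' formulation.

```python
import math
import random

random.seed(0)
FLOOR = (30.0, 20.0)        # floor dimensions in metres (assumed)
N_NODES = 4                 # number of reference nodes to place
GRID = [(x, y) for x in range(0, 31, 2) for y in range(0, 21, 2)]  # test points

def cost(nodes):
    """Proxy objective: mean distance from each test point to its nearest node."""
    return sum(min(math.dist(p, n) for n in nodes) for p in GRID) / len(GRID)

def neighbour(nodes):
    """Perturb one randomly chosen node by a small random step within the floor."""
    new = list(nodes)
    i = random.randrange(len(new))
    x, y = new[i]
    new[i] = (min(max(x + random.uniform(-2, 2), 0), FLOOR[0]),
              min(max(y + random.uniform(-2, 2), 0), FLOOR[1]))
    return new

current = [(random.uniform(0, FLOOR[0]), random.uniform(0, FLOOR[1]))
           for _ in range(N_NODES)]
best, T = current, 5.0
for _ in range(5000):
    cand = neighbour(current)
    delta = cost(cand) - cost(current)
    if delta < 0 or random.random() < math.exp(-delta / T):
        current = cand
        if cost(current) < cost(best):
            best = current
    T *= 0.999              # geometric cooling schedule

print("best placement:", [(round(x, 1), round(y, 1)) for x, y in best])
print("mean nearest-node distance:", round(cost(best), 2), "m")
```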
Procedia PDF Downloads 251
7408 Research on the Optimization of the Facility Layout of Efficient Cafeterias for Troops
Authors: Qing Zhang, Jiachen Nie, Yujia Wen, Guanyuan Kou, Peng Yu, Kun Xia, Qin Yang, Li Ding
Abstract:
BACKGROUND: A facility layout problem (FLP) is an NP-complete (non-deterministic polynomial) problem, for which it is hard to obtain an exact optimal solution. FLP has been widely studied in various limited spaces and workflows. For example, troops' cafeterias with many types of equipment suffer from chaotic processes during dining. OBJECTIVE: This article aims to optimize the layout of a troops' cafeteria and to improve the overall efficiency of the dining process. METHODS: First, the original cafeteria layout design scheme was analyzed from an ergonomic perspective and two new design schemes were generated. Next, three facility layout models were designed, and simulation was applied to compare the total time and the density of troops between the schemes. Last, an experiment on the dining process with video observation and analysis verified the simulation results. RESULTS: In the simulation, the dining time under the second new layout is shortened by 2.25% and 1.89% (p<0.0001, p=0.0001) compared with the other two layouts, while troop-flow density and interference are both greatly reduced in the two new layouts. In the experiment, process completion time and the number of interferences were reduced as well, which verified the corresponding simulation results. CONCLUSIONS: The two new layout schemes are shown to be optimal by a series of simulations and space experiments. In future research, similar approaches could be applied while taking layout-design algorithm calculations into consideration.
Keywords: layout optimization, dining efficiency, troops' cafeteria, AnyLogic simulation, field experiment
Procedia PDF Downloads 147
7407 Factors Influencing the Housing Price: Developers’ Perspective
Authors: Ernawati Mustafa Kamal, Hasnanywati Hassan, Atasya Osmadi
Abstract:
The housing industry is crucial for the sustainable development of every country. Housing is a basic need that can enhance the quality of life, and owning a house is therefore a main aim of individuals. However, affordability has become a critical issue for homeownership. In recent years, housing prices in the main cities have increased tremendously to unaffordable levels. This paper investigates the factors influencing the housing price from the developers' perspective and provides recommendations on strategies to tackle this issue. Online and face-to-face surveys were conducted among housing developers operating in Penang, Malaysia. The results indicate that (1) location, (2) macroeconomic factors, (3) demographic factors, (4) land/zoning, and (5) industry factors are the main factors influencing the housing price. This paper contributes towards a better understanding of developers' views on how the housing price is determined and forms a basis for the government to help tackle the housing affordability issue.
Keywords: factors influence, house price, housing developers, Malaysia
Procedia PDF Downloads 398
7406 Syndromic Surveillance Framework Using Tweets Data Analytics
Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden
Abstract:
Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical sources of data. Using social media data such as tweets for syndromic surveillance is becoming more and more popular, aided by open platforms for data collection and by the microblogging text and mobile geographic location features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweets data analytics is presented. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted to perform supervised learning classification, and N-fold cross-validation confusion matrices are given as the simulation results, with an overall system recall of 85.595% achieved.
Keywords: syndromic surveillance, tweets, machine learning, data mining, Latent Dirichlet Allocation (LDA), influenza
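The abstract does not describe how the LDA engine is adapted for supervised classification, so the sketch below shows only one common arrangement: LDA topic proportions used as features for a classifier, evaluated with cross-validation, a confusion matrix, and recall. The toy tweets, labels, and pipeline choices are assumptions for illustration, not the authors' system.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, recall_score

# Toy tweets and labels (1 = influenza-related, 0 = not); real input would be
# the collected tweet stream described in the paper.
tweets = [
    "feeling feverish and coughing all night in Abu Dhabi",
    "flu season again, sore throat and chills",
    "great weather in Dubai today, heading to the beach",
    "traffic on the way to Al Ain was terrible",
    "high fever, body aches, staying home from work",
    "trying a new coffee place downtown",
] * 5
labels = [1, 1, 0, 0, 1, 0] * 5

model = make_pipeline(
    CountVectorizer(stop_words="english"),
    LatentDirichletAllocation(n_components=4, random_state=0),  # topic features
    LogisticRegression(max_iter=1000),                          # classifier on topics
)

pred = cross_val_predict(model, tweets, labels, cv=5)            # N-fold validation
print("confusion matrix:\n", confusion_matrix(labels, pred))
print("recall:", recall_score(labels, pred))
```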
Procedia PDF Downloads 121
7405 Enhancing the Performance of Automatic Logistic Centers by Optimizing the Assignment of Material Flows to Workstations and Flow Racks
Authors: Sharon Hovav, Ilya Levner, Oren Nahum, Istvan Szabo
Abstract:
In modern large-scale logistic centers (e.g., big automated warehouses), complex logistic operations performed by human staff (pickers) need to be coordinated with the operations of automated facilities (robots, conveyors, cranes, lifts, flow racks, etc.). The efficiency of advanced logistic centers strongly depends on optimizing picking technologies in synch with the facility/product layout, as well as on optimal distribution of material flows (products) in the system. The challenge is to develop a mathematical operations research (OR) tool that will optimize system cost-effectiveness. In this work, we propose a model that describes an automatic logistic center consisting of a set of workstations located at several galleries (floors), with each station containing a known number of flow racks. The requirements of each product and the working capacity of stations served by a given set of workers (pickers) are assumed as predetermined. The goal of the model is to maximize system efficiency. The proposed model includes two echelons. The first is the setting of the (optimal) number of workstations needed to create the total processing/logistic system, subject to picker capacities. The second echelon deals with the assignment of the products to the workstations and flow racks, aimed to achieve maximal throughputs of picked products over the entire system given picker capacities and budget constraints. The solutions to the problems at the two echelons interact to balance the overall load in the flow racks and maximize overall efficiency. We have developed an operations research model within each echelon. In the first echelon, the problem of calculating the optimal number of workstations is formulated as a non-standard bin-packing problem with capacity constraints for each bin. The problem arising in the second echelon is presented as a constrained product-workstation-flow rack assignment problem with non-standard mini-max criteria in which the workload maximum is calculated across all workstations in the center and the exterior minimum is calculated across all possible product-workstation-flow rack assignments. The OR problems arising in each echelon are proved to be NP-hard. Consequently, we find and develop heuristic and approximation solution algorithms based on exploiting and improving local optimums. The LC model considered in this work is highly dynamic and is recalculated periodically based on updated demand forecasts that reflect market trends, technological changes, seasonality, and the introduction of new items. The suggested two-echelon approach and the min-max balancing scheme are shown to work effectively on illustrative examples and real-life logistic data.Keywords: logistics center, product-workstation, assignment, maximum performance, load balancing, fast algorithm
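The first-echelon problem above is described as a non-standard bin-packing problem with capacity constraints; the paper's own heuristics are not spelled out in the abstract, so the sketch below shows only the classic first-fit-decreasing idea for estimating how many workstations (bins) of a given picker capacity are needed for a set of product workloads. All numbers are invented for illustration.

```python
def first_fit_decreasing(workloads, capacity):
    """Assign product workloads to the fewest workstations of equal capacity.

    Classic FFD heuristic: sort items in decreasing order and place each into
    the first workstation that still has room, opening a new one if needed.
    """
    stations = []                      # remaining capacity per open workstation
    assignment = []                    # (workload, station index) pairs
    for w in sorted(workloads, reverse=True):
        for i, free in enumerate(stations):
            if w <= free:
                stations[i] -= w
                assignment.append((w, i))
                break
        else:
            stations.append(capacity - w)
            assignment.append((w, len(stations) - 1))
    return len(stations), assignment

# Hypothetical picking workloads per product and picker capacity per workstation
workloads = [42, 35, 30, 28, 25, 20, 18, 15, 12, 10, 8, 5]
n_stations, plan = first_fit_decreasing(workloads, capacity=60)
print("workstations needed:", n_stations)
```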
Procedia PDF Downloads 232
7404 Pineapple Waste Valorization through Biogas Production: Effect of Substrate Concentration and Microwave Pretreatment
Authors: Khamdan Cahyari, Pratikno Hidayat
Abstract:
Indonesia produced more than 1.8 million tons of pineapple fruit in 2013, part of which turned into waste due to industrial processing, deterioration, and low quality. It was estimated that this waste accounted for more than 40 percent of harvested fruits. In addition, pineapple leaves are one of the biomass wastes from pineapple farming land, contributing an even higher percentage. Most of the waste is simply dumped into landfill areas without proper pretreatment, causing severe environmental problems. This research was meant to valorize pineapple waste by producing the renewable energy source biogas through a mesophilic (30℃) anaerobic digestion process. In particular, it was aimed at investigating the effect of the substrate concentration of pineapple fruit waste, i.e., peel and core, as well as the effect of microwave pretreatment of pineapple leaf waste. The substrate concentration was set at 12, 24, and 36 g VS/liter culture, whereas 800-Watt microwave pretreatment was conducted for 2 and 5 minutes. It was noticed that optimum biogas production was obtained at a concentration of 24 g VS/l with a biogas yield of 0.649 liter/g VS (45%v CH4), whereas the 2-minute microwave pretreatment performed better than the 5-minute one due to the shorter exposure to microwave heat. These results suggest that valorization of pineapple waste could be carried out through biogas production at the aforementioned process conditions. Application of this method is able both to reduce the environmental problem of the waste and to produce the renewable energy source biogas to fulfill the local energy demand of pineapple farming areas.
Keywords: pineapple waste, substrate concentration, microwave pretreatment, biogas, anaerobic digestion
Procedia PDF Downloads 584
7403 An Integrated Architecture of E-Learning System to Digitize the Learning Method
Authors: M. Touhidul Islam Sarker, Mohammod Abul Kashem
Abstract:
The purpose of this paper is to improve the e-learning system and digitize the learning method in the educational sector. The learner logs into the e-learning platform and easily accesses the digital content; the content can be downloaded, and an assessment can be taken for evaluation. Learners can access these digital resources using a tablet, computer, or smartphone. An e-learning system can be defined as teaching and learning with the help of multimedia technologies and the internet through access to digital content. E-learning is replacing the traditional education system with learning based on information and communication technology. This paper has designed and implemented an integrated e-learning system architecture with a University Management System. Moodle (Modular Object-Oriented Dynamic Learning Environment) is the best e-learning system, but its limitation is that it has no school or university management system. In this research, we have not considered school students because they are often without internet facilities; instead, we considered university students because they have internet access and use these technologies. The University Management System covers different types of activities such as student registration, account management, teacher information, semester registration, staff information, etc. If we integrate these activities or modules with Moodle, then we can overcome this limitation of Moodle, and it will enhance the e-learning system architecture, making effective use of technology. This architecture allows the learner to easily access the resources of the e-learning platform anytime and anywhere, which digitizes the learning method.
Keywords: database, e-learning, LMS, Moodle
Procedia PDF Downloads 191
7402 The Inverse Problem in the Process of Heat and Moisture Transfer in Multilayer Walling
Authors: Bolatbek Rysbaiuly, Nazerke Rysbayeva, Aigerim Rysbayeva
Abstract:
Relevance: Energy saving has been elevated to public policy in almost all developed countries. One of the areas for energy efficiency is improving and tightening design standards; state standards place high demands on the thermal protection of buildings. The constructive arrangement of layers should ensure normal operation in which the humidity of the construction materials does not exceed a certain level. Elevated levels of moisture in the walls can be attributed to a defective condition, as moisture significantly reduces the physical, mechanical, and thermal properties of materials. The absence, at the design stage, of modeling of the processes occurring in the construction and of prediction of the behavior of structures during their service in the real world leads to increased heat loss and premature aging of structures. Method: To solve this problem, mathematical modeling of heat and mass transfer in materials is widely used. The mathematical model of heat and mass transfer takes into account the coupled equations of the interconnected layers [1]. In winter, the thermal and hydraulic conductivity characteristics of the materials are nonlinear and depend on the temperature and moisture in the material. In this case, the experimental method of determining the freezing or thawing coefficient of the material becomes much more difficult. Therefore, in this paper we propose an approximate method for calculating the thermal conductivity and moisture permeability characteristics of freezing or thawing material. Questions: The development of methods for solving the inverse problem of mathematical modeling allows us to answer questions that are closely related to the rational design of enclosures: Where is the zone of condensation in the body of the multilayer enclosure? How and where should insulation be rationally applied? What constructive measures are necessary to provide for the removal of moisture from the structure? What should the temperature and humidity conditions be for the normal operation of the premises' enclosing structure? What is the longevity of the structure in terms of the frost resistance of its component materials? Tasks: The proposed mathematical model solves the following problems: to assess the thermo-physical condition of the designed structures under different operating conditions and select appropriate material layers; to calculate the temperature field in structurally complex multilayer structures; to determine, by measuring temperature and moisture at characteristic points, the thermal characteristics of the materials constituting the surveyed construction; to significantly reduce laboratory testing time and eliminate the need for a climatic chamber and expensive instrumentation experiments; and to simulate real-life situations that arise in multilayer enclosing structures associated with freezing, thawing, drying, and cooling of any layer of the building material.
Keywords: energy saving, inverse problem, heat transfer, multilayer walling
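As background for the forward part of such a model, the sketch below solves one-dimensional transient heat conduction through a two-layer wall with an explicit finite difference scheme and constant (temperature-independent) properties; the layer properties, boundary temperatures, and discretization are illustrative assumptions, and the nonlinear moisture coupling and the inverse identification described above are not included.

```python
import numpy as np

# Two-layer wall: 0.12 m brick + 0.08 m insulation (illustrative properties)
layers = [
    {"thickness": 0.12, "k": 0.70, "rho": 1800.0, "cp": 840.0},   # brick
    {"thickness": 0.08, "k": 0.04, "rho": 30.0,   "cp": 1450.0},  # insulation
]
dx = 0.005
k, alpha = [], []
for lay in layers:
    n = int(round(lay["thickness"] / dx))
    k += [lay["k"]] * n
    alpha += [lay["k"] / (lay["rho"] * lay["cp"])] * n
k, alpha = np.array(k), np.array(alpha)
N = k.size

T_in, T_out = 20.0, -15.0          # indoor / outdoor surface temperatures (deg C)
T = np.full(N, 5.0)                # initial temperature field
dt = 0.4 * dx**2 / alpha.max()     # below the explicit stability limit

for _ in range(int(24 * 3600 / dt)):          # simulate one day
    T[0], T[-1] = T_in, T_out                 # Dirichlet boundary conditions
    # heat flux at interior faces using harmonic-mean conductivity
    k_face = 2 * k[:-1] * k[1:] / (k[:-1] + k[1:])
    q = -k_face * (T[1:] - T[:-1]) / dx
    dT = np.zeros(N)
    dT[1:-1] = (q[:-1] - q[1:]) / dx * dt * alpha[1:-1] / k[1:-1]
    T += dT

print("temperature profile (deg C):", np.round(T[::4], 1))
```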
Procedia PDF Downloads 400
7401 Culvert Blockage Evaluation Using Australian Rainfall And Runoff 2019
Authors: Rob Leslie, Taher Karimian
Abstract:
The blockage of cross drainage structures is a risk that needs to be understood and managed or lessened through the design. A blockage is a random event, influenced by site-specific factors, which needs to be quantified for design. Under and overestimation of blockage can have major impacts on flood risk and cost associated with drainage structures. The importance of this matter is heightened for those projects located within sensitive lands. It is a particularly complex problem for large linear infrastructure projects (e.g., rail corridors) located within floodplains where blockage factors can influence flooding upstream and downstream of the infrastructure. The selection of the appropriate blockage factors for hydraulic modeling has been subject to extensive research by hydraulic engineers. This paper has been prepared to review the current Australian Rainfall and Runoff 2019 (ARR 2019) methodology for blockage assessment by applying this method to a transport corridor brownfield upgrade case study in New South Wales. The results of applying the method are also validated against asset data and maintenance records. ARR 2019 – Book 6, Chapter 6 includes advice and an approach for estimating the blockage of bridges and culverts. This paper concentrates specifically on the blockage of cross drainage structures. The method has been developed to estimate the blockage level for culverts affected by sediment or debris due to flooding. The objective of the approach is to evaluate a numerical blockage factor that can be utilized in a hydraulic assessment of cross drainage structures. The project included an assessment of over 200 cross drainage structures. In order to estimate a blockage factor for use in the hydraulic model, a process has been advanced that considers the qualitative factors (e.g., Debris type, debris availability) and site-specific hydraulic factors that influence blockage. A site rating associated with the debris potential (i.e., availability, transportability, mobility) at each crossing was completed using the method outlined in ARR 2019 guidelines. The hydraulic results inputs (i.e., flow velocity, flow depth) and qualitative factors at each crossing were developed into an advanced spreadsheet where the design blockage level for cross drainage structures were determined based on the condition relating Inlet Clear Width and L10 (average length of the longest 10% of the debris reaching the site) and the Adjusted Debris Potential. Asset data, including site photos and maintenance records, were then reviewed and compared with the blockage assessment to check the validity of the results. The results of this assessment demonstrate that the estimated blockage factors at each crossing location using ARR 2019 guidelines are well-validated with the asset data. The primary finding of the study is that the ARR 2019 methodology is a suitable approach for culvert blockage assessment that has been validated against a case study spanning a large geographical area and multiple sub-catchments. The study also found that the methodology can be effectively coded within a spreadsheet or similar analytical tool to automate its application.Keywords: ARR 2019, blockage, culverts, methodology
Procedia PDF Downloads 373
7400 Geographic Information Systems and a Breath of Opportunities for Supply Chain Management: Results from a Systematic Literature Review
Authors: Anastasia Tsakiridi
Abstract:
Geographic information systems (GIS) have been utilized in numerous spatial problems, such as site research, land suitability, and demographic analysis. Besides, GIS has been applied in scientific fields like geography, health, and economics. In business studies, GIS has been used to provide insights and spatial perspectives in demographic trends, spending indicators, and network analysis. To date, the information regarding the available usages of GIS in supply chain management (SCM) and how these analyses can benefit businesses is limited. A systematic literature review (SLR) of the last 5-year peer-reviewed academic literature was conducted, aiming to explore the existing usages of GIS in SCM. The searches were performed in 3 databases (Web of Science, ProQuest, and Business Source Premier) and reported using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology. The analysis resulted in 79 papers. The results indicate that the existing GIS applications used in SCM were in the following domains: a) network/ transportation analysis (in 53 of the papers), b) location – allocation site search/ selection (multiple-criteria decision analysis) (in 45 papers), c) spatial analysis (demographic or physical) (in 34 papers), d) combination of GIS and supply chain/network optimization tools (in 32 papers), and e) visualization/ monitoring or building information modeling applications (in 8 papers). An additional categorization of the literature was conducted by examining the usage of GIS in the supply chain (SC) by the business sectors, as indicated by the volume of the papers. The results showed that GIS is mainly being applied in the SC of the biomass biofuel/wood industry (33 papers). Other industries that are currently utilizing GIS in their SC were the logistics industry (22 papers), the humanitarian/emergency/health care sector (10 papers), the food/agro-industry sector (5 papers), the petroleum/ coal/ shale gas sector (3 papers), the faecal sludge sector (2 papers), the recycle and product footprint industry (2 papers), and the construction sector (2 papers). The results were also presented by the geography of the included studies and the GIS software used to provide critical business insights and suggestions for future research. The results showed that research case studies of GIS in SCM were conducted in 26 countries (mainly in the USA) and that the most prominent GIS software provider was the Environmental Systems Research Institute’s ArcGIS (in 51 of the papers). This study is a systematic literature review of the usage of GIS in SCM. The results showed that the GIS capabilities could offer substantial benefits in SCM decision-making by providing key insights to cost minimization, supplier selection, facility location, SC network configuration, and asset management. However, as presented in the results, only eight industries/sectors are currently using GIS in their SCM activities. These findings may offer essential tools to SC managers who seek to optimize the SC activities and/or minimize logistic costs and to consultants and business owners that want to make strategic SC decisions. Furthermore, the findings may be of interest to researchers aiming to investigate unexplored research areas where GIS may improve SCM.Keywords: supply chain management, logistics, systematic literature review, GIS
Procedia PDF Downloads 145
7399 LEDs Based Indoor Positioning by Distances Derivation from Lambertian Illumination Model
Authors: Yan-Ren Chen, Jenn-Kaie Lain
Abstract:
This paper proposes a novel indoor positioning algorithm based on visible light communications, implemented with light-emitting diode fixtures. In the proposed positioning algorithm, the distances between the light-emitting diode fixtures and the mobile terminal are derived under the assumption of an ideal Lambertian optical radiation model, and the trilateration positioning method is then applied to obtain the coordinates of the mobile terminal. The proposed positioning algorithm obtains distance information directly from the optical signal model, and therefore the statistical distribution of received signal strength at different positions in the interior space does not need to be pre-established. Numerical simulation results have shown that the proposed indoor positioning algorithm can provide accurate location coordinate estimation.
Keywords: indoor positioning, received signal strength, trilateration, visible light communications
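A minimal sketch of the two steps described above is given below: the LED-to-terminal distance is inverted from the received power under an ideal Lambertian channel (receiver assumed horizontal, facing upward, at a known height below the fixtures), and the terminal coordinates are then obtained by linearized trilateration. The fixture positions, transmit power, and channel parameters are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Assumed LED fixture positions (x, y in metres), common height above the
# receiver plane, transmit power, Lambertian order m and detector area A.
leds = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
h, P_t, m, A = 2.5, 1.0, 1.0, 1e-4

def received_power(led_xy, term_xy):
    """Ideal Lambertian channel with a horizontal, upward-facing receiver."""
    d = np.sqrt(np.sum((led_xy - term_xy) ** 2) + h**2)
    return P_t * (m + 1) * A * h ** (m + 1) / (2 * np.pi * d ** (m + 3))

def distance_from_power(P_r):
    """Invert the Lambertian model to recover the LED-terminal distance."""
    return (P_t * (m + 1) * A * h ** (m + 1) / (2 * np.pi * P_r)) ** (1.0 / (m + 3))

def trilaterate(anchors, ranges):
    """Linearized least-squares trilateration in the horizontal plane."""
    (x1, y1), r1 = anchors[0], ranges[0]
    Amat, b = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        Amat.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(Amat), np.array(b), rcond=None)
    return sol

true_pos = np.array([1.3, 2.7])
powers = np.array([received_power(led, true_pos) for led in leds])
dists = np.array([distance_from_power(p) for p in powers])
horiz = np.sqrt(dists**2 - h**2)          # project slant ranges onto the floor
print("estimated position:", np.round(trilaterate(leds, horiz), 3))
```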
Procedia PDF Downloads 416
7398 Physics-Informed Machine Learning for Displacement Estimation in Solid Mechanics Problem
Authors: Feng Yang
Abstract:
Machine learning (ML), especially deep learning (DL), has been extensively applied to many applications in recent years and has achieved great success in solving different problems, including scientific problems. However, conventional ML/DL methodologies are purely data-driven and have limitations, such as the need for an ample amount of labelled training data, a lack of consistency with physical principles, and a lack of generalizability to new problems/domains. Recently, there is a growing consensus that ML models need to further take advantage of prior knowledge to deal with these limitations. Physics-informed machine learning, aiming at the integration of physics/domain knowledge into ML, has been recognized as an emerging area of research, especially in the last two to three years. In this work, physics-informed ML, specifically a physics-informed neural network (NN), is employed and implemented to estimate the displacements in the x, y, and z directions in a solid mechanics problem that is governed by equilibrium equations with boundary conditions. By incorporating the physics (i.e., the equilibrium equations) into the learning process of the NN, it is shown that the NN can be trained very efficiently with a small set of labelled training data. Experiments with different settings of the NN model and different amounts of labelled training data were conducted, and the results show that very high accuracy can be achieved in satisfying the equilibrium equations as well as in predicting the displacements; e.g., with an overall displacement of 0.1, a root mean square error (RMSE) of 2.09 × 10−4 was achieved.
Keywords: deep learning, neural network, physics-informed machine learning, solid mechanics
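The abstract does not give the network details, so the following is only a minimal one-dimensional sketch of the idea in PyTorch: a small network u(x) for the axial displacement of a bar is trained so that the equilibrium residual E·u''(x) + f = 0 and the boundary conditions are satisfied, with no labelled interior data. The bar properties, load, and training settings are assumptions for illustration.

```python
import torch

torch.manual_seed(0)
E, f, L = 1.0, 1.0, 1.0          # Young's modulus, uniform body load, bar length

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, L, 100).reshape(-1, 1).requires_grad_(True)
x_bc = torch.tensor([[0.0], [L]])            # fixed ends: u(0) = u(L) = 0

for step in range(3000):
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    pde_loss = ((E * d2u + f) ** 2).mean()   # equilibrium residual E*u'' + f = 0
    bc_loss = (net(x_bc) ** 2).mean()        # displacement boundary conditions
    loss = pde_loss + bc_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# Exact solution of E*u'' + f = 0 with u(0) = u(L) = 0 is u = f*x*(L-x)/(2E)
x_test = torch.linspace(0.0, L, 5).reshape(-1, 1)
print("PINN  :", net(x_test).detach().squeeze().tolist())
print("exact :", (f * x_test * (L - x_test) / (2 * E)).squeeze().tolist())
```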
Procedia PDF Downloads 154
7397 Finite Element Modeling and Nonlinear Analysis for Seismic Assessment of Off-Diagonal Steel Braced RC Frame
Authors: Keyvan Ramin
Abstract:
The geometric nonlinearity of the Off-Diagonal Bracing System (ODBS) can act as a complementary mechanism, covering and extending the material nonlinearity of reinforced concrete. In the initial phase, finite element models are built for a flexural frame, an x-braced frame, and an ODBS-braced frame system, and the different models are investigated through various analyses. The models are verified against the experimental results for the flexural and x-braced frames. Analytical assessments are performed with three-dimensional finite element modeling. Nonlinear static analysis is carried out to obtain the performance level and seismic behavior, and the response modification factors are then calculated from each model's pushover curve. In the next phase, the cracks observed in the finite element models, especially in the RC members of all three systems, are evaluated; the finite element assessment of the cracks generated in the ODBS-braced frame is performed for various time steps. Nonlinear dynamic time history analyses are accomplished on models with different numbers of stories for three earthquake accelerogram records: El Centro, Naghan, and Tabas. The dynamic analysis is performed after scaling the accelerograms for each type of frame (flexural, x-braced, and ODBS-braced) one by one. A base point on the RC frame is considered to investigate the proportional displacement under each record. Hysteresis curves are assessed in the course of this study, and the equivalent viscous damping for the ODBS system is estimated according to the references. The results in each section show that the ODBS system has acceptable seismic behavior, and the conclusions converge when the ODBS system is utilized in a reinforced concrete frame.
Keywords: FEM, seismic behaviour, pushover analysis, geometric nonlinearity, time history analysis, equivalent viscous damping, passive control, crack investigation, hysteresis curve
Procedia PDF Downloads 380
7396 Indian Art Education and Career Opportunities: A Critical Analysis on Commercial Art
Authors: Pooja Jain
Abstract:
Art education is often ignored in the syllabi of developing countries like India and in educational planning for development, but nowadays Indian art, with growing global recognition, is becoming an integral part of education at all levels. The term art, widely used in all parts of the modern world, carried varied significance in India, as its meaning was continuously being extended to cover the many varieties of creative expression, such as painting, sculpture, commercial art, design, poetry, music, dance, and architecture. Over the last 100 years, Indian artists of all forms have evolved a wide variety of expressive styles. With the recommendations and initiatives of the Government of India, art education has subsequently gained pace at the school level as a mandatory subject for all, creating a pathway for students with a creative bent of mind. This paper investigates the curricula of various schools across the country at the secondary and senior secondary levels, along with some eminent institutions running the program. The findings depict the role of art education and justify its importance, primarily with commercial art being perceived as essential for students learning skills for economic gain in their careers ahead. With so many art colleges spread across India, emerging artists and designers are being trained and are creating art of infinite variety and style, which has opened up many career avenues. Commercial art, encompassing a plethora of artistic expressions, has confidently come of age, wherein creative perception is combined with introspective imagination to bring out multi-faceted career options with a significant future in art. Visual arts in education is thus an expanding field of result-assured research.
Keywords: modern art, commercial art, introspective imagination, career
Procedia PDF Downloads 190
7395 Small Entrepreneurs as Creators of Chaos: Increasing Returns Requires Scaling
Authors: M. B. Neace, Xin Gao
Abstract:
Small entrepreneurs are ubiquitous. Regardless of location, their success depends on several behavioral characteristics and several market conditions. In this concept paper, we extend this paradigm to include elements from the science of chaos. Our observations, research findings, literature search, and intuition lead us to the proposition that all entrepreneurs seek increasing returns, as did the many small entrepreneurs we have interviewed over the years. There will be a few whose initial perturbations may create tsunami-like waves of increasing returns over time, resulting in very large market consequences: the butterfly impact. When small entrepreneurs perturb the marketplace and their initial efforts take root, a series of phase-space transitions begins to occur. They sustain the stream of increasing returns by scaling up. Chaos theory contributes to our understanding of this phenomenon. Sustaining and nourishing the increasing returns of small entrepreneurs as complex adaptive systems requires scaling. In this paper we focus on the most critical element of the small-entrepreneur scaling process: the mindset of the owner-operator.
Keywords: entrepreneur, increasing returns, scaling, chaos
Procedia PDF Downloads 461
7394 Present and Future Climate Extreme Indices over Sinai Peninsula, Egypt
Authors: Mahmoud Roushdi, Hany Mostafa, Khaled Kheireldin
Abstract:
The Sinai Peninsula and the Suez Canal Corridor are promising and important economic regions in Egypt due to their unique location and development opportunities; thus, climate change impacts should be assessed over this area. Accordingly, this paper aims to assess the climate extreme indices over the Sinai Peninsula and the Suez Canal Corridor through the last 35 years and to project the climate extreme indices up to 2100. Present and future climate indices were analyzed using the RCP4.5 and RCP8.5 scenarios from 2010 until 2100 for the Sinai Peninsula and the Suez Canal Corridor. Furthermore, both the CanESM and HadGEM2 global circulation models were used. The results indicate that the number of summer days is projected to increase, while the number of frost days is projected to decrease. Moreover, a slight positive trend is noted for the very wet and extremely wet day precipitation indices R95p and R99p under RCP4.5 and a negative trend under RCP8.5.
Keywords: climate change, extreme indices, RCP, Sinai Peninsula
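For readers unfamiliar with the indices named above, the sketch below computes three of them (summer days SU, frost days FD, and the very wet day precipitation total R95p) from daily series with numpy, using the standard threshold definitions; the random input series merely stand in for station or model output, and the 95th percentile is taken from the same series rather than a separate base period for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in daily series for one year (deg C and mm); real input would be station
# observations or downscaled RCP scenario output.
tmax = rng.normal(30, 8, 365)
tmin = tmax - rng.uniform(8, 14, 365)
precip = rng.gamma(0.4, 4.0, 365)

su = int((tmax > 25.0).sum())                 # SU: summer days, tmax > 25 deg C
fd = int((tmin < 0.0).sum())                  # FD: frost days, tmin < 0 deg C

wet = precip[precip >= 1.0]                   # wet days (>= 1 mm)
p95 = np.percentile(wet, 95)                  # 95th percentile of wet-day totals
r95p = float(precip[precip > p95].sum())      # R95p: precipitation on very wet days

print(f"SU = {su} days, FD = {fd} days, R95p = {r95p:.1f} mm")
```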
Procedia PDF Downloads 440
7393 Climate Change Law and Transnational Corporations
Authors: Manuel Jose Oyson
Abstract:
The Intergovernmental Panel on Climate Change (IPCC) warned in its most recent report for the entire world “to both mitigate and adapt to climate change if it is to effectively avoid harmful climate impacts.” The IPCC observed “with high confidence” a more rapid rise in total anthropogenic greenhouse gas emissions (GHG) emissions from 2000 to 2010 than in the past three decades that “were the highest in human history”, which if left unchecked will entail a continuing process of global warming and can alter the climate system. Current efforts, however, to respond to the threat of global warming, such as the United Nations Framework Convention on Climate Change and the Kyoto Protocol, have focused on states, and fail to involve Transnational Corporations (TNCs) which are responsible for a vast amount of GHG emissions. Involving TNCs in the search for solutions to climate change is consistent with an acknowledgment by contemporary international law that there is an international role for other international persons, including TNCs, and departs from the traditional “state-centric” response to climate change. Putting the focus of GHG emissions away from states recognises that the activities of TNCs “are not bound by national borders” and that the international movement of goods meets the needs of consumers worldwide. Although there is no legally-binding instrument that covers TNC activities or legal responsibilities generally, TNCs have increasingly been made legally responsible under international law for violations of human rights, exploitation of workers and environmental damage, but not for climate change damage. Imposing on TNCs a legally-binding obligation to reduce their GHG emissions or a legal liability for climate change damage is arguably formidable and unlikely in the absence a recognisable source of obligation in international law or municipal law. Instead a recourse to “soft law” and non-legally binding instruments may be a way forward for TNCs to reduce their GHG emissions and help in addressing climate change. Positive effects have been noted by various studies to voluntary approaches. TNCs have also in recent decades voluntarily committed to “soft law” international agreements. This development reflects a growing recognition among corporations in general and TNCs in particular of their corporate social responsibility (CSR). While CSR used to be the domain of “small, offbeat companies”, it has now become part of mainstream organization. The paper argues that TNCs must voluntarily commit to reducing their GHG emissions and helping address climate change as part of their CSR. One, as a serious “global commons problem”, climate change requires international cooperation from multiple actors, including TNCs. Two, TNCs are not innocent bystanders but are responsible for a large part of GHG emissions across their vast global operations. Three, TNCs have the capability to help solve the problem of climate change. Assuming arguendo that TNCs did not strongly contribute to the problem of climate change, society would have valid expectations for them to use their capabilities, knowledge-base and advanced technologies to help address the problem. It would seem unthinkable for TNCs to do nothing while the global environment fractures.Keywords: climate change law, corporate social responsibility, greenhouse gas emissions, transnational corporations
Procedia PDF Downloads 352
7392 Variable Mapping: From Bibliometrics to Implications
Authors: Przemysław Tomczyk, Dagmara Plata-Alf, Piotr Kwiatek
Abstract:
Literature review is indispensable in research. One of the key techniques used in it is bibliometric analysis, where one of the methods is science mapping. The classic approach that dominates today in this area consists of mapping areas, keywords, terms, authors, or citations. This approach is also used in relation to the review of literature in the field of marketing. The development of technology has resulted in the fact that researchers and practitioners use the capabilities of software available on the market for this purpose. The use of science mapping software tools (e.g., VOSviewer, SciMAT, Pajek) in recent publications involves the implementation of a literature review, and it is useful in areas with a relatively high number of publications. Despite this well-grounded science mapping approach having been applied in the literature reviews, performing them is a painstaking task, especially if authors would like to draw precise conclusions about the studied literature and uncover potential research gaps. The aim of this article is to identify to what extent a new approach to science mapping, variable mapping, takes advantage of the classic science mapping approach in terms of research problem formulation and content/thematic analysis for literature reviews. To perform the analysis, a set of 5 articles on customer ideation was chosen. Next, the analysis of key words mapping results in VOSviewer science mapping software was performed and compared with the variable map prepared manually on the same articles. Seven independent expert judges (management scientists on different levels of expertise) assessed the usability of both the stage of formulating, the research problem, and content/thematic analysis. The results show the advantage of variable mapping in the formulation of the research problem and thematic/content analysis. First, the ability to identify a research gap is clearly visible due to the transparent and comprehensive analysis of the relationships between the variables, not only keywords. Second, the analysis of relationships between variables enables the creation of a story with an indication of the directions of relationships between variables. Demonstrating the advantage of the new approach over the classic one may be a significant step towards developing a new approach to the synthesis of literature and its reviews. Variable mapping seems to allow scientists to build clear and effective models presenting the scientific achievements of a chosen research area in one simple map. Additionally, the development of the software enabling the automation of the variable mapping process on large data sets may be a breakthrough change in the field of conducting literature research.Keywords: bibliometrics, literature review, science mapping, variable mapping
Procedia PDF Downloads 125
7391 Structural Equation Modelling Based Approach to Integrate Customers and Suppliers with Internal Practices for Lean Manufacturing Implementation in the Indian Context
Authors: Protik Basu, Indranil Ghosh, Pranab K. Dan
Abstract:
Lean management is an integrated socio-technical system to bring about a competitive state in an organization. The purpose of this paper is to explore and integrate the role of customers and suppliers with the internal practices of the Indian manufacturing industries towards successful implementation of lean manufacturing (LM). An extensive literature survey is carried out. An attempt is made to build an exhaustive list of all the input manifests related to customers, suppliers and internal practices necessary for LM implementation, coupled with a similar exhaustive list of the benefits accrued from its successful implementation. A structural model is thus conceptualized, which is empirically validated based on the data from the Indian manufacturing sector. With the current impetus on developing the industrial sector, the Government of India recently introduced the Lean Manufacturing Competitiveness Scheme that aims to increase competitiveness with the help of lean concepts. There is a huge scope to enrich the Indian industries with the lean benefits, the implementation status being quite low. Hardly any survey-based empirical study in India has been found to integrate customers and suppliers with the internal processes towards successful LM implementation. This empirical research is thus carried out in the Indian manufacturing industries. The basic steps of the research methodology followed in this research are the identification of input and output manifest variables and latent constructs, model proposition and hypotheses development, development of survey instrument, sampling and data collection and model validation (exploratory factor analysis, confirmatory factor analysis, and structural equation modeling). The analysis reveals six key input constructs and three output constructs, indicating that these constructs should act in unison to maximize the benefits of implementing lean. The structural model presented in this paper may be treated as a guide to integrating customers and suppliers with internal practices to successfully implement lean. Integrating customers and suppliers with internal practices into a unified, coherent manufacturing system will lead to an optimum utilization of resources. This work is one of the very first researches to have a survey-based empirical analysis of the role of customers, suppliers and internal practices of the Indian manufacturing sector towards an effective lean implementation.Keywords: customer management, internal manufacturing practices, lean benefits, lean implementation, lean manufacturing, structural model, supplier management
Procedia PDF Downloads 181
7390 Classification Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach
Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno
Abstract:
The Banda Sea collision zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian, and Pacific plates. It is located in the eastern part of Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the Mean Square Error (MSE). Based on these results, we identify the Poisson distribution of earthquakes in the BSCZ with a point process approach. A chi-square test and an Anscombe test are applied in the process of identifying a Poisson distribution in the partitioned area. The data used are earthquakes with magnitude ≥ 6 SR for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to contribute to the Moluccas Province and the surrounding local governments in preparing spatial plan documents related to disaster management.
Keywords: Molucca Banda Sea collision zone, earthquakes, mean square error, Poisson distribution, chi-square test, Anscombe test
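A minimal sketch of the identification step is shown below: epicentres are binned into grid cells, the rate λ is estimated as the mean count per cell, and a chi-square goodness-of-fit test compares the observed count frequencies with the Poisson probabilities. The synthetic catalogue stands in for the BMKG data, the grid is an arbitrary partition, and the Anscombe test is omitted.

```python
import numpy as np
from scipy.stats import poisson, chisquare

rng = np.random.default_rng(1)
# Synthetic epicentres in a rectangular study window (degrees); the real input
# would be the BMKG catalogue of magnitude >= 6 events, 1964-2013.
lon = rng.uniform(122.0, 132.0, 90)
lat = rng.uniform(-8.0, -2.0, 90)

# Partition the window into equal cells and count events per cell
counts, _, _ = np.histogram2d(lon, lat, bins=[10, 6])
counts = counts.ravel().astype(int)
lam = counts.mean()                       # estimated rate per cell

# Observed frequency of cells with k = 0, 1, ..., kmax events (upper tail lumped)
kmax = counts.max()
observed = np.bincount(counts, minlength=kmax + 1).astype(float)
expected = len(counts) * poisson.pmf(np.arange(kmax + 1), lam)
expected[-1] += len(counts) * poisson.sf(kmax, lam)

stat, p = chisquare(observed, expected, ddof=1)       # ddof=1: lambda estimated
mse = np.mean((observed - expected) ** 2)
print(f"lambda = {lam:.2f}, chi2 = {stat:.2f}, p = {p:.3f}, MSE = {mse:.2f}")
```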
Procedia PDF Downloads 306
7389 Challenges in the Construction of a 6M Diameter and 1.6km Long Tunnel Under Crossing a Channel in the West of Singapore
Authors: David Loh, Wan Chee Wai, Pei Nan, Chen Zhe
Abstract:
To increase the conveyance capacity to western Singapore and to meet Singapore's long-term water needs in a more cost-effective manner, four new transmission pipelines, consisting of two 2200 mm diameter water pipes and two 1200 mm diameter water pipes, will be needed by 2024 to convey water from a water reclamation plant to existing networks in the west region of Singapore. Out of the several possible routes studied, the most cost-effective and technically feasible route was selected to lay the proposed 1.6 km-long pipelines, which cross a channel via a 6 m diameter subsea tunnel. This paper outlines the challenges the team has faced throughout the project thus far. It also examines difficulties such as (1) the construction of a 56 m-deep launching shaft near a highly sensitive 700 mm diameter Gas Transmission Pipeline (GTP) and at a location with high groundwater, and (2) the manpower and supply disruptions caused by the COVID-19 pandemic.
Keywords: underwater tunnel, subsea engineering, subsea tunnel construction, water pipe construction
Procedia PDF Downloads 32
7388 Assessing Denitrification-Disintegration Model’s Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial considering escalating global concerns about climate change and the urgent need to improve agricultural sustainability. The application of the denitrification-disintegration (DNDC) model in the context of Morocco's unique agro-climate is thoroughly investigated in this study. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, all consistent with the intricate features of environmental Moroccan agriculture. In order to verify these hypotheses, a vast amount of field data covering Morocco's various agricultural regions and encompassing a range of soil types, climatic factors, and crop varieties had to be gathered. These experimental data sets will serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of simulation results. In conclusion, the prospective research findings add to the global conversation on climate-resilient agricultural practices while encouraging the promotion of sustainable agricultural models in Morocco. A policy architect's and an agricultural actor's ability to make informed decisions that not only advance food security but also environmental stability may be strengthened by the impending recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions.Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
Procedia PDF Downloads 69
7387 Stream Channel Changes in Balingara River, Sulawesi Tengah
Authors: Muhardiyan Erawan, Zaenal Mutaqin
Abstract:
The Balingara River is one of the gravel-bed rivers in Indonesia. Gravel-bed rivers are easily deformed in a relatively short time due to several variables, namely climate (rainfall), river discharge, topography, rock type, and land cover. To determine stream channel changes in the Balingara River, Landsat 7 and 8 imagery was used and analyzed planimetrically (in two dimensions). The parameters used to determine changes in the stream channel are the sinuosity ratio, the Brice Index, and the extent of erosion and deposition. Changes in the stream channel associated with changes in land cover are then examined with a descriptive spatial and temporal analysis. Stream channel locations with a low gradient in the upstream and middle parts of the watershed, where the bed material is gravel, are more easily changed than other locations. Changes in the areas of erosion and deposition influence the land cover changes.
Keywords: Brice Index, erosion, deposition, gravel-bed, land cover change, sinuosity ratio, stream channel change
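To illustrate the first of the parameters listed above, the sketch below computes the sinuosity ratio of a digitized channel centreline as the along-channel length divided by the straight-line (valley) distance between its endpoints; the coordinates are invented, and in practice they would be extracted from the classified Landsat imagery for each observation year.

```python
import math

def sinuosity_ratio(centreline):
    """Sinuosity = along-channel length / straight-line distance between endpoints."""
    channel_len = sum(math.dist(a, b) for a, b in zip(centreline, centreline[1:]))
    valley_len = math.dist(centreline[0], centreline[-1])
    return channel_len / valley_len

# Hypothetical centreline vertices (metre coordinates) for two epochs
centreline_1990 = [(0, 0), (120, 80), (260, 60), (400, 150), (560, 140), (700, 220)]
centreline_2020 = [(0, 0), (150, 40), (300, 90), (450, 110), (600, 170), (700, 220)]

for year, line in [("1990", centreline_1990), ("2020", centreline_2020)]:
    print(year, "sinuosity:", round(sinuosity_ratio(line), 3))
```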
Procedia PDF Downloads 332