Search results for: enriched semantic event chain
3307 A Comparative Analysis of Geometric and Exponential Laws in Modelling the Distribution of the Duration of Daily Precipitation
Authors: Mounia El Hafyani, Khalid El Himdi
Abstract:
Precipitation is one of the key variables in water resource planning, and modelling wet and dry durations is a crucial task in engineering hydrology. The objective of this study is to model and analyze the distribution of wet and dry durations. For this purpose, daily rainfall data from 1967 to 2017 recorded at the station of the Moroccan city of Kenitra are used. Three models are implemented for the distribution of wet and dry durations: the first-order Markov chain, the second-order Markov chain, and the truncated negative binomial law. The fit of the data to the proposed models is evaluated using Chi-square and Kolmogorov-Smirnov tests, and the Akaike information criterion is applied to identify the best-fitting distribution. We go further and study the law of the number of wet and dry days among k consecutive days, computed with an algorithm that we implemented based on conditional laws. We complete our work by comparing the observed moments of the numbers of wet/dry days among k consecutive days to the calculated moments of the three estimated models. The study shows the effectiveness of our approach in modeling wet and dry durations of daily precipitation.
Keywords: Markov chain, rainfall, truncated negative binomial law, wet and dry durations
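The first-order Markov chain model described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the binary wet/dry series is synthetic, and the 0.3 wet-day probability is an arbitrary assumption.

```python
import numpy as np

# Estimate first-order Markov transition probabilities from a binary
# wet(1)/dry(0) daily sequence, then read off the implied geometric
# dry-spell length distribution.
rng = np.random.default_rng(0)
days = (rng.random(5000) < 0.3).astype(int)  # synthetic wet/dry record

# Transition counts: counts[i, j] = number of transitions from state i to j
counts = np.zeros((2, 2))
for a, b in zip(days[:-1], days[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)  # row-normalised transition matrix

# Under a first-order chain, a dry spell of length k has probability
# P[0,0]**(k-1) * P[0,1], i.e. a geometric law with parameter P[0,1].
p_wet_after_dry = P[0, 1]
expected_mean_spell = 1.0 / p_wet_after_dry    # mean dry-spell duration

print(P)
print(expected_mean_spell)
```

A second-order chain is estimated the same way with a 4-row transition table indexed by the two preceding days.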
Procedia PDF Downloads 126
3306 Taking the Whole Picture to Your Supply Chain; Customers Will Take Selfies When Expectations Are Met
Authors: Marcelo Sifuentes López
Abstract:
Strategic performance definition and follow-up processes have to be clear in order to provide value in today's competitive world. Customer expectations must be linked to internal strategic objectives, leading to profitability supported by visibility and flexibility, among others. By taking a whole picture of the supply chain, the executive and his or her team can assess the current supply chain situation and gain insight into potential opportunities to improve processes and provide value to the main stakeholders. A systematic performance evaluation process, based on operational and financial indicators defined by customer requirements, needs to be implemented and periodically reviewed in order to mitigate costs and risks on time. Long-term supplier relationships and collaboration play a key role in using available resources, real-time communication, innovation, and new ways to capitalize on global opportunities such as emerging markets; efforts have to focus on reducing uncertainty in supply and demand. Leadership has to promote consistency of communication and execution, involving suppliers, customers, and the entire organization, supported by a strategic sourcing methodology that assures the targeted competitive strategy and sustainable growth. As customer requirements and expectations are met, results can be captured in a casual picture like a "selfie", whose outcome can be viewed from any desired angle; or, like most selfies, the picture can be taken at arm's length by a third-party company rather than with a self-timer.
Keywords: supply chain management, competitive advantage, value creation, collaboration and innovation, global marketplace
Procedia PDF Downloads 442
3305 Energy Content and Spectral Energy Representation of Wave Propagation in a Granular Chain
Authors: Rohit Shrivastava, Stefan Luding
Abstract:
A mechanical wave is the propagation of vibration with transfer of energy and momentum. Studying the energy and spectral energy characteristics of a wave propagating through disordered granular media can assist in understanding the overall properties of wave propagation through inhomogeneous materials like soil. The study of these properties is aimed at modeling wave propagation for oil, mineral, or gas exploration (seismic prospecting) or at non-destructive testing of the internal structure of solids. The study of the energy content (kinetic, potential, and total energy) of a pulse propagating through an idealized one-dimensional discrete particle system, such as a mass-disordered granular chain, can assist in understanding energy attenuation due to disorder as a function of propagation distance. Spectral analysis of the energy signal can assist in understanding dispersion as well as attenuation due to scattering at different frequencies (scattering attenuation). The choice of a one-dimensional granular chain also restricts the study to the P-wave attributes of the wave, removing the influence of shear or rotational waves. Granular chains with different mass distributions have been studied by randomly selecting masses from normal, binary, and uniform distributions; the standard deviation of the distribution is taken as the disorder parameter, with higher standard deviation meaning higher disorder. For obtaining macroscopic/continuum properties, ensemble averaging has been used. Interpreting information from a total energy signal turned out to be much easier than from the displacement, velocity, or acceleration signals of the wave, indicating a better analysis method for wave propagation through granular materials.
Increasing disorder leads to faster attenuation of the signal and decreases the energy transmitted at higher frequencies, but at the same time the energy of spatially localized high frequencies increases. An ordered granular chain exhibits ballistic propagation of energy, whereas a disordered granular chain exhibits diffusive-like propagation, which eventually becomes localized at long times.
Keywords: discrete elements, energy attenuation, mass disorder, granular chain, spectral energy, wave propagation
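The pulse-through-a-disordered-chain setup lends itself to a compact sketch. The following toy simulation (all parameters are assumptions, not taken from the paper) kicks the first particle of a spring-coupled chain with normally distributed masses and tracks the total (kinetic plus potential) energy signal the abstract recommends analysing:

```python
import numpy as np

rng = np.random.default_rng(1)
N, k, dt, steps = 200, 1.0, 0.01, 4000
masses = rng.normal(1.0, 0.3, N).clip(0.4)   # mass disorder: std dev = disorder parameter
x = np.zeros(N)                              # displacements from equilibrium
v = np.zeros(N); v[0] = 1.0                  # pulse: kick the first particle

def forces(x):
    # nearest-neighbour linear spring forces, free ends
    ext = np.diff(x)                         # spring extensions
    f = np.zeros_like(x)
    f[:-1] += k * ext
    f[1:] -= k * ext
    return f

energy = []
for _ in range(steps):                       # velocity-Verlet integration
    v += 0.5 * dt * forces(x) / masses
    x += dt * v
    v += 0.5 * dt * forces(x) / masses
    ke = 0.5 * np.sum(masses * v**2)
    pe = 0.5 * k * np.sum(np.diff(x)**2)
    energy.append(ke + pe)

print(energy[0], energy[-1])                 # total energy is conserved over the run
```

Repeating the run over many random mass realisations and averaging the spatial energy profile gives the ensemble-averaged (ballistic vs. diffusive) behaviour discussed above.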
Procedia PDF Downloads 292
3304 Determination of Physicochemical Properties, Bioaccessibility of Phenolics and Antioxidant Capacity of Mineral Enriched Linden Herbal Tea Beverage
Authors: Senem Suna, Canan Ece Tamer, Ömer Utku Çopur
Abstract:
In this research, dried linden (Tilia argentea) leaves and blossoms were used as the raw material for the production of a mineral enriched herbal tea beverage. For this aim, 1% dried linden was infused with boiling water (100 °C) for 5 minutes. After cooling, sucrose, citric acid, ascorbic acid, natural lemon flavor, and natural mineral water were added. Beverage samples were plate-filtered, filled into 200-mL glass bottles, capped, and then pasteurized at 98 °C for 15 minutes. Water-soluble dry matter, titratable acidity, ascorbic acid, pH, minerals (Fe, Ca, Mg, K, Na), color (L*, a*, b*), turbidity, bioaccessible phenolics, and antioxidant capacity were analyzed. Water-soluble dry matter, titratable acidity, and ascorbic acid were determined as 7.66±0.28 g/100 g, 0.13±0.00 g/100 mL, and 19.42±0.62 mg/100 mL, respectively. pH was measured as 3.69. Fe, Ca, Mg, K, and Na contents of the beverage were determined as 0.12±0.00, 115.48±0.05, 34.72±0.14, 48.67±0.43, and 85.72±1.01 mg/L, respectively. Color was measured as 13.63±0.05, -4.33±0.05, and 3.06±0.05 for the L*, a*, and b* values. Turbidity was determined as 0.69±0.07 NTU. Bioaccessible phenolics were determined as 312.82±5.91 mg GAE/100 mL. Antioxidant capacities of chemical (MetOH:H2O:HCl) and physiological extracts (in vitro digestive enzymatic extraction) were also evaluated with the DPPH (27.59±0.53 and 0.17±0.02 μmol trolox/mL), FRAP (21.01±0.97 and 13.27±0.19 μmol trolox/mL), and CUPRAC (44.71±9.42 and 2.80±0.64 μmol trolox/mL) methods. As a result, enrichment with natural mineral water was proposed for the development of functional and nutritional value, together with good potential for commercialization.
Keywords: linden, herbal tea beverage, bioaccessibility, antioxidant capacity
Procedia PDF Downloads 176
3303 Factor Analysis Based on Semantic Differential of the Public Perception of Public Art: A Case Study of the Malaysia National Monument
Authors: Yuhanis Ibrahim, Sung-Pil Lee
Abstract:
This study attempts to identify factors for the assessment of public art, memorial monuments specifically. Memorial monuments hold significant and rich messages, whether the intention of the art is to mark and commemorate an important event or to inform younger generations about the past. A public monument should relate to the public and raise awareness about the significant issue. Therefore, investigating the impact of an existing public memorial artwork will hopefully shed some light for the stakeholders of upcoming public art projects, to ensure a lucid memorial message is delivered to the public directly. The public is the main actor, as the public is the fundamental purpose for which the art was created. Perception is framed as one of the reliable evaluation tools to assess the impact factors of public art. The Malaysia National Monument was selected as the case study for the investigation. The public's perceptions were gathered using a questionnaire involving (n=115) participants to attain keywords, and then the Semantical Differential Methodology (SDM) was adopted to evaluate the perceptions of the memorial monument. These perceptions were measured for reliability and then factorised using the Principal Component Analysis (PCA) method of factor analysis to acquire concise factors for the monument assessment. The result revealed four factors that influence the public's perception of the monument: aesthetics, audience, topology, and public reception. The study concludes by proposing these factors for the assessment of future public memorial art projects, especially in Malaysia.
Keywords: factor analysis, public art, public perception, semantical differential methodology
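The PCA step described above can be sketched directly from the covariance eigendecomposition. This is a hedged illustration: the 115-respondent semantic-differential ratings are simulated, the six 7-point scales are invented, and the eigenvalue-greater-than-one retention rule is an assumed (Kaiser-style) criterion rather than the paper's stated method.

```python
import numpy as np

rng = np.random.default_rng(5)
ratings = rng.integers(1, 8, size=(115, 6)).astype(float)  # 7-point scales, n = 115

X = ratings - ratings.mean(axis=0)            # centre each scale
cov = X.T @ X / (X.shape[0] - 1)              # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)        # eigh returns ascending order
order = eigvals.argsort()[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()           # variance explained per component
n_factors = int(np.sum(eigvals > 1.0))        # retain components with eigenvalue > 1
print(explained[:4], n_factors)
```

With real rating data, the loadings in `eigvecs` would be inspected (and typically rotated) to name the retained factors, as the study does with aesthetics, audience, topology, and public reception.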
Procedia PDF Downloads 502
3302 Adoption of Proactive and Reactive Supply Chain Resilience Strategies: A Comparison between Apparel and Construction Industries in Sri Lanka
Authors: Anuradha Ranawakage, Chathurani Silva
Abstract:
With the growing expansion of global businesses, supply chains are increasingly exposed to numerous disruptions. Organizations adopt various strategies to mitigate the impact of these disruptions. Depending on variations in the conditions and characteristics of supply chains, the adoption of resilience strategies may vary across industries. However, these differences are largely unexplored in the existing literature. Hence, this study aims to evaluate the adoption of three proactive strategies (proactive collaboration, digital connectivity, and integrated supply chain risk management) and three reactive strategies (reactive collaboration, inventory and reserve capacity, and lifeline maintenance) in the apparel and construction industries in Sri Lanka. An online questionnaire was used to collect data on the implementation of resilience strategies from a sample of 162 apparel and 185 construction companies operating in Sri Lanka. This research makes a significant contribution to the field of supply chain management by assessing the extent to which different resilience strategies are implemented within the apparel and construction industries in Sri Lanka, particularly in an era after a global pandemic that significantly disrupted supply chains all around the world.
Keywords: apparel, construction, proactive strategies, reactive strategies, supply chain resilience
Procedia PDF Downloads 58
3301 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference
Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira
Abstract:
Operational risk losses are heavy-tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions, and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixture distribution (the lognormal for the body of losses and the generalized Pareto distribution for the tail) via extreme value theory, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. The paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2 using the procedures PROC SEVERITY and PROC NLMIXED, and this paper focuses on describing this implementation.
Keywords: operational risk, loss distribution approach, extreme value theory, copulas
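The body/tail splice described above (lognormal body, generalized Pareto tail) can be sketched with standard peaks-over-threshold machinery. Everything here is an assumption for illustration: the losses are simulated, the 95% threshold choice is arbitrary, and scipy is used in place of the paper's SAS® procedures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
losses = stats.lognorm.rvs(s=1.2, scale=50_000, size=10_000, random_state=rng)

u = np.quantile(losses, 0.95)                # tail threshold (assumed)
body = losses[losses <= u]
excess = losses[losses > u] - u              # exceedances over the threshold

s_hat, loc_b, scale_b = stats.lognorm.fit(body, floc=0)   # body model
xi, loc_t, beta = stats.genpareto.fit(excess, floc=0)     # tail: shape xi, scale beta

# High quantile of the spliced severity via the peaks-over-threshold formula:
# F(x) = 1 - zeta_u * (1 - G(x - u)) for x > u, with zeta_u the exceedance rate.
p, zeta_u = 0.999, excess.size / losses.size
q = u + stats.genpareto.ppf(1 - (1 - p) / zeta_u, xi, loc=0, scale=beta)
print(xi, beta, q)
```

In the paper's full methodology, quantiles like `q` would feed per-cell severity models whose dependence is then captured by the Bayesian skew t-copula.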
Procedia PDF Downloads 604
3300 A Robust Visual Simultaneous Localization and Mapping for Indoor Dynamic Environment
Authors: Xiang Zhang, Daohong Yang, Ziyuan Wu, Lei Li, Wanting Zhou
Abstract:
Visual Simultaneous Localization and Mapping (VSLAM) uses cameras to collect information in unknown environments in order to realize simultaneous localization and environment map construction, and it has a wide range of applications in autonomous driving, virtual reality, and other related fields. At present, VSLAM systems can maintain high accuracy in static environments. But in dynamic environments, the movement of objects in the scene reduces the stability of the VSLAM system, resulting in inaccurate localization and mapping, or even failure. In this paper, a robust VSLAM method is proposed to deal effectively with dynamic environments. We propose a dynamic region removal scheme based on semantic segmentation neural networks and geometric constraints. Firstly, a semantic segmentation neural network is used to extract the prior active motion regions, prior static regions, and prior passive motion regions in the environment. Then, a lightweight frame tracking module initializes the transform pose between the previous frame and the current frame on the prior static regions. A motion consistency detection module based on multi-view geometry and scene flow is used to divide the environment into static and dynamic regions, so that the dynamic object regions are successfully eliminated. Finally, only the static regions are used by the tracking thread. Our research is based on ORB-SLAM3, one of the most effective VSLAM systems available. We evaluated our method on the TUM RGB-D benchmark, and the results demonstrate that the proposed VSLAM method improves the accuracy of the original ORB-SLAM3 by 70%–98.5% in highly dynamic environments.
Keywords: dynamic scene, dynamic visual SLAM, semantic segmentation, scene flow, VSLAM
Procedia PDF Downloads 118
3299 Discovering Event Outliers for Drug as Commercial Products
Authors: Arunas Burinskas, Aurelija Burinskiene
Abstract:
On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortages. A shortage event unbalances sales and requires a recovery period that is too long. A critical issue is therefore that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest a prediction of average sales per sales day, which helps to protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: (i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; (ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though there are some outlier detection methods for drug data cleaning, they have not been used for the minimization of the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment. In the paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, Grubbs' test has been used to detect outliers for cancer drugs, with positive results reported. The test detects outliers which exceed the boundaries of a normal distribution, yielding a probability that indicates the core data of actual sales.
Grubbs' test assesses the difference between the most extreme observation and the sample mean relative to the standard deviation. The test detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with Grubbs' test. The suggested framework is applicable during shortage events and recovery periods. The proposed framework has practical value and could be used to minimize the recovery period required after the occurrence of a shortage event.
Keywords: drugs, Grubbs' test, outlier, shortage event
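One iteration of Grubbs' test, as described above, can be sketched directly from its standard formula. The daily-sales numbers below are invented for illustration; only the test itself follows the textbook definition.

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """One iteration of Grubbs' test: return the index of the most extreme
    value if it is an outlier at significance level alpha, else None."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    i = int(np.argmax(np.abs(x - mean)))     # most extreme observation
    G = abs(x[i] - mean) / sd                # Grubbs statistic
    # Critical value from the t distribution (two-sided test)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return i if G > G_crit else None

# Illustrative daily sales with one shortage-recovery spike (data assumed):
sales = [52, 49, 55, 50, 48, 51, 53, 47, 50, 180]
idx = grubbs_test(sales, alpha=0.01)
print(idx)  # index of the detected outlier
```

In the paper's setting, alpha=0.01 corresponds to the stated 99% confidence level, and the test would be applied iteratively, removing one detected outlier at a time.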
Procedia PDF Downloads 135
3298 Agile Implementation of 'PULL' Principles in a Manufacturing Process Chain for Aerospace Composite Parts
Authors: Torsten Mielitz, Dietmar Schulz, York C. Roth
Abstract:
Market forecasts show a significant increase in the demand for aircraft within the next two decades, and production rates will be adapted accordingly. Improvements and optimizations in the industrial system are becoming more important to cope with future challenges in manufacturing and assembly. The highest quality standards have to be met for aerospace parts, while cost-effective production in industrial systems is also a key driver. A look at other industries, e.g., automotive, shows well-established processes to streamline existing manufacturing systems. In this paper, the implementation of 'PULL' principles in an existing manufacturing process chain for a large-scale composite part is presented. A nonlinear extrapolation based on Little's Law showed a risk of a significant increase in the number of parts needed in the process chain to meet future demand. A project was set up to mitigate this risk, and the methodology was changed from a traditional milestone approach at the beginning towards an agile way of working at the end, in order to facilitate immediate benefits on the shop floor. Finally, delivery rates could be increased while avoiding additional semi-finished parts in the process chain (work in progress and inventory) through the successful implementation of the 'PULL' philosophy on the shop floor between the work stations. Lessons learned during the running project as well as the implementation and operations phases are discussed in order to share best practices.
Keywords: aerospace composite part manufacturing, PULL principles, shop-floor implementation, lessons learned
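The Little's Law extrapolation mentioned above reduces to a back-of-envelope calculation: work in progress equals throughput times lead time. All numbers below are illustrative assumptions, not the project's figures.

```python
# Little's Law: WIP = throughput x lead time.  At a fixed lead time, any
# increase in the delivery rate pulls more semi-finished parts into the
# process chain; 'PULL' counters this by shortening the lead time instead.

def wip(throughput_per_week, lead_time_weeks):
    return throughput_per_week * lead_time_weeks

current = wip(10, 6)   # 10 parts/week through a 6-week chain -> 60 parts in WIP
ramp_up = wip(14, 6)   # demand ramp-up at the same lead time -> 84 parts
pulled  = wip(14, 4)   # same rate with a shortened lead time -> 56 parts
print(current, ramp_up, pulled)
```

The third case is the effect the paper reports: higher delivery rates without more work in progress and inventory.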
Procedia PDF Downloads 174
3297 Use of Logistics for Demand Control in a Commercial Establishment in Rio De Janeiro, Brazil
Authors: Carlos Fontanillas
Abstract:
Brazil is going through a real revolution in the logistics area. It is increasingly common to find articles and news in this context, as companies become aware that good management of the areas that make up logistics can bring excellent results in reducing costs and increasing productivity. Accordingly, companies are placing more emphasis on reducing spending on the storage and transport of their products to ensure competitiveness. The scope of this work is the analysis of the logistics of a restaurant: the best way to manage materials and serve the customer is presented, avoiding interruptions of production due to a lack of materials; to this end, the supply chain is analyzed in terms of acquisition costs, maintenance, and service demand.
Keywords: ABC curve, logistic, productivity, supply chain
Procedia PDF Downloads 315
3296 Sensing to Respond & Recover in Emergency
Authors: Alok Kumar, Raviraj Patil
Abstract:
The ability to respond to an incident of a disastrous event in a vulnerable area is a crucial aspect of emergency management. Constantly predicting the likelihood of an event in an area, along with its severity, and reacting to those significant events that are likely to have a high impact allows the authorities to respond by allocating resources optimally and in a timely manner. The solution provides measuring, monitoring, and modeling facilities that integrate the underlying systems into one platform to improve operational efficiency, planning, and coordination. We were particularly involved in this innovative incubation work on the current state of research and development in collaborative technologies and systems for disasters.
Keywords: predictive analytics, advanced analytics, area flood likelihood model, area flood severity model, level of impact model, mortality score, economic loss score, resource allocation, crew allocation
Procedia PDF Downloads 321
3295 Tide Contribution in the Flood Event of Jeddah City: Mathematical Modelling and Different Field Measurements of the Groundwater Rise
Authors: Aïssa Rezzoug
Abstract:
This paper aims to bring new elements demonstrating that the tide causes the groundwater to rise in the shoreline band on which urban areas lie, especially in the western coastal cities of the Kingdom of Saudi Arabia, such as Jeddah. The cause of the recent inundation events in Jeddah was a groundwater rise in the city coupled with a strong precipitation event. This paper illustrates that the tide contributes significantly to increasing the groundwater level. It shows that internal groundwater recharge within the urban area is due not only to the excess water supply coming from surrounding areas as a result of human activity, combined with the lack of a sufficient and efficient sewage system, but also to the tide effect. The research study follows a quantitative method to assess groundwater level rise risks through many in-situ measurements and mathematical modelling. The proposed approach highlights that the groundwater level in the urban areas of the city on the shoreline band reaches the high tide level without considering any input from precipitation. Despite the small tide in the Red Sea compared to other oceanic coasts, the groundwater level is considerably enhanced by the tide from the seaside and by the freshwater table from the landside of the city. In these conditions, the groundwater level in the city becomes high and prevents the soil from evacuating quickly enough the surface flow caused by a storm event, as was observed in the historic flood catastrophe of Jeddah in 2009.
Keywords: flood, groundwater rise, Jeddah, tide
Procedia PDF Downloads 115
3294 Improving the Technology of Assembly by Use of Computer Calculations
Authors: Mariya V. Yanyukina, Michael A. Bolotov
Abstract:
Assembly accuracy is the degree of accordance between the actual values of the parameters obtained during assembly and the values specified in the assembly drawings and technical specifications. However, assembly accuracy depends not only on the quality of the production process but also on the correctness of the assembly process. Therefore, preliminary calculations of the assembly stages are carried out to verify the correspondence of the real geometric parameters to their acceptable values. In the aviation industry, most calculations involve interacting dimensional chains, which greatly complicates the task; solving such problems requires a special approach. The purpose of this article is to improve the assembly technology of aviation units by means of computer calculations. One actual example of an assembly unit containing an interacting dimensional chain is the turbine wheel of a gas turbine engine. The dimensional chain of the turbine wheel is formed by the geometric parameters of the disk and the set of blades. The interaction consists in the formation of two chains. The first chain is formed by the dimensions that determine the location of the grooves for the installation of the blades, together with the dimensions of the blade roots. The second dimensional chain is formed by the dimensions of the airfoil shroud platform. The two chains of the turbine wheel are interdependent by means of power circuits formed by the middle parts of the turbine blades. The calculation of the dimensional chain of the turbine wheel is timely because of the need to improve the assembly technology of this unit.
The task at hand contains geometric and mathematical components; therefore, its solution can be implemented following this algorithm: 1) research and analysis of production errors in geometric parameters; 2) development of a parametric model in a CAD system; 3) creation of a set of CAD models of parts taking into account the actual or generalized distributions of the errors of geometric parameters; 4) calculation of the model in a CAE system, loading various combinations of part models; 5) accumulation of statistics and analysis. The main task is to pre-simulate the assembly process by calculating the interacting dimensional chains. The article describes the approach to the solution from the point of view of mathematical statistics, implemented in MATLAB. Within the framework of the study, measurement data on the components of the turbine wheel (blades and disks) are available, based on which it is expected that the assembly process of the unit will be optimized by solving the dimensional chains.
Keywords: accuracy, assembly, interacting dimension chains, turbine
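Step 5 of the algorithm above amounts to a statistical (Monte Carlo) evaluation of the closing link of a dimensional chain. The sketch below reduces this to a single scalar chain in Python rather than the article's MATLAB implementation; the nominal dimensions and tolerances are invented for illustration, and the real turbine-wheel chains are of course multi-dimensional and interacting.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000                                   # number of virtual assemblies

# Component dimensions drawn from their (assumed) production error distributions:
groove_pitch = rng.normal(12.00, 0.02, n)     # disk groove position, mm
root_width   = rng.normal(11.90, 0.03, n)     # blade root width, mm

gap = groove_pitch - root_width               # closing link of the chain

# Fraction of virtual assemblies whose closing link falls within (assumed) limits:
ok = np.mean((gap > 0.02) & (gap < 0.18))
print(gap.mean(), gap.std(), ok)
```

Replacing the normal draws with the measured distributions of the blade and disk parameters gives the pre-simulation of the assembly process that the article targets.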
Procedia PDF Downloads 373
3293 Proteomic Analysis of 2,4-Epibrassinolide Alleviating Low Temperature Stress in Rice Seedling Leaves
Authors: Jiang Xu, Daoping Wang, Qun Li, Yinghong Pan
Abstract:
2,4-Epibrassinolide (EBR), a member of the brassinosteroid (BR) family of plant hormones, is widely studied and applied on a global scale, but the proteomic characteristics of EBR alleviating low temperature stress in rice seedling leaves are still not clear. In this study, seedlings of the rice cultivar Nipponbare were treated with EBR or distilled water, stressed at 4 °C or 26 °C, analyzed by mass spectrometry, and verified by the parallel reaction monitoring (PRM) technique. The results showed that 5778 proteins were identified in total, 4834 of them with quantitative information. Among them, 401 up-regulated and 220 down-regulated proteins may be related to EBR alleviating low temperature stress in rice seedling leaves. The molecular functions of most of the up-regulated proteins are RNA binding and hydrolase activity, and they are mainly enriched in the pathways of carbon metabolism, folic acid synthesis, and amino acid biosynthesis. The down-regulated proteins are mainly related to catalytic activity and oxidoreductase activity and are mainly enriched in the pathways of limonene and pinene degradation, riboflavin metabolism, porphyrin and chlorophyll metabolism, and other metabolic pathways. PRM validation and literature analysis showed that NADP-malic enzyme, peroxidase, 3-phosphoglycerate dehydrogenase, enolase, glyceraldehyde-3-phosphate dehydrogenase, and pyruvate kinase are closely related to the effect of EBR on low temperature stress. These results also suggest that BRs could relieve the effect of low temperature stress on rice seed germination in many ways.
Keywords: 2,4-Epibrassinolide, low temperature stress, proteomic analysis, rice
Procedia PDF Downloads 162
3292 Sector-Wide Collaboration to Reduce Food Waste
Authors: Carolyn Cameron
Abstract:
Stop Food Waste Australia is working with industry to co-design sector action plans to prevent and reduce food waste across the supply chain. We are a public-private partnership, funded in 2021 by the Australian national government under the 2017 National Food Waste Strategy. Our partnership has representatives from all levels of government, industry associations from farm to fork, and food rescue groups. Like many countries, Australia has adopted the Sustainable Development Goal (SDG) target 12.3 of halving food waste by 2030. A seminal 2021 study, the National Food Waste Feasibility Report, developed a robust national baseline illustrating hotspots in commodities and across the supply chain. This research found that the consumption stages (households, food service, and institutions) account for over half of all food waste, and that 22% of food produced never leaves the farm gate. Importantly, the study found it is feasible for Australia to meet SDG 12.3, but it will require unprecedented action by governments, industry, and the community. Sector action plans (Plans) are one of the four main initiatives of Stop Food Waste Australia, alongside a voluntary commitment, a coordinated food waste communications hub, and a robust monitoring and reporting framework. These Plans provide a systems-based approach to reducing food loss and waste while realising multiple benefits for supply chain partners and other collaborators. Each Plan is being co-designed with the key stakeholders most able to directly control or influence the root cause(s) of food waste hotspots and to take action to reduce or eliminate food waste in their value chain. The initiatives in the Plans are fit for purpose, reflecting current knowledge and recognising that priorities may refocus over time. To date, sector action plans have been developed with the food rescue, cold chain, bread and bakery, and dairy sectors.
Work is currently underway on meat and horticulture, and we are also developing supply-chain-stage plans for food services and institutions. The study will provide an overview of Australia's food waste baseline and challenges, the important role of sector action plans in reducing food waste, and case studies of implementation outcomes.
Keywords: co-design, horticulture, sector action plans, voluntary
Procedia PDF Downloads 135
3291 Development of a Decision Model to Optimize Total Cost in Food Supply Chain
Authors: Henry Lau, Dilupa Nakandala, Li Zhao
Abstract:
All along the length of the supply chain, fresh food firms face the challenge of managing both product quality, due to the perishable nature of the products, and product cost. This paper develops a method to assist logistics managers upstream in the fresh food supply chain in making cost-optimized decisions regarding transportation, with the objective of minimizing the total cost while maintaining the quality of food products above acceptable levels. Considering the case of multiple fresh food products collected from multiple farms and transported to a warehouse or a retailer, this study develops a total cost model that includes the various costs incurred during transportation. The practical application of the model is illustrated using several computational intelligence approaches, including Genetic Algorithms (GA), Fuzzy Genetic Algorithms (FGA), and an improved Simulated Annealing (SA) procedure applied with a repair mechanism, for efficiency benchmarking. We demonstrate the practical viability of these approaches through a simulation study based on pertinent data and evaluate the simulation outcomes. All three approaches are adoptable; however, based on the performance evaluation, it was evident that the FGA is more likely to produce a better performance than GA or SA. This study provides a pragmatic approach for supporting logistics and supply chain practitioners in the fresh food industry in making important decisions on the arrangements and procedures related to transporting multiple fresh food products from multiple farms to a warehouse in a cost-effective way without compromising product quality.
This study extends the literature on cold supply chain management by investigating cost and quality optimization in a multi-product scenario from farms to a retailer and, minimizing cost by managing the quality above expected quality levels at delivery. The scalability of the proposed generic function enables the application to alternative situations in practice such as different storage environments and transportation conditions.Keywords: cost optimization, food supply chain, fuzzy sets, genetic algorithms, product quality, transportation
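The improved simulated annealing procedure with a repair mechanism described above can be illustrated with a minimal Python sketch. The cost matrix, truck capacities, and cooling schedule below are invented toy values, not the authors' data or actual model:

```python
import math
import random

# Hypothetical instance: COST[f][t] = cost of shipping farm f's produce on
# truck t; each truck can serve a limited number of farms.
COST = [
    [4.0, 6.0, 9.0],
    [5.0, 3.0, 8.0],
    [7.0, 5.0, 2.0],
    [6.0, 8.0, 3.0],
]
CAPACITY = [2, 2, 2]  # max farms per truck

def total_cost(assign):
    return sum(COST[f][t] for f, t in enumerate(assign))

def repair(assign):
    """Move farms off overloaded trucks to the cheapest truck with spare room."""
    load = [assign.count(t) for t in range(len(CAPACITY))]
    for f, t in enumerate(assign):
        if load[t] > CAPACITY[t]:
            options = sorted(
                (t2 for t2 in range(len(CAPACITY)) if load[t2] < CAPACITY[t2]),
                key=lambda t2: COST[f][t2],
            )
            if options:
                load[t] -= 1
                assign[f] = options[0]
                load[options[0]] += 1
    return assign

def anneal(iters=2000, temp=10.0, cooling=0.995, seed=1):
    rng = random.Random(seed)
    cur = repair([rng.randrange(len(CAPACITY)) for _ in COST])
    best = cur[:]
    for _ in range(iters):
        # Perturb one farm's truck, then repair capacity violations.
        cand = cur[:]
        cand[rng.randrange(len(cand))] = rng.randrange(len(CAPACITY))
        cand = repair(cand)
        delta = total_cost(cand) - total_cost(cur)
        # Accept improvements always, worsenings with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            cur = cand
            if total_cost(cur) < total_cost(best):
                best = cur[:]
        temp *= cooling
    return best, total_cost(best)
```

Because repair runs after every move, the annealer only ever compares feasible assignments; the GA and FGA variants discussed in the paper would differ mainly in how candidate solutions are generated and recombined.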
Procedia PDF Downloads 224
3290 Between a Rock and a Hard Place: The Impact of Inflation on Global Supply Chains
Authors: Elad Harison
Abstract:
The paper identifies the complex links between post-COVID-19 inflationary pressures and global supply chains. Throughout the COVID-19 lockdowns and long after the termination of social distancing policies, consumers, notably in the U.S., have confronted and still face disruptions in the supply of goods. The study analyzes the U.S. monetary policy that led to a significant shift in consumer demand during a period of limited supply, resulting in shortages and amplifying inflationary dynamics. We argue that the monetary guidelines applied by the U.S. government further widened the scope of supply chain disruptions. Keywords: consumer demand, COVID-19, inflation, monetary policy, supply chain
Procedia PDF Downloads 93
3289 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also addresses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which together justify updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. It is therefore necessary to have an automatic analysis tool that takes into account word sense, not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc.
As an extension, we focus on language modeling using Recurrent Neural Networks (RNN); given that morphological analysis coverage was very low in Dialectal Arabic, it is important to investigate in depth how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that handling dialectal variability can improve analysis. Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
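The "trigram or n-gram continuation probabilities" mentioned above can be made concrete with a small Python sketch. The transliterated token stream and the add-one smoothing are illustrative assumptions, not the authors' training data or smoothing choice:

```python
from collections import Counter

# Toy token stream: transliterated placeholders standing in for Arabic tokens.
tokens = "al kitab fi al bayt wa al kitab ala al tawila".split()

bigrams = Counter(zip(tokens, tokens[1:]))
trigrams = Counter(zip(tokens, tokens[1:], tokens[2:]))
vocab = set(tokens)

def continuation_prob(w1, w2, w3):
    """Add-one smoothed trigram continuation probability P(w3 | w1, w2)."""
    return (trigrams[(w1, w2, w3)] + 1) / (bigrams[(w1, w2)] + len(vocab))
```

A neural language model such as the RNN the abstract proposes replaces these fixed-order counts with a learned, context-sensitive estimate of the same conditional probability.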
Procedia PDF Downloads 44
3288 The Competitive Power of Supply Chain Quality Management in Manufacturing Companies in Cameroon
Authors: Nicodemus Tiendem, Arrey Mbayong Napoleon
Abstract:
The heightening of competition and the quest for market share have left business people and research communities re-examining and reinventing their competitive practices. A case in point is Porter’s generic strategy, which has lately received much criticism for its inability to maintain a company’s competitive power, because it focuses on the organisation and ignores its external partners, who have a strong bearing on the company’s performance. This paper therefore examines Porter’s generic strategies alongside supply chain quality management practices in terms of their effectiveness in building the competitive power of manufacturing companies in Cameroon. Primary data were captured through a survey across the supply chains of 20 manufacturing companies in Cameroon using a five-point Likert scale questionnaire. For each company, four first-tier suppliers and four first-tier distributors were carefully chosen to participate in the study alongside the company itself, with attention directed to persons involved in the companies' supply chains. This gave a total of 180 entities comprising the supply chains of the 20 manufacturing companies, and a total of 900 participants. The data were analysed using three multiple regression models to assess the effect of Porter’s generic strategies and supply chain quality management on the companies' marketing performance. The findings showed that, in such a competitive atmosphere, supply chain quality management is a better tool for marketing performance than Porter’s generic strategies, and hence for building the competitive power of the companies at all levels of the study.
Although the study made use of convenience sampling, in which sample selectivity can bias the results, the findings aligned with many other recent developments on building the competitive power of manufacturing companies, which makes them suitable for generalisation. Keywords: supply chain quality management, Porter’s generic strategies, competitive power, marketing performance, manufacturing companies, Cameroon
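The comparison of the two constructs' explanatory power can be roughly illustrated in Python with Pearson correlations on toy Likert-scale averages; the numbers below are invented for illustration and do not come from the study, and a faithful replication would use the three multiple regression models the authors describe:

```python
import statistics

# Hypothetical five-point Likert averages: each position is one company.
scqm   = [4.2, 3.8, 4.5, 3.1, 4.0, 3.6]  # supply chain quality management score
porter = [3.9, 4.1, 3.2, 3.8, 3.0, 3.5]  # generic-strategy score
perf   = [4.1, 3.7, 4.4, 3.2, 3.9, 3.5]  # marketing performance

def pearson(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

In this fabricated data, performance tracks the SCQM scores far more closely than the Porter scores, mirroring the direction of the study's finding.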
Procedia PDF Downloads 88
3287 Elemental Graph Data Model: A Semantic and Topological Representation of Building Elements
Authors: Yasmeen A. S. Essawy, Khaled Nassar
Abstract:
With the rapid increase of complexity in the building industry, professionals in the A/E/C industry have been forced to adopt Building Information Modeling (BIM) in order to enhance communication between the different project stakeholders throughout the project life cycle and to create a semantic object-oriented building model that can support geometric-topological analysis of building elements during design and construction. This paper presents a model that extracts topological relationships and geometrical properties of building elements from an existing fully designed BIM and maps this information into a directed acyclic Elemental Graph Data Model (EGDM). The model incorporates BIM-based search algorithms for automatic deduction of geometrical data and topological relationships for each building element type. Using graph algorithms such as Depth First Search (DFS) and topological sorting, all possible construction sequences can be generated and compared against production and construction rules to generate an optimized construction sequence and its associated schedule. The model is implemented in C#. Keywords: building information modeling (BIM), elemental graph data model (EGDM), geometric and topological data models, graph theory
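The generation of a construction sequence from precedence edges via topological sorting can be sketched in a few lines of Python (shown here for brevity rather than the authors' C# implementation); the element names and "supported-by" edges are hypothetical:

```python
from collections import defaultdict, deque

# Hypothetical precedence edges: (a, b) means element a must be built before b.
EDGES = [
    ("foundation", "column"),
    ("foundation", "ground_slab"),
    ("column", "beam"),
    ("beam", "slab"),
    ("column", "wall"),
]

def construction_sequence(edges):
    """Kahn's algorithm: one valid build order for a directed acyclic graph."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for a, b in edges:
        succ[a].append(b)
        indeg[b] += 1
        nodes.update((a, b))
    queue = deque(sorted(n for n in nodes if indeg[n] == 0))
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in succ[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    if len(order) != len(nodes):
        raise ValueError("precedence graph contains a cycle")
    return order

seq = construction_sequence(EDGES)
```

Enumerating all topological orders (rather than one) and filtering them against production and construction rules would yield the candidate schedules the abstract describes.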
Procedia PDF Downloads 384
3286 The Perspective of Waste Frying Oil in São Paulo and Its Dimensions in the Reverse Logistics of the Production of Biodiesel
Authors: Max Filipe Goncalves, Alessandra Concilio, Rodrigo Shimada
Abstract:
Waste frying oil is highly polluting when disposed of incorrectly in the environment. It is necessary to examine reverse logistics to identify how such waste can be returned to the productive chain and used in new processes. In this context, the objective of this paper is to analyze the perspective of waste frying oil in São Paulo and its dimensions in the production of biodiesel. Underlying factors such as the agents, motivators, and legal aspects were analyzed to demonstrate this. A SWOT matrix was then built from the aspects observed, covering the strengths, weaknesses, opportunities, and threats of the reverse logistics chain in São Paulo. Keywords: biodiesel, perspective, reverse logistic, WFO
Procedia PDF Downloads 209
3285 Historical Development of Negative Emotive Intensifiers in Hungarian
Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges
Abstract:
In this study, an exhaustive analysis was carried out of the historical development of negative emotive intensifiers in the Hungarian language via NLP methods. Intensifiers are linguistic elements which modify or reinforce a variable character in the lexical unit they apply to. Therefore, intensifiers appear with other lexical items, such as adverbs, adjectives, verbs, and infrequently nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), many lexical items can operate as intensifiers, and the group of intensifiers is admittedly one of the most rapidly changing elements in the language. From a linguistic point of view, a special group of intensifiers is particularly interesting: the so-called negative emotive intensifiers, which, on their own, without context, have semantic content that can be associated with negative emotion, but in particular cases may function as intensifiers (e.g. borzasztóan jó ’awfully good’, which means ’excellent’). Despite their special semantic features, negative emotive intensifiers are scarcely examined in the literature based on large historical corpora via NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, we exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process the data. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame.
Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the ‘magyarlanc’ NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. The outcomes of the research revealed in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements to grammatical ones and slowly go through a delexicalization process (their negative content diminishes over time). What is more, it was also pointed out which negative emotive intensifiers are at the same stage of this process in the same time period. A closer look at the different domains of the analysed corpus also made it clear that during this process the pragmatic role’s importance increases: the newer use expresses the speaker's subjective, evaluative opinion at a certain level. Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time
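The frequency-and-collocation step can be sketched in Python as follows; the tiny tokenized "corpus" and the single intensifier are toy stand-ins for the Hungarian Historical Corpus data and the compiled lexicon:

```python
from collections import Counter

# Toy tokenized sentences (hypothetical examples standing in for the
# lemmatized Hungarian Historical Corpus data).
corpus = [
    ["ez", "borzasztóan", "jó", "volt"],
    ["a", "film", "borzasztóan", "rossz"],
    ["borzasztóan", "jó", "hír"],
    ["nagyon", "szép", "nap"],
]

def right_collocates(corpus, intensifier):
    """Corpus frequency of an intensifier and counts of the words
    immediately following it (its right-hand collocates)."""
    freq = 0
    coll = Counter()
    for sent in corpus:
        for i, tok in enumerate(sent):
            if tok == intensifier:
                freq += 1
                if i + 1 < len(sent):
                    coll[sent[i + 1]] += 1
    return freq, coll

freq, coll = right_collocates(corpus, "borzasztóan")
```

Tracking how the share of positive right-collocates (such as jó) grows across the corpus's time slices is one simple way to quantify the delexicalization trend the study describes.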
Procedia PDF Downloads 233
3284 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods
Authors: Autcha Araveeporn
Abstract:
This paper compares parameter estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After estimation, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variance of 4, 9, and 16, with the sample size set to 10, 20, 30, and 50. From the results, the ML and MCMC estimates are perceivably different from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20. Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution
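A minimal Python sketch of the three estimators, assuming for simplicity that the data variance is known (the paper's fuller setting also involves hypothesis testing). With a single unknown parameter, a Gibbs sweep reduces to a direct draw from the conjugate posterior, so the "MCMC" step here is only a stand-in:

```python
import random
import statistics

def estimators(data, sigma2, prior_mu=1.0, prior_var=12.0, draws=5000, seed=0):
    """ML, conjugate Bayes, and Monte Carlo estimates of a normal mean
    (data variance sigma2 treated as known, normal prior on the mean)."""
    n = len(data)
    ml = statistics.fmean(data)  # ML estimator: the sample average
    # Normal-normal conjugacy gives the posterior in closed form.
    post_var = 1.0 / (n / sigma2 + 1.0 / prior_var)
    post_mu = post_var * (sum(data) / sigma2 + prior_mu / prior_var)
    # With one unknown, each Gibbs sweep is a single draw from this
    # posterior, so the chain is just i.i.d. samples averaged here.
    rng = random.Random(seed)
    mcmc = statistics.fmean(
        rng.gauss(post_mu, post_var ** 0.5) for _ in range(draws)
    )
    return ml, post_mu, mcmc

rng = random.Random(42)
data = [rng.gauss(2.0, 2.0) for _ in range(30)]  # true mean 2, variance 4
ml, bayes, mcmc = estimators(data, sigma2=4.0)
```

With 30 observations the likelihood dominates the diffuse prior, so the Bayes estimate shrinks only slightly from the sample mean toward the prior mean of 1, and the Monte Carlo average converges on the posterior mean.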
Procedia PDF Downloads 357
3283 Upgrading along Value Chains: Strategies for Thailand's Functional Milk Industry
Authors: Panisa Harnpathananun
Abstract:
This paper is a 'practical experience analysis' which aims to analyze the critical obstacles hampering the growth of the functional milk industry and to suggest recommendations for overcoming those obstacles. Using Sectoral Innovation System (SIS) analysis along the value chain, it is found that restrictions in the regulation of the milk disinfection process, the difficulty dairy entrepreneurs face in gaining health claim approval for functional food and beverages, and the lack of an intermediary between entrepreneurs and certification units for the certification of functional foods and milk are the major obstacles that need to be resolved. Consequently, policy recommendations are proposed to tackle the problems occurring throughout the value chain. For the upstream, a collaborative platform using the quadruple helix model is proposed in the form of effective dairy cooperatives. For the midstream, regulatory changes for new processes, such as extended shelf life (ESL) or prolonged milk, are necessary, which can expand global market opportunities. For the downstream, an intermediary mechanism between entrepreneurs and certification units can assist in the certification process for functional milk, especially the process of 'health claim' approval. Keywords: Thailand, functional milk, supply chain, quadruple helix, intermediary, functional food
Procedia PDF Downloads 151
3282 A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry
Authors: Zeynep Sener, Mehtap Dursun
Abstract:
Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision methodology. Keywords: fuzzy decision making, fuzzy multiple objective programming, medical supply chain, supplier selection
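One common fuzzy scoring scheme (triangular fuzzy numbers, a fuzzy weighted sum, and centroid defuzzification) can be sketched in Python as follows; the criteria, ratings, and weights are invented for illustration, and the paper's fuzzy multiple objective programming formulation is richer than this:

```python
# Triangular fuzzy numbers (l, m, u) standing in for linguistic ratings,
# e.g. "good" ≈ (5, 7, 9) on a 0-10 scale; criterion weights are fuzzy too.
RATINGS = {
    "supplier_A": [(5, 7, 9), (7, 9, 10), (3, 5, 7)],  # quality, delivery, cost
    "supplier_B": [(3, 5, 7), (5, 7, 9), (7, 9, 10)],
}
WEIGHTS = [(7, 9, 10), (5, 7, 9), (3, 5, 7)]

def fuzzy_mul(a, b):
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def fuzzy_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def defuzzify(t):
    """Centroid of a triangular fuzzy number."""
    return sum(t) / 3.0

def score(ratings, weights):
    """Defuzzified fuzzy-weighted sum across all criteria."""
    total = (0.0, 0.0, 0.0)
    for r, w in zip(ratings, weights):
        total = fuzzy_add(total, fuzzy_mul(r, w))
    return defuzzify(total)

scores = {s: score(r, WEIGHTS) for s, r in RATINGS.items()}
best = max(scores, key=scores.get)
```

The fuzzy arithmetic preserves the uncertainty of the linguistic ratings until the final ranking step, where defuzzification collapses each aggregate to a single comparable number.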
Procedia PDF Downloads 454
3281 Effect of Acids with Different Chain Lengths Modified by Methane Sulfonic Acid and Temperature on the Properties of Thermoplastic Starch/Glycerin Blends
Authors: Chi-Yuan Huang, Mei-Chuan Kuo, Ching-Yi Hsiao
Abstract:
In this study, acids with various chain lengths (C6, C8, C10 and C12), modified by methane sulfonic acid (MSA) and temperature, were used to modify tapioca starch (TPS); glycerol (GA) was then added to the modified starch to prepare new blends. The mechanical, thermal, and physical properties of the blends were studied. The investigation was divided into two parts. First, biodegradable materials such as starch and glycerol were processed with hexanedioic acid (HA), suberic acid (SBA), sebacic acid (SA), or decanedicarboxylic acid (DA) at different temperatures (90, 110 and 130 °C). The solution was then added to the modified starch to prepare the blends using a single-screw extruder. The FT-IR patterns showed the characteristic ester C=O peak at 1730 cm⁻¹, proving that the acids of different chain lengths (C6, C8, C10 and C12) reacted with glycerol by esterification; these products plasticize the blends during extrusion. In addition, the blends showed improved hydrolytic and thermal stability, and the water contact angle increased from 43.0° to 64.0°. Second, the HA (110 °C), SBA (110 °C), SA (110 °C), and DA (130 °C) blends were studied further because they possessed good mechanical properties, water resistance, and thermal stability. Various contents of MSA (0, 0.005, 0.010, 0.020 g) were also used to modify the mechanical properties of the blends. When MSA was added to the blends, the FT-IR patterns again showed the ester C=O peak at 1730 cm⁻¹, and hydrophobic blends were produced: the water contact angle of the MSA blends increased from 55.0° to 71.0°. Although the elongation at break of the MSA blends was reduced from the original 220% to 128%, the stress increased from 2.5 MPa to 5.1 MPa.
Therefore, the optimal composition was the DA blend (130 °C) with the addition of MSA (0.005 g). Keywords: chain length acids, methane sulfonic acid, tapioca starch (TPS), tensile stress
Procedia PDF Downloads 250
3280 Investigating the Associative Network of Color Terms among Turkish University Students: A Cognitive-Based Study
Authors: R. Güçlü, E. Küçüksakarya
Abstract:
Word association (WA) provides the broadest information on how knowledge is structured in the human mind. Cognitive linguistics, psycholinguistics, and applied linguistics are the disciplines that consider WA tests substantial for gaining insights into the very nature of the human cognitive system and semantic knowledge. In this study, Berlin and Kay’s (1969) 11 basic color terms are presented as stimulus words to a total of 300 Turkish university students. The responses are analyzed according to Fitzpatrick’s model (2007), which includes four categories: meaning-based responses, position-based responses, form-based responses, and erratic responses. The responses to the free association tests are expected to give substantial information about Turkish university students’ psychological structuring of vocabulary, especially the morpho-syntactic and semantic relationships among words. To conclude, theoretical and practical implications are discussed to provide an in-depth evaluation of how associations of basic color terms are represented in the mental lexicon of Turkish university students. Keywords: color term, gender, mental lexicon, word association task
Procedia PDF Downloads 132
3279 Case Study of the Roma Tomato Distribution Chain: A Dynamic Interface for an Agricultural Enterprise in Mexico
Authors: Ernesto A. Lagarda-Leyva, Manuel A. Valenzuela L., José G. Oshima C., Arnulfo A. Naranjo-Flores
Abstract:
From August to December of 2016, a diagnostic and strategic planning study was carried out on the supply chain of the company Agropecuaria GABO S.A. de C.V. The final product of the study was the development of the strategic plan and a project portfolio to meet the demands of the three links in the supply chain of the Roma tomato exported annually to the United States of America. In this project, the strategic objective of ensuring the proper handling of the product was selected and one of the goals associated with this was the employment of quantitative methods to support decision making. Considering the antecedents, the objective of this case study was to develop a model to analyze the behavioral dynamics in the distribution chain, from the logistics of storage and shipment of Roma tomato in 81-case pallets (11.5 kg per case), to the two pre-cooling rooms and eventual loading onto transports, seeking to reduce the bottleneck and the associated costs by means of a dynamic interface. The methodology used was that of system dynamics, considering four phases that were adapted to the purpose of the study: 1) the conceptualization phase; 2) the formulation phase; 3) the evaluation phase; and 4) the communication phase. The main practical conclusions lead to the possibility of reducing both the bottlenecks in the cooling rooms and the costs by simulating scenarios and modifying certain policies. Furthermore, the creation of the dynamic interface between the model and the stakeholders was achieved by generating interaction with buttons and simple instructions that allow making modifications and observing diverse behaviors. Keywords: agrilogistics, distribution, scenarios, system dynamics
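The stock-and-flow logic behind the pre-cooling bottleneck can be sketched in discrete time with Python. All rates and capacities below are hypothetical toy values, not the company's data, and the actual study used a full system dynamics model with an interactive interface:

```python
def simulate(hours=24, arrival=9, cool_capacity=60, cool_rate=7, load_rate=8):
    """Discrete-time stock-and-flow sketch: pallets queue outside, enter the
    pre-cooling room up to its capacity, then are loaded onto transport."""
    queue = cooling = shipped = 0
    max_queue = 0
    for _ in range(hours):
        queue += arrival                              # inflow from harvest
        intake = min(queue, cool_rate, cool_capacity - cooling)
        queue -= intake
        cooling += intake
        out = min(cooling, load_rate)                 # loaded onto trucks
        cooling -= out
        shipped += out
        max_queue = max(max_queue, queue)             # bottleneck indicator
    return shipped, max_queue

base = simulate()
faster = simulate(cool_rate=10)                       # policy scenario
```

Comparing scenarios this way mirrors the study's use of simulation to test policies: raising the pre-cooling intake rate above the arrival rate eliminates the queue and raises throughput in this toy model.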
Procedia PDF Downloads 231
3278 The Processing of Context-Dependent and Context-Independent Scalar Implicatures
Authors: Liu Jia’nan
Abstract:
The default accounts hold the view that there exists a kind of scalar implicature which can be processed without context and owns a psychological privilege over other scalar implicatures which depend on context. In contrast, Relevance Theorists regard context as a must, because all scalar implicatures have to meet the need of relevance in discourse. However, Katsos' experimental results showed that although adults quantitatively rejected under-informative utterances with lexical scales (context-independent) and ad hoc scales (context-dependent) at almost the same rate, they still regarded the violation of utterances with lexical scales as much more severe than with ad hoc scales. Neither the default account nor Relevance Theory can fully explain this result. Thus, two points about this result are questionable: (1) Is it possible that the strange discrepancy is due to factors other than the generation of scalar implicature? (2) Are the ad hoc scales truly formed under the possible influence of mental context? Do the participants generate scalar implicatures with ad hoc scales, instead of just comparing semantic differences among target objects in the under-informative utterance? In our Experiment 1, question (1) will be answered by a replication of Katsos' experiment. Test materials will be shown in PowerPoint in the form of pictures, and each procedure will be done under the guidance of a tester in a quiet room. Our Experiment 2 is intended to answer question (2). The pictorial test materials will be transformed into written words in DMDX, and the target sentence will be shown word by word to participants in the soundproof room in our lab. Reading times of the target parts, i.e., words containing scalar implicatures, will be recorded.
We presume that in the group with lexical scales, a standardized pragmatic mental context would help generate the scalar implicature once the scalar word occurs, leading the participants to expect the upcoming words to be informative. Thus, if the new input after the scalar word is under-informative, more time will be spent on the extra semantic processing. However, in the group with ad hoc scales, the scalar implicature may hardly be generated without the support of a fixed mental context of scale. Thus, whether the new input is informative or not does not matter at all, and the reading times of the target parts will be the same in informative and under-informative utterances. The mind may be a dynamic system in which many factors co-occur. If Katsos’ experimental result is reliable, will it shed light on the interplay of default accounts and context factors in scalar implicature processing? Based on our experiments, we might be able to assume that no single dominant processing paradigm is plausible. Furthermore, in the processing of scalar implicature, the semantic interpretation and the pragmatic interpretation may be made in a dynamic interplay in the mind. As to the lexical scale, the pragmatic reading may prevail over the semantic reading because of its greater exposure in daily language use, which may also lead the possible default or standardized paradigm to override the role of context. However, the objects in an ad hoc scale are not usually treated as scalar members in mental context, and thus the lexical-semantic association of the objects may prevent the pragmatic reading from generating the scalar implicature. Only when sufficient contextual factors are highlighted can the pragmatic reading gain privilege and generate the scalar implicature. Keywords: scalar implicature, ad hoc scale, dynamic interplay, default account, Mandarin Chinese processing
Procedia PDF Downloads 324