Search results for: markov chain monte carlo
1372 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines
Authors: Kamyar Tolouei, Ehsan Moosavi
Abstract:
In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally restricts the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue. In fact, the LTPSOP cannot yet be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and managed through a set of equally probable orebody realizations. In this paper, to incorporate grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to simultaneously maximize the net present value and minimize the risk of deviation from the production targets under grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to incorporate grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule.
A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method with a metaheuristic algorithm, Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange coefficients. In addition, a machine learning method, Random Forest, is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method with 20 realizations. The results indicate that the proposed hybrid versions improve considerably on the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and the ALR-sub-gradient method. To demonstrate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework shows the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP. The framework could control geological risk more effectively than the traditional procedure by considering grade uncertainty in the hybrid model framework.
Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization
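The core idea of replacing one estimated orebody model with a set of equally probable realizations can be sketched in a few lines. The block counts, grade ranges, and tonnages below are illustrative assumptions, not the case-study data; only the use of 20 Monte Carlo realizations mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical block model: 50 blocks with an estimated gold grade (g/t)
# and an estimation standard deviation expressing grade uncertainty.
est_grade = rng.uniform(0.5, 3.0, size=50)
est_std = 0.3 * est_grade

# Monte Carlo simulation: 20 equally probable orebody realizations,
# mirroring the 20 realizations used in the abstract.
n_real = 20
realizations = rng.normal(est_grade, est_std, size=(n_real, 50))
realizations = np.clip(realizations, 0.0, None)  # grades cannot be negative

# For a fixed schedule (here: mine all blocks in one period), the expected
# metal content and its risk (spread across realizations) can be assessed.
tonnage = 1000.0  # tonnes per block (assumed)
metal_per_real = realizations.sum(axis=1) * tonnage  # grams of metal
expected_metal = metal_per_real.mean()
risk = metal_per_real.std()

print(f"expected metal: {expected_metal:.0f} g, risk (std): {risk:.0f} g")
```

A risk-based scheduler would then trade off `expected_metal` (discounted into NPV) against `risk` across candidate schedules, rather than optimizing a single smoothed model.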
Procedia PDF Downloads 105
1371 Opportunities for Reducing Post-Harvest Losses of Cactus Pear (Opuntia ficus-indica) to Improve Small-Holder Farmers' Income in Eastern Tigray, Northern Ethiopia: Value Chain Approach
Authors: Meron Zenaselase Rata, Euridice Leyequien Abarca
Abstract:
The production of major crops in Northern Ethiopia, especially the Tigray Region, is at subsistence level due to drought, erratic rainfall, and poor soil fertility. Since cactus pear is a drought-resistant plant, it is considered a lifesaver fruit and a strategy for poverty reduction in the drought-affected areas of the region. Despite its contribution to household income and food security in the area, the cactus pear sub-sector is experiencing many constraints, with limited attention given to its post-harvest loss management. Therefore, this research was carried out to identify opportunities for reducing post-harvest losses and to recommend possible strategies to reduce them, thereby improving production and smallholders' income. Both probability and non-probability sampling techniques were employed to collect the data. Ganta Afeshum district was selected from Eastern Tigray, and two peasant associations (Buket and Golea) were purposively selected from the district for their potential in cactus pear production. Simple random sampling was employed to survey 30 households from each of the two peasant associations, and a semi-structured questionnaire was used as the data collection tool. Moreover, in this research 2 collectors, 2 wholesalers, 1 processor, 3 retailers, and 2 consumers were interviewed; two focus group discussions were also held with 14 key farmers using a semi-structured checklist; and key informant interviews were conducted with governmental and non-governmental organizations to gather more information about cactus pear production, post-harvest losses, the strategies used to reduce post-harvest losses, and suggestions to improve post-harvest management. SPSS version 20 was used to enter and analyze the quantitative data, whereas MS Word was used to transcribe the qualitative data. The data were presented using frequency and descriptive tables and graphs.
The data analysis was also done using a chain map, correlations, a stakeholder matrix, and gross margins. Mean comparisons such as ANOVA and t-tests between variables were used. The analysis shows that the present cactus pear value chain involves main actors and supporters. However, there are inadequate information flows and informal market linkages among actors in the cactus pear value chain. The farmers' gross margin is higher when they sell to the processor than when they sell to collectors. The most significant post-harvest loss in the cactus pear value chain occurs at the producer level, followed by wholesalers and retailers. The maximum and minimum volumes of post-harvest losses at the producer level are 4212 and 240 kg per season, respectively. The post-harvest losses were caused by farmers' limited skills in on-farm management and harvesting, low market prices, limited market information, the absence of producer organizations, poor post-harvest handling, the absence of cold storage, the absence of collection centers, poor infrastructure, inadequate credit access, a traditional transportation system, the absence of quality control, illegal traders, inadequate research and extension services, and inappropriate packaging materials. Therefore, some of the recommendations are providing adequate practical training, forming producer organizations, and constructing collection centers.
Keywords: cactus pear, post-harvest losses, profit margin, value chain
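The gross-margin comparison between market channels reduces to a revenue-minus-variable-costs calculation. A minimal sketch, with hypothetical prices, quantities, and costs rather than the survey's actual figures:

```python
# Gross margin sketch for a cactus pear producer comparing market channels.
# Prices, quantities, and costs are hypothetical illustrative values.
def gross_margin(price_per_kg, qty_kg, variable_costs):
    revenue = price_per_kg * qty_kg
    return revenue - variable_costs

season_qty = 2000.0   # kg sold per season (assumed)
costs = 900.0         # harvesting, transport, packaging (assumed)

margin_to_collector = gross_margin(0.8, season_qty, costs)  # lower farm-gate price
margin_to_processor = gross_margin(1.2, season_qty, costs)  # direct sale to processor
print(margin_to_collector, margin_to_processor)
```

With these assumed figures the processor channel yields the larger margin, matching the qualitative finding above.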
Procedia PDF Downloads 131
1370 The Democratization of 3D Capturing: An Application Investigating Google Tango Potentials
Authors: Carlo Bianchini, Lorenzo Catena
Abstract:
The appearance of 3D scanners and, more recently, of image-based systems that generate point clouds directly from common digital images has deeply affected the survey process in terms of both capturing and 2D/3D modelling. In this context, low-cost and mobile systems are increasingly playing a key role, actually paving the way to the democratization of what in the past was the realm of a few specialized technicians and expensive equipment. The application of Google Tango to the ancient church of Santa Maria delle Vigne in Pratica di Mare - Rome presented in this paper is one of these examples.
Keywords: architectural survey, augmented/mixed/virtual reality, Google Tango project, image-based 3D capturing
Procedia PDF Downloads 149
1369 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme demand on random-access memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where images within a window form a basic unit. With this strategy, the RAM requirement is reduced to only one unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily coherent pixels, which are present only in certain units rather than in the whole observation period. The chain supports real-time processing of the continuous data, and the delay in creating displacement maps can be shortened because there is no need to wait for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence.
The temporally averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and the selection of partially coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring across a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
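The temporal-averaging step for a single campaign can be illustrated with synthetic data: coherently averaging a stack of complex images of a stable scene reduces phase noise before interferogram formation. The stack size and noise level below are assumptions for illustration, not the package's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-campaign GBSAR stack: 30 complex images of a stable
# scene, each pixel carrying the true phase plus random phase noise.
n_img, n_pix = 30, 256
true_phase = rng.uniform(-np.pi, np.pi, n_pix)
noise = rng.normal(0.0, 0.5, (n_img, n_pix))
stack = np.exp(1j * (true_phase[None, :] + noise))

# Temporal averaging: the coherent mean over the campaign raises the
# signal-to-noise ratio before interferogram formation.
averaged = stack.mean(axis=0)

# Mean absolute phase error of one raw image vs. the averaged image.
phase_err_single = np.abs(np.angle(stack[0] * np.exp(-1j * true_phase))).mean()
phase_err_avg = np.abs(np.angle(averaged * np.exp(-1j * true_phase))).mean()
print(phase_err_single, phase_err_avg)
```

The averaged image's phase error is smaller by roughly the square root of the stack size, which is the rationale for averaging before the interferometric processing described above.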
Procedia PDF Downloads 161
1368 Readout Development of an LGAD-Based Hybrid Detector for Microdosimetry (HDM)
Authors: Pierobon Enrico, Missiaggia Marta, Castelluzzo Michele, Tommasino Francesco, Ricci Leonardo, Scifoni Emanuele, Vincezo Monaco, Boscardin Maurizio, La Tessa Chiara
Abstract:
Clinical outcomes collected over the past three decades have suggested that ion therapy has the potential to be a treatment modality superior to conventional radiation for several types of cancer, including recurrences, as well as for other diseases. Although the results have been encouraging, numerous treatment uncertainties remain a major obstacle to the full exploitation of particle radiotherapy. To overcome therapy uncertainties and optimize treatment outcome, the best possible description of radiation quality is of paramount importance for linking the physical radiation dose to biological effects. Microdosimetry was developed as a tool to improve the description of radiation quality. By recording the energy deposition at the micrometric scale (the typical size of a cell nucleus), this approach takes into account the non-deterministic nature of atomic and nuclear processes and creates a direct link between the dose deposited by radiation and the biological effect induced. Microdosimeters measure the spectrum of lineal energy y, defined as the energy deposition in the detector divided by the most probable track length travelled by the radiation. The latter is provided by the so-called "Mean Chord Length" (MCL) approximation and is related to the detector geometry. To improve the characterization of the radiation field quality, we define a new quantity that replaces the MCL with the actual particle track length inside the microdosimeter. In order to measure this new quantity, we propose a two-stage detector consisting of a commercial Tissue Equivalent Proportional Counter (TEPC) and 4 layers of Low Gain Avalanche Detector (LGAD) strips. The TEPC detector records the energy deposition in a region equivalent to 2 um of tissue, while the LGADs are very suitable for particle tracking because their thickness can be thinned down to tens of micrometers and they respond fast to ionizing radiation. The concept of HDM has been investigated and validated with Monte Carlo simulations.
Currently, a dedicated readout is under development. This two-stage detector requires two different systems whose complementary information must be joined for each event: the energy deposition in the TEPC and the respective track length recorded by the LGAD tracker. This challenge is being addressed by implementing System-on-Chip (SoC) technology, relying on Field-Programmable Gate Arrays (FPGAs) based on the Zynq architecture. The TEPC readout consists of three different signal amplification legs and is carried out by 3 ADCs mounted on an FPGA board. The signals of activated LGAD strips are processed by dedicated chips, and the activated-strip information is finally stored, again relying on FPGA-based solutions. In this work, we provide a detailed description of the HDM geometry and the SoC solutions that we are implementing for the readout.
Keywords: particle tracking, ion therapy, low gain avalanche diode, tissue equivalent proportional counter, microdosimetry
Procedia PDF Downloads 175
1367 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos
Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog
Abstract:
Social distancing errors are commonly prevalent among both vaccinated and unvaccinated members of the Filipino community. This study aims to identify the relevant cognitive factors and relate how they affect our daily lives. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the most impact on social distancing errors.
Keywords: vaccinated, unvaccinated, social distancing, Filipinos
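The two statistical treatments named above can be sketched with synthetic data: a Welch t-statistic comparing the two groups, and a multiple linear regression of an error score on cognitive factors solved by least squares. All group means, sample sizes, and coefficients are invented for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scores (0-100) for two groups; values are illustrative.
vaccinated = rng.normal(62, 10, 400)
unvaccinated = rng.normal(55, 10, 400)

# Welch's t-statistic comparing group means (unequal variances allowed).
def welch_t(a, b):
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(vaccinated, unvaccinated)

# Multiple linear regression: social-distancing error score regressed on
# cognitive factors (stress, attention, memory), solved by least squares.
n = 80
X = np.column_stack([np.ones(n),
                     rng.normal(size=n),   # stress
                     rng.normal(size=n),   # attention
                     rng.normal(size=n)])  # memory
beta_true = np.array([10.0, 3.0, -2.5, 0.2])
y = X @ beta_true + rng.normal(0, 0.5, n)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(t_stat, beta_hat)
```

A large coefficient magnitude (here on stress and attention) is what identifies a factor as having "the most impact" in such a regression.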
Procedia PDF Downloads 202
1366 DesignChain: Automated Design of Products Featuring a Large Number of Variants
Authors: Lars Rödel, Jonas Krebs, Gregor Müller
Abstract:
The growing price pressure due to the increasing number of global suppliers, the growing individualization of products and ever-shorter delivery times are upcoming challenges in the industry. In this context, Mass Personalization stands for the individualized production of customer products in batch size 1 at the price of standardized products. The possibilities of digitalization and automation of technical order processing open up the opportunity for companies to significantly reduce their cost of complexity and lead times and thus enhance their competitiveness. Many companies already use a range of CAx tools and configuration solutions today. Often, the expert knowledge of employees is hidden in "knowledge silos" and is rarely networked across processes. DesignChain describes the automated digital process from the recording of individual customer requirements, through design and technical preparation, to production. Configurators offer the possibility of mapping variant-rich products within the Design Chain. This transformation of customer requirements into product features makes it possible to generate even complex CAD models, such as those for large-scale plants, on a rule-based basis. With the aid of an automated CAx chain, production-relevant documents are thus transferred digitally to production. This process, which can be fully automated, allows variants to always be generated on the basis of current version statuses.
Keywords: automation, design, CAD, CAx
Procedia PDF Downloads 76
1365 A Semantic and Concise Structure to Represent Human Actions
Authors: Tobias Strübing, Fatemeh Ziaeetabar
Abstract:
Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need to use a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented, which considers the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information on static (e.g. top, bottom) and dynamic spatial relations (e.g. moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a large matrix with thirty rows and a massive set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used in the category of manipulation actions, which ultimately involve two hands. Here, we would like to extend this approach to a whole-body action descriptor and create a conjoint activity representation structure. For this purpose, we need to perform a statistical analysis to modify the current eSEC by summarizing it while preserving its features, and introduce a new version called Enhanced eSEC, or e2SEC. This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, and 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row statistically, to see whether a particular one can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the distinctness of the predefined manipulation actions.
Therefore, by performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations, and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on the interactions between humans and robots. It also creates a comprehensive platform to integrate with body-limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis
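The row-importance analysis described above, removing one row at a time and checking that all manipulations remain pairwise distinguishable, can be sketched on toy descriptors. The matrix sizes and the random "actions" below are assumptions; real eSEC matrices encode semantic spatial relations, not random integers.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy eSEC-like descriptors: 5 manipulation actions, each a 30-row matrix
# of categorical relation codes over 10 time columns (values 0-4).
actions = {f"action_{i}": rng.integers(0, 5, size=(30, 10)) for i in range(5)}

def all_distinguishable(mats, drop_row=None):
    """Check that every pair of action matrices still differs after
    optionally removing one row from all of them."""
    keys = list(mats)
    for i in range(len(keys)):
        for j in range(i + 1, len(keys)):
            a = mats[keys[i]] if drop_row is None else np.delete(mats[keys[i]], drop_row, axis=0)
            b = mats[keys[j]] if drop_row is None else np.delete(mats[keys[j]], drop_row, axis=0)
            if np.array_equal(a, b):
                return False
    return True

# Rows whose removal keeps all actions pairwise distinguishable are
# candidates for elimination in a summarized (e2SEC-style) descriptor.
removable = [r for r in range(30) if all_distinguishable(actions, drop_row=r)]
print(len(removable))
```

On real descriptors, only a subset of rows would be removable; the statistical analysis then selects among these candidates while keeping every predefined manipulation distinct.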
Procedia PDF Downloads 126
1364 Safety and Efficacy of Recombinant Clostridium botulinum Types B Vaccine Candidate
Authors: Mi-Hye Hwang, Young Min Son, Kichan Lee, Bang-Hun Hyun, Byeong Yeal Jung
Abstract:
Botulism is a paralytic disease of humans and animals caused by neurotoxins produced by Clostridium botulinum. The neurotoxins are genetically distinguished into 8 types, A to H. Ingestion of preformed toxin, usually of types B, C, and D, produces disease in most cases of cattle botulism. Vaccination is the best measure to prevent cattle botulism. However, the commercially available toxoid-based vaccines are difficult and hazardous to produce. We produced a recombinant protein from the gene of the heavy-chain domain of botulinum toxin B, which binds to the cellular receptor of neurons, and used it as an immunogen. In this study, we evaluated the safety and efficacy of a botulism vaccine composed of the recombinant type B protein. The safety test was done according to the National Regulation for Veterinary Biologicals. For the efficacy test, female ICR mice (5 weeks old) were subcutaneously injected, intraperitoneally challenged, and the survival rates of the vaccination and non-vaccination groups were compared. The mouse survival rate of the recombinant type B vaccine was above 80%, while that of the non-vaccination group was 0%. A vaccine composed of the recombinant type B protein was safe and efficacious in mice. Our results suggest that the recombinant heavy-chain receptor-binding domain can be used as an effective vaccine candidate for type B botulism.
Keywords: botulism, livestock, vaccine, recombinant protein, toxin
Procedia PDF Downloads 239
1363 Starch Valorization: Biorefinery Concept for the Circular Bioeconomy
Authors: Maider Gómez Palmero, Ana Carrasco Pérez, Paula de la Sen de la Cruz, Francisco Javier Royo Herrer, Sonia Ascaso Malo
Abstract:
The production of bio-based products for different purposes is one of the strategies that has grown the most at the European and even the global level, seeking to contribute to mitigating the impacts associated with climate change and to achieving the ambitious objectives set in this regard. However, the substitution of fossil-based products with bio-based products requires a challenging and deep transformation and adaptation of the secondary and primary sectors and, more specifically within the latter, of the agro-industries. The first step in developing a bio-based value chain is the availability of a resource with the right characteristics for the substitution sought. This, in turn, requires a significant reshaping not only of the forestry/agricultural sector but also of the agro-industry, which has relevant potential to act as a supplier, to develop a robust logistical supply chain, and to market a bio-based raw material at a competitive price. However, this transformation may involve a profound restructuring of its traditional business model to incorporate biorefinery concepts. In this sense, agro-industries whose processes generate by-products that are currently not valorized, such as potato processing rejects or the starch found in washing water, hold a potential raw material that can be used for different bio-applications. This article aims to explore this potential in order to evaluate the most suitable bio-applications to target and to identify opportunities and challenges.
Keywords: starch valorisation, biorefinery, bio-based raw materials, bio-applications
Procedia PDF Downloads 51
1362 Tornado Disaster Impacts and Management: Learning from the 2016 Tornado Catastrophe in Jiangsu Province, China
Authors: Huicong Jia, Donghua Pan
Abstract:
As a key component of disaster reduction management, disaster emergency relief and reconstruction is an important process. Based on disaster system theory, this study analyzed the Jiangsu tornado from the formation mechanism of the disaster through to the economic losses, loss of life, and social infrastructure losses along the tornado disaster chain. The study then assessed the emergency relief and reconstruction efforts based on an analytic hierarchy process method. The results were as follows: (1) An unstable weather system was the root cause of the tornado. The potentially hazardous local environment, acting in concert with the terrain and the river network, was able to gather energy from the unstable atmosphere. The wind belt passed through a densely populated district with vulnerable infrastructure and other hazard-prone elements, which led to an accumulative disaster situation and the triggering of a catastrophe. (2) The tornado was accompanied by a hailstorm, which is an important triggering factor for a tornado catastrophe chain reaction. (3) The evaluation index (EI) of the emergency relief and reconstruction effect for the "6.23" tornado disaster in Yancheng was 91.5. Compared with other relief work in areas affected by disasters of the same magnitude, the response was more successful than previously experienced. The results provide new insights for studies of disaster systems and of recovery measures in response to tornado catastrophes in China.
Keywords: China, disaster system, emergency relief, tornado catastrophe
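An analytic hierarchy process evaluation of this kind can be sketched as follows: pairwise comparison judgements yield priority weights via the principal eigenvector, a consistency ratio validates the judgements, and a weighted sum of sub-indicator scores gives the index. The criteria, judgements, and scores below are invented for illustration, not the study's data.

```python
import numpy as np

# Illustrative AHP sketch for three assumed relief-effectiveness criteria
# (e.g. response speed, coverage, reconstruction quality), with Saaty-scale
# pairwise judgements that are placeholders, not the paper's values.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights: principal eigenvector of A, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix);
# judgements are acceptable when CR < 0.1.
lam_max = vals.real[k]
ci = (lam_max - 3) / (3 - 1)
cr = ci / 0.58

# Weighted sum of sub-indicator scores (0-100) yields the evaluation index.
scores = np.array([95.0, 90.0, 85.0])
ei = float(w @ scores)
print(w, cr, ei)
```

With these placeholder inputs the index lands in the low nineties, the same order as the reported EI of 91.5, though the actual hierarchy and scores differ.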
Procedia PDF Downloads 270
1361 Technological Innovations as a Potential Vehicle for Supply Chain Integration on Basic Metal Industries
Authors: Alie Wube Dametew, Frank Ebinger
Abstract:
This study investigated the role of technological innovation in basic metal industries and then developed a technological innovation framework for enhancing sustainable competitive advantage in these industries. Previous research indicates that technological innovation has a critical impact in helping local industries improve their performance and achieve sustainable competitive positions. Field observations, questionnaires, and expert interviews in basic metal industries indicate that the technological capability of local industries to invent, adopt, modify, improve, and use a given innovative technology is very poor. This poor technological innovation stems from an improper innovation and technology transfer framework, a non-collaborative operating environment between foreign and local industries, very weak national technology policies, and problems with research and innovation centers; these common weak points of basic metal industry innovation systems were investigated in this study. One conclusion of the article is that, by using the technological innovation framework developed in this study, basic metal industries can improve their innovation processes, support an innovative culture for sector capabilities, and achieve sustainable competitive advantage.
Keywords: technological innovation, competitive advantage, sustainable, basic metal industry, conceptual model, sustainability, supply chain integration
Procedia PDF Downloads 245
1360 Propagation of Ultra-High Energy Cosmic Rays through Extragalactic Magnetic Fields: An Exploratory Study of the Distance Amplification from Rectilinear Propagation
Authors: Rubens P. Costa, Marcelo A. Leigui de Oliveira
Abstract:
The comprehension of features of the energy spectra, the chemical compositions, and the origins of Ultra-High Energy Cosmic Rays (UHECRs) - mainly atomic nuclei with energies above ~1.0 EeV (exa-electron volts) - is intrinsically linked to the problem of determining the magnitude of their deflections in cosmic magnetic fields on cosmological scales. In addition, as they propagate from the source to the observer, modifications are expected in their original energy spectra, anisotropy, and chemical compositions due to interactions with low-energy photons and matter. This means that any consistent interpretation of the nature and origin of UHECRs has to include detailed knowledge of their propagation in a three-dimensional environment, taking into account the magnetic deflections and energy losses. The parameter space for the magnetic fields in the universe is very large because the field strengths and especially their orientations have large uncertainties. In particular, the strength and morphology of the Extragalactic Magnetic Fields (EGMFs) remain largely unknown because of the intrinsic difficulty of observing them. Monte Carlo simulations of charged particles traveling through a simulated magnetized universe are the straightforward way to study the influence of extragalactic magnetic fields on UHECR propagation. However, this brings two major difficulties: an accurate numerical modeling of charged-particle diffusion in magnetic fields, and an accurate numerical modeling of the magnetized universe. Since magnetic fields do not cause energy losses, it is important to impose that the particle tracking method conserves the particle's total energy and that energy changes result only from interactions with background photons. Hence, special attention should be paid to computational effects.
Additionally, because of the number of particles necessary to obtain a relevant statistical sample, the particle tracking method must be computationally efficient. In this work, we present an analysis of the propagation of ultra-high energy charged particles in the intergalactic medium. The EGMFs are considered to be coherent within cells of 1 Mpc (megaparsec) diameter, wherein they have uniform intensities of 1 nG (nanogauss). Moreover, each cell has its field orientation randomly chosen, and a border region is defined such that at distances beyond 95% of the cell radius from the cell center smooth transitions are applied in order to avoid discontinuities. The smooth transitions are simulated by weighting the magnetic field orientation by the particle's distance to the two nearby cells. The energy losses have been treated in the continuous approximation, parameterizing the mean energy loss per unit path length by the energy loss length. For a particle with the typical energy of interest, we have shown the performance of the integration method in terms of the relative error of the Larmor radius (without energy losses) and the relative error of the energy. Additionally, we plotted the distance amplification from rectilinear propagation as a function of the traveled distance, of the particle's magnetic rigidity (without energy losses), and of the particle's energy (with energy losses), to study the influence of the particle species on these calculations. The results clearly show when it is necessary to use a full three-dimensional simulation.
Keywords: cosmic ray propagation, extragalactic magnetic fields, magnetic deflections, ultra-high energy
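A minimal sketch of cell-based propagation (without energy losses and without the border-region smoothing described above): the direction vector is rotated about the local cell field, which conserves the particle's speed and hence its energy, and the distance amplification is the ratio of path length to straight-line displacement. The step size, path length, and the approximate Larmor-radius formula are assumptions for illustration, not the paper's scheme.

```python
import numpy as np

rng = np.random.default_rng(3)

# UHECR proton in 1 Mpc cells with uniform 1 nG fields of random
# orientation, matching the configuration described in the abstract.
E_EeV = 10.0                  # particle energy in EeV
B_nG = 1.0                    # field strength in nG
# Approximate Larmor radius for a proton: r_L ~ 1.1 * E[EeV] / B[nG] Mpc.
r_larmor = 1.1 * E_EeV / B_nG

cell = 1.0                    # cell size in Mpc
step = 0.01                   # integration step in Mpc
n_steps = 5000                # total path: 50 Mpc

pos = np.zeros(3)
v = np.array([1.0, 0.0, 0.0])  # unit direction vector
field_cache = {}

def cell_field(p):
    """Random but fixed unit field orientation per 1 Mpc cell."""
    key = tuple(np.floor(p / cell).astype(int))
    if key not in field_cache:
        b = rng.normal(size=3)
        field_cache[key] = b / np.linalg.norm(b)
    return field_cache[key]

for _ in range(n_steps):
    b = cell_field(pos)
    # Rotate the direction about b by the deflection angle step/r_L;
    # a pure rotation conserves |v| and hence the particle energy.
    theta = step / r_larmor
    v_par = np.dot(v, b) * b
    v_perp = v - v_par
    v = v_par + np.cos(theta) * v_perp + np.sin(theta) * np.cross(b, v_perp)
    v /= np.linalg.norm(v)    # guard against rounding drift
    pos += step * v

path = n_steps * step
amplification = path / np.linalg.norm(pos)
print(amplification)
```

Because the rotation leaves the speed unchanged, any energy change in a full simulation comes only from the photon-interaction terms, as the abstract requires.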
Procedia PDF Downloads 127
1359 A Life Cycle Assessment (LCA) of Aluminum Production Process
Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour
Abstract:
The production of aluminium alloys and ingots - starting from the processing of alumina to aluminium and ending with the final cast product - was studied using a Life Cycle Assessment (LCA) approach. The studied aluminium supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminium metal were investigated. The impact of aluminium production was assessed in eight impact categories. The results showed that the power plant had the highest impact in almost all categories; only for Human Toxicity Potential (HTP) did the reduction plant have the highest impact, and for Marine Aquatic Eco-Toxicity Potential (MAETP) the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of Acidification Potential (AP). The carbon plant had a positive impact on the environment with respect to Eutrophication Potential (EP) due to the production of clean water in the process. The natural-gas-based power plant used in the case study had 8.4 times less negative impact on the environment than a heavy-fuel-based power plant and 10.7 times less than a hard-coal-based power plant.
Keywords: life cycle assessment, aluminium production, supply chain, ecological impacts
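The per-plant attribution of impact categories amounts to summing inventory contributions per category and finding the dominant contributor. A minimal sketch with placeholder numbers, chosen only to reproduce the qualitative pattern reported above, not the study's inventory:

```python
# Minimal LCA-style aggregation sketch: per-plant contributions to impact
# categories for 1 t of aluminium. All numbers are placeholders.
impacts = {
    "carbon_plant":    {"GWP": 1.8, "AP": 0.010, "HTP": 0.20},
    "reduction_plant": {"GWP": 2.9, "AP": 0.012, "HTP": 0.90},
    "casting_plant":   {"GWP": 0.4, "AP": 0.002, "HTP": 0.05},
    "power_plant":     {"GWP": 9.5, "AP": 0.030, "HTP": 0.60},
}

# Total burden per impact category across the supply chain.
totals = {}
for plant, cats in impacts.items():
    for cat, val in cats.items():
        totals[cat] = totals.get(cat, 0.0) + val

# Identify which plant dominates each category.
dominant = {cat: max(impacts, key=lambda p: impacts[p][cat]) for cat in totals}
print(totals)
print(dominant)
```

Here the power plant dominates GWP and AP while the reduction plant dominates HTP, mirroring the pattern in the abstract.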
Procedia PDF Downloads 532
1358 4-DOF Parallel Mechanism for Minimally Invasive Robotic Surgery
Authors: Khalil Ibrahim, Ahmed Ramadan, Mohamed Fanni, Yo Kobayashi, Ahmed Abo-Ismail, Masakatus G. Fujie
Abstract:
This paper deals with the design process and the dynamic control simulation of a new type of 4-DOF parallel mechanism that can be used as an endoscopic surgical manipulator. The proposed mechanism, 2-PUU_2-PUS, is designed based on screw theory and the parallel virtual chain type synthesis method. Based on the structural analysis of the 4-DOF parallel mechanism, the inverse position equation is studied using the inverse analysis theory of kinematics. The design and the stress analysis of the mechanism are investigated using SolidWorks software. The virtual prototype of the parallel mechanism is constructed, and the dynamic simulation is performed using ADAMS software. The system model utilizing PID and PI controllers has been built using MATLAB software. A more realistic simulation, in accordance with a given bending angle and point-to-point control, is implemented by the combined use of ADAMS and MATLAB. The simulation results showed that this control method solves the coordinated control of the 4-DOF parallel manipulator, so that each output is fed back to the four driving rods. From the results, the tracking performance is achieved. Other control techniques, such as intelligent ones, are recommended to improve the tracking performance and reduce the numerical truncation error.
Keywords: parallel mechanisms, medical robotics, trajectory control, virtual chain type synthesis method
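The point-to-point PID control loop can be illustrated on a simple first-order plant standing in for one driving rod; the gains and the plant time constant below are illustrative assumptions, not the tuned values from the ADAMS/MATLAB co-simulation.

```python
# Minimal discrete PID sketch for point-to-point control of one driving
# rod, modelled as a first-order plant. Gains and constants are assumed.
def simulate_pid(setpoint=1.0, kp=2.0, ki=1.5, kd=0.1,
                 dt=0.01, steps=2000, tau=0.5):
    x = 0.0               # rod position
    integral = 0.0
    prev_err = setpoint - x
    for _ in range(steps):
        err = setpoint - x
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv  # PID control law
        prev_err = err
        # First-order plant response: dx/dt = (u - x) / tau.
        x += dt * (u - x) / tau
    return x

final = simulate_pid()
print(final)
```

The integral term drives the steady-state error to zero, which is why the rod position settles at the commanded setpoint; in the paper's setup the same feedback is applied to each of the four driving rods.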
Procedia PDF Downloads 468
1357 Linking Information Systems Capabilities for Service Quality: The Role of Customer Connection and Environmental Dynamism
Authors: Teng Teng, Christos Tsinopoulos
Abstract:
The purpose of this research is to explore the link between IS capabilities, customer connection, and quality performance in the service context, with an investigation of the impact of stable and dynamic firm environments. The application of Information Systems (IS) has had a significant effect on contemporary service operations. Firms invest in IS with the presumption that IS will facilitate operations processes and thereby improve performance. Yet, IS resources by themselves are not sufficiently 'unique'; it is therefore more useful and theoretically relevant to focus on the processes they affect. One such organisational process, which has attracted much research attention from supply chain management scholars, is the integration of customer connection: IS-enabled customer connection enhances communication and contact processes, and with such integration of customer resources comes greater success for the firm in developing a good understanding of customer needs and setting accurate customer expectations. Nevertheless, prior studies on IS capabilities have focused on either one specific type of technology or operationalised IS capabilities as a highly aggregated concept. Moreover, although conceptual frameworks have shown that customer integration is valuable in service provision, there is much to learn about the practices of integrating customer resources. In this research, IS capabilities are broken down into three dimensions based on the framework of Wade and Hulland: IT for supply chain activities (ITSCA), flexible IT infrastructure (ITINF), and IT operations shared knowledge (ITOSK); the focus is on their impact on the operational performance of service firms. With this background, this paper addresses the following questions: -How do IS capabilities affect the integration of customer connection and service quality? -What is the relationship between environmental dynamism and the relationship of customer connection and service quality?
A survey of 156 service establishments was conducted, and the data were analysed to determine the role of customer connection in mediating the effects of IS capabilities on firms' service quality. Confirmatory factor analysis was used to check convergent validity, and the structural model showed a good fit. The moderating effect of environmental dynamism on the relationship between customer connection and service quality was analysed. Results show that ITSCA, ITINF, and ITOSK have a positive influence on the degree of integration of customer connection. In addition, customer connection is positively related to service quality; this relationship is further emphasised when firms work in a dynamic environment. This research takes a step towards quelling concerns about the business value of IS, contributing to the development and validation of the measurement of IS capabilities in the service operations context. Additionally, it adds to the emerging body of literature linking customer connection to the operational performance of service firms. Managers of service firms should consider the strength of the mediating role of customer connection when investing in IT-related technologies and policies. In particular, service firms developing IS capabilities should simultaneously implement processes that encourage supply chain integration.
Keywords: customer connection, environmental dynamism, information systems capabilities, service quality, service supply chain
Procedia PDF Downloads 140
1356 AHP and TOPSIS Methods for Supplier Selection Problem in Medical Devices Company
Authors: Sevde D. Karayel, Ediz Atmaca
Abstract:
Supplier selection is vital because the competitiveness and performance of firms depend on right, rapid, and low-cost procurement. Given that competition is no longer between individual firms but between their supply chains, a firm's performance clearly depends not only on its own success but also on the success of every member of its supply chain. For this reason, firms want to work with suppliers that are cost effective, flexible in terms of demand, and able to deliver the quality level needed for customer satisfaction. However, because firms' expectations of suppliers are diverse and sometimes conflicting, supplier selection must be treated as a hard problem. In this study, the supplier selection problem is discussed for a critical component, used in almost all of the firm's products and subject to long supplier lead times, in a firm that produces medical devices. An analysis of the firm's current supplier selection policy indicates that selection is based on the purchasing department's experience and the general judgment of other authorized persons. Because selection is not based on analytical methods, it causes disruptions in production, lateness, and extra cost. To solve the problem, AHP and TOPSIS, multi-criteria decision making techniques that are effective, easy to implement, and able to analyze many criteria simultaneously, are used to make a selection among alternative suppliers.
Keywords: AHP-TOPSIS methods, multi-criteria decision making, supplier selection problem, supply chain management
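The TOPSIS ranking step described above can be sketched compactly: normalize the decision matrix, weight it, find the ideal and anti-ideal points, and score each alternative by its relative closeness to the ideal. The three hypothetical suppliers, the criteria, and the weights (which in the study would come from AHP) are illustrative placeholders, not data from the firm.

```python
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if higher is better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    # Ideal (best) and anti-ideal (worst) points per criterion.
    ideal = [(max if benefit[j] else min)(r[j] for r in v) for j in range(n)]
    worst = [(min if benefit[j] else max)(r[j] for r in v) for j in range(n)]
    scores = []
    for r in v:
        d_pos = math.sqrt(sum((r[j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((r[j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))   # closeness coefficient
    return scores

# Hypothetical suppliers scored on cost (lower is better), quality,
# and lead-time reliability (higher is better); assumed AHP weights.
scores = topsis(
    [[70, 8, 0.90], [55, 6, 0.95], [60, 9, 0.85]],
    weights=[0.5, 0.3, 0.2],
    benefit=[False, True, True],
)
best = max(range(3), key=lambda i: scores[i])
```

The supplier with the highest closeness coefficient is chosen; changing the AHP-derived weights directly shifts the ranking.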
Procedia PDF Downloads 264
1355 An Infrared Inorganic Scintillating Detector Applied in Radiation Therapy
Authors: Sree Bash Chandra Debnath, Didier Tonneau, Carole Fauquet, Agnes Tallet, Julien Darreon
Abstract:
Purpose: Inorganic scintillating dosimetry is the most recent promising technique for solving several dosimetric issues and providing quality assurance in radiation therapy. Despite several advantages, the major issue with scintillating detectors is the Cerenkov effect, typically induced in the visible emission range. In this context, the purpose of this work is to evaluate the performance of a novel infrared inorganic scintillator detector (IR-ISD) in radiation therapy treatment, to ensure a Cerenkov-free signal and the best match between the delivered and prescribed doses during treatment. Methods: A simple, small-scale infrared inorganic scintillating detector of 100 µm diameter, with a sensitive scintillating volume of 2×10⁻⁶ mm³, was developed. A prototype of the dose verification system was introduced based on the PTIR1470/F material (provided by Phosphor Technology®) used in the proposed IR-ISD. The detector was tested on an Elekta LINAC system tuned at 6 MV/15 MV and on a brachytherapy source (Ir-192) used in the patient treatment protocol. The associated dose rate was measured as a count rate (photons/s) using a highly sensitive photon counter (sensitivity ~20 ph/s). All measurements were performed in IBA water tank phantoms, following the international Technical Reports Series recommendations (TRS 381) for radiotherapy and the TG-43U1 recommendations for brachytherapy. The performance of the detector was tested through several dosimetric parameters such as PDD, beam profiling, Cerenkov measurement, dose linearity, dose rate linearity, repeatability, and scintillator stability. Finally, a comparative study is also shown using a reference microdiamond dosimeter, Monte-Carlo (MC) simulation, and data from recent literature. Results: This study highlights the complete removal of the Cerenkov effect, especially for small-field radiation beam characterization.
The detector provides a fully linear response with dose over the 4 cGy to 800 cGy range, independently of the field size selected, from 5 x 5 cm² down to 0.5 x 0.5 cm². Excellent repeatability (0.2% variation from average) with day-to-day reproducibility (0.3% variation) was observed. Measurements demonstrated that the ISD response is linear with dose rate (R² = 1) from 50 cGy/s to 1000 cGy/s. PDD profiles obtained in water present identical behavior, with a build-up maximum at a depth of 15 mm for the different small-field irradiations. Field profiles as small as 0.5 x 0.5 cm² were characterized, and the field cross profile presents a Gaussian-like shape. The standard deviation (1σ) of the scintillating signal remains within 0.02%, with a very low convolution effect thanks to the small sensitive volume. Finally, for brachytherapy, a comparison with MC simulations shows that, accounting for energy dependency, measurements agree within 0.8% down to a 0.2 cm source-to-detector distance. Conclusion: The proposed scintillating detector shows no Cerenkov radiation and performs efficiently across several radiation therapy measurement parameters. Therefore, it is anticipated that the IR-ISD system can be promoted for validation in direct clinical investigations, such as dose verification and quality control in the Treatment Planning System (TPS).
Keywords: IR-scintillating detector, dose measurement, micro-scintillators, Cerenkov effect
Procedia PDF Downloads 182
1354 Development of Alternative Fuels Technologies: Compressed Natural Gas Home Refueling Station
Authors: Szymon Kuczynski, Krystian Liszka, Mariusz Laciak, Andrii Oliinyk, Adam Szurlej
Abstract:
Compressed natural gas (CNG) represents an excellent compromise between the costs incurred and the availability of a technology that is proven and relatively easy to use in many areas of the automotive industry. This fuel causes a lower corrosion effect due to the lower content of products causing a potential difference on the walls of the engine system. Natural gas powered vehicles (NGVs) do not emit any substances that can contaminate water or land. The absence of carcinogenic substances in the gaseous fuel extends the life of the engine. In the longer term, it also contributes positively to waste management and disposal. Popularization of CNG-powered propulsion systems positively affects the reduction of emissions from heavy duty transport. For these reasons, CNG as a fuel stimulates considerable interest around the world. Over the last few years, technologies related to the use of natural gas as an engine fuel have been developed and improved, evolving from the prototype phase to industrial-scale implementation. The widespread availability of gaseous fuels has led to the development of a technology that allows CNG fuel to be refueled directly from the urban gas network into the vehicle tank (i.e., HYGEN CNG home refueling station, CNGHRS). Home refueling installations, although known for many years, are becoming increasingly important today. Until recently, the major obstacle to the sale of this technology was its relatively high capital expenditure compared with the later benefits. Home refueling systems allow the vehicle tank to be refueled with full control of fuel costs and refueling time. CNG home refueling stations (such as HYGEN) allow the gas value chain to overcome the dogma of a lack of refueling infrastructure, enabling gas companies to participate in the transportation market.
The technology is based on a single-stage hydraulic compressor (instead of multistage mechanical compressor technology), which compresses low-pressure gas from the distribution gas network to 200 bar for further use as a fuel for NGVs. This boosts the revenues and profits of gas companies by expanding their presence in the higher-margin energy sector.
Keywords: alternative fuels, CNG (compressed natural gas), CNG stations, NGVs (natural gas vehicles), gas value chain
Procedia PDF Downloads 201
1353 Mechanism of Action of New Sustainable Flame Retardant Additives in Polyamide 6,6
Authors: I. Belyamani, M. K. Hassan, J. U. Otaigbe, W. R. Fielding, K. A. Mauritz, J. S. Wiggins, W. L. Jarrett
Abstract:
We have investigated the flame-retardant efficiency of special new phosphate glass (P-glass) compositions, having different glass transition temperatures (Tg), with respect to the processing conditions of polyamide 6,6 (PA6,6) and the flame retardancy (FR) of the final hybrids. We have shown that the low-Tg P-glass composition (ILT 1) is a promising flame retardant for PA6,6 at concentrations of up to 15 wt. %, compared to the intermediate (IIT 3) and high (IHT 1) Tg P-glasses. Cone calorimetry data showed that ILT 1 decreased both the peak heat release rate and the total heat released from the PA6,6/ILT 1 hybrids, resulting in the efficient formation of a glassy char layer. These intriguing findings prompted us to address several questions concerning the mechanism of action of the different P-glasses studied. Phosphorus-based FR additives generally act during the combustion stage by enhancing the morphology of the char and the thermal shielding effect. However, the present work shows that P-glass-based FR additives act during melt processing of the PA6,6/P-glass hybrids. Dynamic mechanical analysis (DMA) revealed that the Tg of PA6,6/ILT 1 was significantly shifted to a lower temperature (~65 °C) and that another transition appeared at a higher temperature (~166 °C), indicating a strong interaction between PA6,6 and ILT 1. This was supported by a drop in the melting point and crystallinity of the PA6,6/ILT 1 hybrid material, as detected by differential scanning calorimetry (DSC). The dielectric spectroscopic investigation of the networks' molecular-level structural variations (i.e., hybrid chain motion, Tg and sub-Tg relaxations) agreed very well with the DMA and DSC findings; it was found that the three different P-glass compositions did not show any effect on the PA6,6 sub-Tg relaxations (related to the motions of the NH2 and OH chain end groups).
Nevertheless, contrary to the IIT 3 and IHT 1 based hybrids, the PA6,6/ILT 1 hybrid material showed evidence of a splitting of the PA6,6 Tg relaxation into two peaks. Finally, the CPMAS ³¹P-NMR data confirmed the miscibility between ILT 1 and PA6,6 at the molecular level, as a much larger enhancement in cross-polarization was observed for the PA6,6/15% ILT 1 hybrids. It can be concluded that compounding low-Tg P-glass (i.e., ILT 1) with PA6,6 facilitates hydrolytic chain scission of the PA6,6 macromolecules through a potential chemical interaction between the phosphate and the alpha-carbon of the amide bonds of the PA6,6, leading to better flame retardant properties.
Keywords: broadband dielectric spectroscopy, composites, flame retardant, polyamide, phosphate glass, sustainable
Procedia PDF Downloads 238
1352 Deterioration Prediction of Pavement Load Bearing Capacity from FWD Data
Authors: Kotaro Sasai, Daijiro Mizutani, Kiyoyuki Kaito
Abstract:
Expressways in Japan have been built at an accelerating pace since the 1960s with the aid of rapid economic growth. About 40 percent of the expressway network in Japan, by length, is now 30 years old or older and has become superannuated. Time-related deterioration has therefore reached a degree at which administrators, from the standpoint of operation and maintenance, are forced to take prompt, large-scale measures aimed at repairing inner damage deep in pavements. Such measures have already been implemented for bridge management in Japan and are also expected to be adopted for pavement management; planning methods for these measures are therefore increasingly in demand. Deterioration of the layers near the road surface, such as the surface course and binder course, occurs at the early stages of the whole pavement deterioration process, around 10 to 30 years after construction. These layers have been repaired first, both because inner damage usually becomes significant only after outer damage, and because surveys for measuring inner damage, such as Falling Weight Deflectometer (FWD) surveys and open-cut surveys, are costly and time-consuming, which has made it difficult for administrators to give inner damage the attention it requires. As expressways today carry serious time-related deterioration deriving from their long service lives, repairing the deeper pavement layers, such as the base course and subgrade, must clearly be taken into consideration when planning large-scale maintenance. This sort of maintenance requires precisely predicting degrees of deterioration as well as grasping the present condition of the pavements. Methods for predicting deterioration are either mechanistic or statistical. While few mechanistic models have been presented, as far as the authors know, previous studies have presented statistical methods for predicting deterioration in pavements.
One describes the deterioration process by estimating a Markov deterioration hazard model, while another does so by estimating a proportional deterioration hazard model. Both studies analyze deflection data obtained from FWD surveys and present statistical methods for predicting the deterioration process of the layers near the road surface. However, the base course and subgrade layers remain unanalyzed. In this study, data collected from FWD surveys are analyzed to predict the deterioration process of the deeper pavement layers in addition to the surface layers, by estimating a deterioration hazard model that uses continuous indexes. This model avoids the loss of information that occurs when rating categories are set in a Markov deterioration hazard model for evaluating degrees of deterioration in roadbeds and subgrades. By portraying continuous indexes, the model can predict the deterioration of each pavement layer and evaluate it quantitatively. Additionally, since the model can also depict the probability distribution of the indexes at an arbitrary point and allows a risk control level to be set arbitrarily, this study is expected to provide information, such as life cycle cost, to support decisions on where and when to perform maintenance.
Keywords: deterioration hazard model, falling weight deflectometer, inner damage, load bearing capacity, pavement
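For contrast with the continuous-index model proposed above, the rating-based Markov deterioration hazard model can be sketched as follows: each condition rating carries an exponential hazard rate, and (as a common simplification) at most one rating drop is assumed per inspection interval. The hazard rates, inspection interval, and three-rating scale below are illustrative assumptions, not estimates from FWD data.

```python
import math

def transition_matrix(theta, z):
    """One-interval transition probabilities for ratings 0..len(theta).
    theta[i]: exponential hazard rate of rating i; the worst rating is
    absorbing; at most one rating drop per interval is assumed."""
    n = len(theta) + 1
    p = [[0.0] * n for _ in range(n)]
    for i, rate in enumerate(theta):
        stay = math.exp(-rate * z)      # P(sojourn in rating i exceeds z)
        p[i][i] = stay
        p[i][i + 1] = 1.0 - stay
    p[n - 1][n - 1] = 1.0               # worst rating is absorbing
    return p

def propagate(dist, p, steps):
    """Evolve a rating distribution over several inspection intervals."""
    for _ in range(steps):
        dist = [sum(dist[i] * p[i][j] for i in range(len(p)))
                for j in range(len(p))]
    return dist

# Hypothetical hazard rates (1/year) for ratings 0 and 1, with a
# 2-year inspection interval; all sections start in the best rating.
P = transition_matrix([0.20, 0.35], z=2.0)
dist = propagate([1.0, 0.0, 0.0], P, steps=5)
```

The coarse rating scale is exactly where the information loss criticized in the abstract arises: any condition between two rating boundaries is mapped to the same state.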
Procedia PDF Downloads 390
1351 Adaption of the Design Thinking Method for Production Planning in the Meat Industry Using Machine Learning Algorithms
Authors: Alica Höpken, Hergen Pargmann
Abstract:
The resource-efficient planning of the complex production planning processes in the meat industry, and the reduction of food waste, is a permanent challenge. The complexity of the production planning process occurs in every part of the supply chain, from agriculture to the end consumer, and arises from long and uncertain planning phases. Uncertainties such as stochastic yields, fluctuations in demand, and resource variability are part of this process. In the meat industry, waste mainly relates to incorrect storage, technical causes in production, or overproduction. The large amount of food waste along the complex supply chain in the meat industry could not, until now, be reduced by simple solutions; resource-efficient production planning by conventional methods is therefore currently only partially feasible. The realization of intelligent, automated production planning is, in principle, possible through the application of machine learning algorithms, such as those of reinforcement learning. By applying the adapted design thinking method, machine learning methods (especially reinforcement learning algorithms) are used for the complex production planning process in the meat industry. The method concretizes the approach for this application area and makes a resource-efficient production planning process available. In addition, complex processes can be planned efficiently with this method, since the standardized approach offers new possibilities for dealing with the complexity and the high time consumption involved. It thus represents a tool to support efficient production planning in the meat industry. This paper shows an adaptation of the design thinking method that applies reinforcement learning to a resource-efficient production planning process in the meat industry. Subsequently, the steps necessary to introduce machine learning algorithms into the production planning of the food industry are determined.
This is achieved based on a case study which is part of the research project "REIF - Resource Efficient, Economic and Intelligent Food Chain", supported by the German Federal Ministry for Economic Affairs and Climate Action and the German Aerospace Center. Through this structured approach, significantly better planning results are achieved than would be possible, or feasible at all, with conventional methods, which are too complex or very time consuming.
Keywords: change management, design thinking method, machine learning, meat industry, reinforcement learning, resource-efficient production planning
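A minimal tabular reinforcement learning sketch of the planning idea above: an agent learns how much to produce each day so as to minimize waste (overproduction) and shortage. The environment, a deterministic three-day demand cycle with a symmetric penalty, is a toy assumption for illustration and is far simpler than the stochastic yields and demand of the REIF case study; the hyperparameters are likewise arbitrary.

```python
import random

random.seed(0)

DEMAND = [2, 4, 3]            # toy deterministic daily demand cycle
ACTIONS = range(6)            # produce 0..5 units per day
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2

# Q[s][a]: learned value of producing a units on day-type s.
Q = [[0.0] * len(ACTIONS) for _ in DEMAND]

state = 0
for _ in range(20000):
    # Epsilon-greedy action selection.
    if random.random() < EPS:
        action = random.choice(list(ACTIONS))
    else:
        action = max(ACTIONS, key=lambda a: Q[state][a])
    # Penalize both overproduction (waste) and shortage equally.
    reward = -abs(action - DEMAND[state])
    nxt = (state + 1) % len(DEMAND)
    # Standard Q-learning update.
    Q[state][action] += ALPHA * (reward + GAMMA * max(Q[nxt])
                                 - Q[state][action])
    state = nxt

# Greedy policy after training: planned production per day-type.
policy = [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(len(DEMAND))]
```

In a realistic setting the state would encode stock levels, shelf life, and forecasts, and the reward would price waste and shortage differently, but the update rule stays the same.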
Procedia PDF Downloads 128
1350 Designing the First Oil Tanker Shipyard Facility in Kuwait
Authors: Fatma Al Abdullah, Shahad Al Ameer, Ritaj Jaragh, Fatimah Khajah, Rawan Qambar, Amr Nounou
Abstract:
Kuwait currently has its tankers built in foreign countries. Oil tankers play a role in the supply chain of the oil industry; therefore, with its sufficient financial resources, Kuwait should secure itself strategically in order to protect its oil industry and sustain economic development. The purpose of this report is to design an oil tanker shipyard facility. Basing the shipyard facility in Kuwait will bring great economic rewards: the shipbuilding industry directly enhances the industrial chain in terms of new job and business opportunities as well as educational fields. Heavy Engineering Industries & Shipbuilding Co. K.S.C. (HEISCO) was chosen as a host because its existing infrastructure and expertise will reduce cost. The facility design methodology was chosen because it covers all aspects needed for the report. The oil tanker market is witnessing a shift from crude tankers to product tankers; therefore, the Panamax tanker (a product tanker) was selected to be manufactured in the facility. The departments needed in shipyards were identified by studying different global shipyards, and the technologies needed to build ships informed the process design. It was noted that ships are engineered to order. The development of the new layout of the proposed shipyard is currently in progress, and a feasibility study will be conducted to ensure the success of the facility after the layout is developed.
Keywords: oil tankers, shipbuilding, shipyard, facility design, Kuwait
Procedia PDF Downloads 466
1349 The Effects of Prebiotic, Probiotic and Synbiotic Diets Containing Bacillus coagulans and Inulin on Serum Lipid Profile in the Rat
Authors: Khadijeh Abhari, Seyed Shahram Shekarforoush, Saeid Hosseinzadeh
Abstract:
An in vivo trial was conducted to evaluate the effects of Bacillus coagulans and inulin, either separately or in combination, on the lipid profile using a rat model. Thirty-two male Wistar rats were randomly divided into four groups (n=8) and fed as follows: standard diet (control), standard diet with 5% w/w long-chain inulin (prebiotic), standard diet with 10⁹ spores/day of B. coagulans by orogastric gavage (probiotic), and standard diet with 5% w/w long-chain inulin and 10⁹ spores/day of B. coagulans (synbiotic). Rats were fed the treatments for 30 days. Serum samples were collected 10, 20, and 30 days after the onset of treatment, and total cholesterol, HDL and LDL cholesterol, and triglyceride concentrations were analyzed. The results showed that inulin potentially affected the lipid profile: an obvious decrease in the serum total cholesterol and LDL-cholesterol of rats fed inulin in the synbiotic and prebiotic groups was seen on all sampling days. Inulin-fed rats also demonstrated higher HDL-cholesterol concentrations, whereas this value remained without significant change in the probiotic and control groups. According to the results of this study, B. coagulans did not contribute to any lipid profile changes after 30 days. Thus, further in vitro investigation of the characteristics of these bacteria could be useful for understanding probiotic treatment and achieving the maximum beneficial effect.
Keywords: Bacillus coagulans, inulin, rat, lipid profile, synbiotic diet
Procedia PDF Downloads 409
1348 Re-Entrant Direct Hexagonal Phases in a Lyotropic System Induced by Ionic Liquids
Authors: Saheli Mitra, Ramesh Karri, Praveen K. Mylapalli, Arka. B. Dey, Gourav Bhattacharya, Gouriprasanna Roy, Syed M. Kamil, Surajit Dhara, Sunil K. Sinha, Sajal K. Ghosh
Abstract:
The best-known structures of lyotropic liquid crystalline systems are the two-dimensional hexagonal phase of cylindrical micelles, with a positive interfacial curvature, and the lamellar phase of flat bilayers, with zero interfacial curvature. In aqueous surfactant solutions, concentration-dependent phase transitions have been investigated extensively. However, instead of changing the surfactant concentration, the local curvature of an aggregate can be altered by tuning the electrostatic interactions among the constituent molecules. Intermediate phases with non-uniform interfacial curvature are still unexplored steps on the route of the phase transition from hexagonal to lamellar. Understanding such structural evolution in lyotropic liquid crystalline systems is important because it determines the complex rheological behavior of the system, which is one of the main interests of the soft matter industry. Sodium dodecyl sulfate (SDS) is an anionic surfactant and can be considered a unique system for tuning the electrostatics with cationic additives. In the present study, imidazolium-based ionic liquids (ILs) with different numbers of carbon atoms in their single hydrocarbon chain were used as additives in aqueous SDS solutions. At a fixed concentration of the total non-aqueous components (SDS and IL), the molar ratio of these components was changed, which effectively altered the electrostatic interactions between the SDS molecules. As a result, the local curvature is modified, and correspondingly, the structure of the hexagonal liquid crystalline phase is transformed into other phases. Polarizing optical microscopy of the SDS and imidazolium-IL systems exhibited different textures of the liquid crystalline phases as a function of increasing IL concentration.
The small angle synchrotron x-ray diffraction (SAXD) study indicated that the hexagonal phase of direct cylindrical micelles transforms to a rectangular phase in the presence of the short-chain (two-carbon) IL. However, the hexagonal phase is transformed into a lamellar phase in the presence of the long-chain (ten-carbon) IL. Interestingly, in the presence of a medium-chain (four-carbon) IL, the hexagonal phase is transformed into another hexagonal phase of direct cylindrical micelles through the lamellar phase. To the best of our knowledge, such a phase sequence has not been reported earlier. Even though the small angle x-ray diffraction study revealed lattice parameters similar to each other, the rheological behavior of these phases has been distinctly different, and the rheological studies have shed light on how these phases differ in their viscoelastic behavior. Finally, the packing parameters, calculated for these phases based on the geometry of the aggregates, explain the formation of the self-assembled aggregates.
Keywords: lyotropic liquid crystals, polarizing optical microscopy, rheology, surfactants, small angle x-ray diffraction
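The packing-parameter argument in the closing sentence can be sketched numerically: the critical packing parameter is p = v / (a0 · lc), with the chain volume v and extended chain length lc estimated from Tanford's formulas. The SDS headgroup area a0 below is an assumed illustrative value; screening of the headgroup repulsion by a cationic IL would shrink a0 and raise p, pushing cylinders (p between 1/3 and 1/2) toward bilayers (p between 1/2 and 1), consistent with the hexagonal-to-lamellar route discussed above.

```python
def packing_parameter(n_carbons, a0):
    """Tanford estimates for a saturated chain, in angstrom units."""
    v = 27.4 + 26.9 * n_carbons     # chain volume, A^3
    lc = 1.5 + 1.265 * n_carbons    # fully extended chain length, A
    return v / (a0 * lc)

def aggregate_shape(p):
    """Classical geometric classification by packing parameter."""
    if p < 1 / 3:
        return "spherical micelles"
    if p < 1 / 2:
        return "cylindrical micelles (hexagonal phase)"
    if p <= 1:
        return "bilayers (lamellar phase)"
    return "inverted structures"

# SDS: C12 tail; a0 ~ 62 A^2 is an assumed headgroup area.
p_sds = packing_parameter(12, a0=62.0)
```

With these assumed numbers p comes out just above 1/3, reproducing the cylindrical geometry of the parent hexagonal phase.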
Procedia PDF Downloads 138
1347 Detection of Arcobacter and Helicobacter pylori Contamination in Organic Vegetables by Cultural and Polymerase Chain Reaction (PCR) Methods
Authors: Miguel García-Ferrús, Ana González, María A. Ferrús
Abstract:
The most demanded organic foods worldwide are those that are consumed fresh, such as fruits and vegetables. However, there is a knowledge gap about some aspects of the microbiological quality and safety of organic food. Organic fruits and vegetables are more exposed to pathogenic microorganisms due to surface contact with natural fertilizers such as animal manure, wastes, and vermicompost used during farming. It has been suggested that some emergent pathogens, such as Helicobacter pylori or Arcobacter spp., could reach humans through the consumption of raw or minimally processed vegetables. Therefore, the objective of this work was to study the contamination of organic fresh green leafy vegetables by Arcobacter spp. and Helicobacter pylori. For this purpose, a total of 24 vegetable samples, 13 lettuce and 11 spinach, were acquired from 10 different ecological supermarkets and greengrocers and analyzed by culture and PCR. Arcobacter spp. was detected in 5 samples (20%) by PCR, 4 spinach and one lettuce; one spinach sample was also found positive by culture. For H. pylori, the VacA gene-specific band was detected in 12 vegetable samples (50%), 10 lettuce and 2 spinach. Isolation on the selective medium did not yield any positive result, possibly because of low contamination levels together with the presence of the organism in its viable but non-culturable form. The results showed significant levels of H. pylori and Arcobacter contamination in organic vegetables that are generally consumed raw, which seems to confirm that these foods can act as transmission vehicles to humans.
Keywords: Arcobacter sp., Helicobacter pylori, organic vegetables, polymerase chain reaction (PCR)
Procedia PDF Downloads 164
1346 Engineering Packaging for a Sustainable Food Chain
Authors: Ezekiel Olukayode Akintunde
Abstract:
Inadequate methods are prevalent at all levels of food supply in the global food industry, and they have led to vast wastage of food. This wastage must be curbed, since it can later affect natural resources, water resources, and energy, with negative impacts on the climate and the environment. Multifaceted engineering packaging approaches are needed for a sustainable food chain, ensuring active packaging, intelligent packaging, new packaging materials, and a sustainable packaging system. Packaging can be regarded as an indispensable approach to solving major problems of sustainable food consumption globally, namely controlling the environmental impact of packaged food. Creative innovation will ensure that packaged foods are free from food-borne disease and chemical pollution. This paper evaluates the key shortcomings that innovative food packaging must address to ensure a safe natural environment that preserves energy and sustains water resources. Certain solutions, including the fabrication of biodegradable microbial compounds/polymers from agro-food waste remnants, appear to be a promising path towards a strong and innovative waste-based food packaging system. Over the years, the depletion of petroleum reserves has brought about the emergence of biodegradable polymers as a proper replacement for traditional plastics; moreover, the increase in the production of traditional plastics has raised serious concerns about environmental threats. Biodegradable polymers have proven to be biocompatible and can also be processed for other useful applications. Therefore, this study presents a workable guiding framework for designing a sustainable food packaging system that does not constitute a danger to present society and that preserves natural water resources.
Various assessment methods will be deployed at different stages of the packaging design to enhance the package's sustainability. Every decision made must be supported by the methods engaged at each stage, allowing for corrective measures throughout the design cycle, together with a basic performance appraisal of packaging innovations. Food wastage can have inimical environmental impacts, and ethical practices must address food loss at home. An examination in West Africa quantified preventable food wastage over the entire food value chain at almost 180 kg per person per year, 35% of which originated at the household level. Many of the reported food losses, occurring at the harvesting, storage, transportation, and processing stages, are not preventable and have little environmental impact because such wastage can be used for feed. Other surveys have shown that 15%-20% of household food losses can be traced to food packaging. Therefore, new innovative packaging systems can lessen the environmental effect of food wastage by extending shelf life, lowering food loss in the distribution chain and at the household level.
Keywords: food packaging, biodegradable polymer, intelligent packaging, shelf-life
Procedia PDF Downloads 571
1345 Analysis of Vibration and Shock Levels during Transport and Handling of Bananas within the Post-Harvest Supply Chain in Australia
Authors: Indika Fernando, Jiangang Fei, Roger Stanley, Hossein Enshaei
Abstract:
Delicate produce such as fresh fruit is increasingly susceptible to physiological damage during essential post-harvest operations such as transport and handling. Vibration and shock during distribution are identified causes of produce damage within post-harvest supply chains. Mechanical damage caused during transit may significantly diminish the quality of fresh produce and result in substantial wastage. Bananas are one of the staple fruit crops and the most sold supermarket produce in Australia. They are also the largest horticultural industry in the state of Queensland, where 95% of the country's bananas are cultivated. This results in significantly lengthy interstate supply chains in which fruit is exposed to prolonged vibration and shocks. This paper focuses on determining the shock and vibration levels experienced by packaged bananas during transit from the farm gate to the retail market. Tri-axis acceleration data were captured by custom-made accelerometer-based data loggers set to a predetermined sampling rate of 400 Hz. The devices recorded data continuously for 96 hours over the interstate journey of nearly 3,000 km from the growing fields in far north Queensland to the central distribution centre in Melbourne, Victoria. After the bananas were ripened at the ripening facility in Melbourne, the data loggers were used to capture the transport and handling conditions from the central distribution centre to three retail outlets on the outskirts of Melbourne. The quality of the bananas was assessed before and after transport at each location along the supply chain. Time-series vibration and shock data were used to determine the frequency and severity of the transient shocks experienced by the packages. A frequency spectrogram was generated to determine the dominant frequencies within each segment of the post-harvest supply chain.
Root mean square (RMS) acceleration levels were calculated to characterise the vibration intensity during transport. Data were further analysed by Fast Fourier Transform (FFT), and Power Spectral Density (PSD) profiles were generated to determine the critical frequency ranges, revealing the frequency bands in which elevated energy levels were transferred to the packages. It was found that vertical vibration was the highest and that acceleration levels mostly oscillated between ±1 g during transport. Several shock responses exceeding this range were recorded, mostly attributed to package handling. These detrimental high-impact shocks may eventually lead to mechanical damage in bananas, such as impact bruising, compression bruising, and neck injuries, which affect their freshness and visual quality. It was revealed that the frequency ranges of 0-5 Hz and 15-20 Hz exert an elevated level of vibration energy on the packaged bananas, which may result in abrasion damage such as scuffing, fruit rub, and blackened rub. Further research is indicated, especially in the identified critical frequency ranges, to minimise the exposure of fruit to the harmful effects of vibration. Improving handling conditions and further studying package failure mechanisms under transient shock excitation will be crucial to improving the visual quality of bananas within the post-harvest supply chain in Australia.
Keywords: bananas, handling, post-harvest, supply chain, shocks, transport, vibration
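The RMS and PSD computations described in the abstract can be sketched in a few lines. This is a minimal illustration using NumPy and SciPy with a synthetic single-axis trace standing in for a real logger recording; the 400 Hz rate matches the loggers described, but the signal parameters, `nperseg` choice, and function names are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.signal import welch

FS = 400  # Hz, the sampling rate of the described data loggers

def rms_acceleration(a):
    """RMS of an acceleration trace (same units as the input, e.g. g)."""
    a = np.asarray(a, dtype=float)
    return float(np.sqrt(np.mean(a ** 2)))

def psd_profile(a, fs=FS):
    """Power spectral density via Welch's method (units^2 / Hz)."""
    freqs, psd = welch(a, fs=fs, nperseg=1024)
    return freqs, psd

# Synthetic vertical-axis trace: a 4 Hz road-induced vibration component
# (inside the 0-5 Hz critical band noted in the abstract) plus sensor noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
accel = 0.5 * np.sin(2 * np.pi * 4 * t) + 0.05 * rng.standard_normal(t.size)

rms = rms_acceleration(accel)              # vibration intensity
freqs, psd = psd_profile(accel)
dominant_hz = freqs[np.argmax(psd)]        # dominant frequency of the segment
```

On a real dataset, the same two functions would be applied per axis and per supply-chain segment, with the PSD peaks compared against the 0-5 Hz and 15-20 Hz bands the study flags.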
Procedia PDF Downloads 190
1344 Production of Camel Nanobodies against Morphine-3-Glucuronide for the Development of a Biosensor for Detecting Illicit Drugs
Authors: Shirin Jalili, Sadegh Hasannia, Hadi Shirzad, Afshin Khara
Abstract:
Morphine is one of the most medicinally important analgesics and narcotics. Structurally, it is classified as an alkaloid because of the presence of nitrogen, and its structure is similar to that of codeine, thebaine, and heroin. An immunoassay that accurately discriminates between these analogous alkaloids would be highly beneficial. A key factor for such an assay is specificity with high sensitivity, which depends entirely on the antibody employed. However, most antibodies against haptens are polyclonal serum antibodies that exhibit significant cross-reactivities with closely related compounds. Camel-derived single-chain antibody fragments (VHH) are the smallest molecules with antigen-binding capacity and possess unique properties compared to conventional antibodies. In this study, a library containing the VHH genes of a camel immunized with morphine-conjugated BSA was generated using phage display technology. By screening this camel-derived heavy-chain variable-region cDNA phage display library for the ability to bind the desired hapten, we obtained nanobodies that recognize it. Phage display expression of the Nbs from this library and panning against this hapten resulted in a clear enrichment of four distinct Nb-displaying phages with specificity for morphine, which could provide a basis for new strategies in developing a biosensor for detecting illicit drugs.
Keywords: phage display, nanobody, morphine-3-glucuronide, ELISA, biosensor
Procedia PDF Downloads 425
1343 On the Importance of Quality, Liquidity Level and Liquidity Risk: A Markov-Switching Regime Approach
Authors: Tarik Bazgour, Cedric Heuchenne, Danielle Sougne
Abstract:
We examine time variation in the market beta of portfolios sorted on quality, liquidity level, and liquidity beta characteristics across stock market phases. Using US stock market data for the period 1970-2010, we find, first, that the US stock market was driven by four regimes. Second, during the crisis regime, low (high) quality, high (low) liquidity beta, and illiquid (liquid) stocks exhibit an increase (a decrease) in their market betas. This finding is consistent with the flight-to-quality and flight-to-liquidity phenomena. Third, we document the same pattern across stocks when market volatility is low. We argue that, during low-volatility times, investors shift their portfolios towards low-quality and illiquid stocks to seek portfolio gains; the pattern observed in the tranquil regime can therefore be explained by a flight to low quality and illiquidity. Finally, our results reveal that liquidity level is more important than liquidity beta during the crisis regime.
Keywords: financial crises, quality, liquidity, liquidity risk, regime-switching models
Procedia PDF Downloads 404