Search results for: manufacturing optimization
802 Estimation of Service Quality and Its Impact on Market Share Using Business Analytics
Authors: Haritha Saranga
Abstract:
Service quality has become an important driver of competition in manufacturing industries of late, as many products are being sold in conjunction with service offerings. With the increase in computational power and data-capture capabilities, it has become possible to analyze and estimate various aspects of service quality at the granular level and determine their impact on business performance. In the current study, dealer-level, model-wise warranty data from one of the top two-wheeler manufacturers in India are used to estimate the service quality of individual dealers and its impact on warranty-related costs and sales performance. We collected primary data on warranty costs, number of complaints, monthly sales, type of quality upgrades, etc., from the two-wheeler automaker. In addition, we gathered secondary data on various regions in India, such as petrol and diesel prices and the geographic and climatic conditions of the regions where the dealers are located, to control for customer usage patterns. We analyze these primary and secondary data with the help of a variety of analytics tools, such as the Auto-Regressive Integrated Moving Average (ARIMA), Seasonal ARIMA, and ARIMAX models. Study results, after controlling for a variety of factors, such as the size, age, and region of the dealership and customer usage patterns, show that service quality influences product sales in a significant manner. A more nuanced analysis reveals the dynamics between product quality and service quality, and how their interaction affects sales performance in the Indian two-wheeler industry context. We also provide various managerial insights using descriptive analytics and build a model that can provide sales projections using a variety of forecasting techniques.
Keywords: service quality, product quality, automobile industry, business analytics, auto-regressive integrated moving average
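As a hedged illustration of the forecasting machinery the abstract names (ARIMAX, i.e., autoregression with exogenous regressors), the sketch below fits a minimal ARX(1) model of monthly sales with a service-quality score as the exogenous input, using ordinary least squares. The data, coefficient names, and fitting routine are illustrative assumptions, not the study's actual model or data.

```python
# Hypothetical sketch: sales[t] = a + b*sales[t-1] + c*quality[t], fitted by OLS.
# All numbers are illustrative, not from the study.

def fit_arx1(sales, quality):
    """Fit sales[t] = a + b*sales[t-1] + c*quality[t] via the normal equations."""
    n = len(sales) - 1
    # Design matrix rows: [1, sales[t-1], quality[t]]
    X = [[1.0, sales[t - 1], quality[t]] for t in range(1, len(sales))]
    y = sales[1:]
    # Build the 3x3 normal equations (X^T X) beta = X^T y.
    XtX = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)] for r in range(3)]
    Xty = [sum(X[i][r] * y[i] for i in range(n)) for r in range(3)]
    # Solve by Gauss-Jordan elimination with partial pivoting.
    M = [XtX[r] + [Xty[r]] for r in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [M[r][k] - f * M[col][k] for k in range(4)]
    return [M[r][3] / M[r][r] for r in range(3)]

def forecast_next(beta, last_sale, next_quality):
    """One-step-ahead sales projection from the fitted coefficients."""
    a, b, c = beta
    return a + b * last_sale + c * next_quality
```

In practice a library routine (e.g., a SARIMAX-style estimator) would replace the hand-rolled solver; the point here is only the structure of the model.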
Procedia PDF Downloads 122
801 Optimizing the Residential Design Process Using Automated Technologies
Authors: Martin Georgiev, Milena Nanova, Damyan Damov
Abstract:
Architects, engineers, and developers need to analyse and implement a wide spectrum of data in different formats, if they want to produce viable residential developments. Usually, this data comes from a number of different sources and is not well structured. The main objective of this research project is to provide parametric tools working with real geodesic data that can generate residential solutions. Various codes, regulations and design constraints are described by variables and prioritized. In this way, we establish a common workflow for architects, geodesists, and other professionals involved in the building and investment process. This collaborative medium ensures that the generated design variants conform to various requirements, contributing to a more streamlined and informed decision-making process. The quantification of distinctive characteristics inherent to typical residential structures allows a systematic evaluation of the generated variants, focusing on factors crucial to designers, such as daylight simulation, circulation analysis, space utilization, view orientation, etc. Integrating real geodesic data offers a holistic view of the built environment, enhancing the accuracy and relevance of the design solutions. The use of generative algorithms and parametric models offers high productivity and flexibility of the design variants. It can be implemented in more conventional CAD and BIM workflow. Experts from different specialties can join their efforts, sharing a common digital workspace. In conclusion, our research demonstrates that a generative parametric approach based on real geodesic data and collaborative decision-making could be introduced in the early phases of the design process. 
This gives designers powerful tools to explore diverse design possibilities, significantly improving the quality of the building investment during its entire lifecycle.
Keywords: architectural design, residential buildings, urban development, geodesic data, generative design, parametric models, workflow optimization
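The workflow described, encoding codes and constraints as prioritized variables and then generating and ranking design variants, can be sketched roughly as below. The zoning numbers, constraint predicates, and scoring weights are all hypothetical stand-ins, not the authors' actual rules.

```python
# Hypothetical sketch of constraint-driven variant generation: design rules are
# encoded as predicates, candidates are generated parametrically, then filtered
# and ranked. All limits and weights are illustrative assumptions.
from itertools import product

MAX_HEIGHT_M = 25.0    # illustrative zoning height cap
MIN_GROSS_AREA = 45.0  # illustrative minimum: at least one small dwelling must fit
FLOOR_HEIGHT = 3.0     # assumed storey height, m

def generate_variants(plot_area):
    """Enumerate (floors, site coverage) combinations for a given plot, m^2."""
    for floors, coverage in product(range(2, 10), (0.3, 0.4, 0.5)):
        footprint = plot_area * coverage
        yield {
            "floors": floors,
            "height": floors * FLOOR_HEIGHT,
            "gross_area": footprint * floors,
            "open_space": plot_area - footprint,
        }

def feasible(v):
    """Hard constraints (regulations) filter out invalid variants."""
    return v["height"] <= MAX_HEIGHT_M and v["gross_area"] >= MIN_GROSS_AREA

def score(v):
    """Soft preferences: favor built area, reward open space as a daylight proxy."""
    return v["gross_area"] + 0.5 * v["open_space"]

def best_variant(plot_area):
    return max((v for v in generate_variants(plot_area) if feasible(v)), key=score)
```

A real implementation would evaluate variants with daylight simulation and circulation analysis rather than a scalar score, but the generate-filter-rank loop is the same.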
Procedia PDF Downloads 55
800 NiFe-Type Catalysts for Anion Exchange Membrane (AEM) Electrolyzers
Authors: Boldin Roman, Liliana Analía Diaz
Abstract:
As the hydrogen economy continues to expand, reducing energy consumption and emissions while stimulating economic growth, the development of efficient and cost-effective hydrogen production technologies is critical. Among various methods, anion exchange membrane (AEM) water electrolysis stands out due to its potential for using non-noble metal catalysts. The exploration and enhancement of non-noble metal catalysts, such as NiFe-type catalysts, are pivotal for the advancement of AEM technology, ensuring its commercial viability and environmental sustainability. NiFe-type catalysts were synthesized through electrodeposition and characterized both electrochemically and physico-chemically. Various supports, including Ni foam and Ni mesh, were used as porous transport layers (PTLs) to evaluate the effective catalyst thickness and the influence of the PTL in a 5 cm² AEM electrolyzer. This methodological approach allows for a detailed assessment of catalyst performance under operational conditions typical of industrial hydrogen production. The study revealed that electrodeposited non-noble multi-metallic catalysts maintain stable performance as anodes in AEM water electrolysis. NiFe-type catalysts demonstrated superior activity, with the NiFeCoP alloy outperforming others by delivering the lowest overpotential and the highest current density. Furthermore, the use of different PTLs showed significant effects on the electrochemical behavior of the catalysts, indicating that PTL selection is crucial for optimizing performance and efficiency in AEM electrolyzers. Conclusion: The research underscores the potential of non-noble metal catalysts in enhancing efficiency and reducing the costs of AEM electrolyzers. The findings highlight the importance of catalyst and PTL optimization in developing scalable and economically viable hydrogen production technologies.
Continued innovation in this area is essential for supporting the growth of the hydrogen economy and achieving sustainable energy solutions.
Keywords: AEMWE, electrocatalyst, hydrogen production, water electrolysis
Procedia PDF Downloads 38
799 Optimizing Detection Methods for THz Bio-imaging Applications
Authors: C. Bolakis, I. S. Karanasiou, D. Grbovic, G. Karunasiri, N. Uzunoglu
Abstract:
A new approach for efficient detection of THz radiation in biomedical imaging applications is proposed. A double-layered absorber, consisting of a 32 nm thick aluminum (Al) metallic layer located on a glass medium (SiO2) of 1 mm thickness, was fabricated and used to design a fine-tuned absorber through a theoretical and finite element modeling process. The results indicate that the proposed low-cost, double-layered absorber can be tuned based on the metal layer sheet resistance and the thickness of various glass media, taking advantage of the diversity of the absorption of the metal films in the desired THz domain (6 to 10 THz). It was found that the composite absorber could absorb up to 86% (a percentage exceeding the 50% previously shown to be the highest achievable when using a single thin metal layer) and reflect less than 1% of the incident THz power. This approach will enable monitoring of the transmission coefficient (the THz transmission "fingerprint") of the biosample with high accuracy, while also making the proposed double-layered absorber a good candidate for a microbolometer pixel's active element. Based on the aforementioned promising results, a more sophisticated and effective double-layered absorber is under development. The glass medium has been substituted by diluted poly-Si, and the results were twofold: an absorption factor of 96% was reached, and high TCR properties were acquired. In addition, a generalization of these results and properties over the active frequency spectrum was achieved. Specifically, through the development of a theoretical equation taking as input any arbitrary frequency in the IR spectrum (0.3 to 405.4 THz) and giving as output the appropriate thickness of the poly-Si medium, the double-layered absorber retains the ability to absorb 96% and reflect less than 1% of the incident power.
As a result, through that post-optimization process and the spread-spectrum frequency adjustment, the microbolometer detector efficiency could be further improved.
Keywords: bio-imaging, fine-tuned absorber, fingerprint, microbolometer
Procedia PDF Downloads 349
798 Reactive Fabrics for Chemical Warfare Agent Decomposition Using Particle Crystallization
Authors: Myungkyu Park, Minkun Kim, Sunghoon Kim, Samgon Ryu
Abstract:
Recently, research on reactive fabrics that can decompose CWAs (Chemical Warfare Agents) has been actively performed. The level of CWA decomposition performance under various environmental conditions is one of the critical factors for applicability as a protective material in NBC (Nuclear, Biological, and Chemical) protective clothing. In this study, results of CWA decomposition performance tests on reactive fabrics made of electrospun webs and reactive particles are presented. Currently, the MOF (metal-organic framework) UiO-66-NH₂ is frequently being studied as a material for decomposing CWAs, especially the blister agent HD [bis(2-chloroethyl) sulfide]. When we tested the decomposition rate of an electrospun web made of PVB (polyvinyl butyral) polymer and UiO-66-NH₂ particles, we obtained much higher protective performance than in cases where other particles were applied. Furthermore, if a repellent surface fabric is added on the reactive material as a component of the protective fabric, the performance of the layer-by-layer reactive fabric approaches the level of current NBC protective fabrics in terms of HD decomposition rate. The reactive fabric used in this study was manufactured by electrospinning a polymer containing the reactive particle UiO-66-NH₂, and we performed a crystallization process once again on that polymer fiber web in solvent systems, as a second step in manufacturing the reactive fabric. Three kinds of polymer materials were used in this process, but PVB was most suitable as the electrospinning fiber polymer considering the shape of the product. The particle density on the fiber web and the HD decomposition rate were enhanced by secondary crystallization compared with the unprocessed results. The amounts of HD penetration in a 24 h AVLAG (Aerosol Vapor Liquid Assessment Group) swatch test through the reactive fabrics with and without secondary crystallization were 24 and 146 μg/cm², respectively.
Even though all of the reactive fiber webs for this test were combined with a repellent surface layer on the outer side of the swatch, the effects of secondary crystallization of the particles on the reactive fiber web are remarkable.
Keywords: CWA, chemical warfare agent, gas decomposition, particle growth, protective clothing, reactive fabric, swatch test
Procedia PDF Downloads 299
797 Evaluation of the Impact of Reducing the Traffic Light Cycle for Cars to Improve Non-Vehicular Transportation: A Case Study in Lima
Authors: Gheyder Concha Bendezu, Rodrigo Lescano Loli, Aldo Bravo Lizano
Abstract:
In big urbanized cities of Latin America, motor vehicles have priority over non-motorized vehicles and pedestrians. This is an important problem that affects people's health and quality of life; the lack of inclusion of pedestrians makes it difficult for them to move smoothly and safely, since the city has been planned for the transit of motor vehicles. Faced with the new trend toward sustainable and economical transport, the city is forced to develop infrastructure in order to incorporate pedestrians and users of non-motorized vehicles into the transport system. The present research aims to study the influence of non-motorized vehicles on an avenue and the optimization of a traffic light cycle, based on simulation in Synchro software, to improve the flow of non-motorized vehicles. The evaluation is of the microscopic type; for this reason, field data was collected, such as vehicular, pedestrian, and non-motorized vehicle user demand. With the values of speed and travel time, the current scenario containing the existing problem is represented. These data allow the creation of a microsimulation model in Vissim software, which is later calibrated and validated so that it behaves similarly to reality. The results of this model are compared with the efficiency parameters of the proposed model; these parameters are the queue length, the travel speed, and mainly the travel times of the users at this intersection. The results reflect a reduction of 27% in travel time, that is, an improvement of the proposed model over the current one for this great avenue. The queue length of motor vehicles is also reduced, by 12.5%, a considerable improvement. All this represents an improvement in the level of service and in the quality of life of users.
Keywords: bikeway, microsimulation, pedestrians, queue length, traffic light cycle, travel time
Procedia PDF Downloads 179
796 The Use of TRIZ to Map the Evolutive Pattern of Products
Authors: Fernando C. Labouriau, Ricardo M. Naveiro
Abstract:
This paper presents a model for mapping the evolutive pattern of products in order to generate new ideas, perceive emerging technologies, and manage product portfolios in new product development (NPD). According to the proposed model, the information extracted from the patent system is filtered and analyzed with TRIZ tools to produce the input information to the NPD process. The authors acknowledge that the NPD process is well integrated within enterprises' strategic business planning and that new products are vital in today's competitive market. On the other hand, the proactive use of patent information has been observed in some methodologies for selecting projects, mapping technological change, and generating product concepts. One of these methodologies is TRIZ, a theory created to favor innovation and improve product design, which provided the analytical framework for the model. Initially, an introduction to TRIZ is presented, focused mainly on the patterns of evolution of technical systems and their strategic uses; this is a brief and deliberately non-comprehensive description, as the theory has several other tools that are widely employed in technical and business applications. Then, the model for mapping the evolutive pattern of products is introduced, with its three basic pillars, namely patent information, TRIZ, and NPD, along with the methodology for its implementation. Following that, a case study of a Brazilian bicycle manufacturer is presented, in which the evolutive pattern of a product is mapped by decomposing and analyzing one of its assemblies along ten evolution lines in order to envision opportunities for further product development. Some of these lines are illustrated in more detail to evaluate the features of the product in relation to the TRIZ concepts, using a comparison with state-of-the-art patents to validate the product's evolutionary potential.
As a result, the case study provided several opportunities for a product improvement development program in different project categories, identifying technical and business impacts as well as indicating the lines of evolution that can most benefit from each opportunity.
Keywords: product development, patents, product strategy, systems evolution
Procedia PDF Downloads 503
795 Design and Development of Engine Valve Train Wear Test Rig for the Assessment of Valve Train Tribochemistry
Authors: V. Manjunath, C. V. Chandrashekara
Abstract:
Ecosystem authorities call for the use of lubricants with less impact on nature in terms of exhaust emissions, while engine users demand more mileage per liter of fuel without any compromise on engine durability. From this viewpoint, engine manufacturers require the optimum combination of materials and lubricant additive packages to minimize friction and wear in engine components like the piston, crankshaft, and valve train. Demands are placed on these components to operate at higher speeds, loads, and temperatures and over extended engine oil replacement intervals. Besides, it is necessary to accurately predict the lubricant life, or replacement interval, to prevent lubrication and valve train component failures. Experimental tribology evaluation of new engine oils requires a large amount of time and energy. Hence, a low-cost bench test is necessary for industries and original equipment manufacturers (OEMs) to study the performance of lubricants. The present work outlines the procedure for the design and development of a multi-cam valve train wear rig (MCR) to simulate ASTM D6891 and to develop a new engine test for the Indian automobile sector to evaluate lubricants for the Indian automobile market. In order to improve the lubrication between the cam and follower of an internal combustion engine, the influence of materials, oil viscosity, and additives on the friction and wear characteristics is examined with the test rig by increasing the contact load at two different revolution speeds. The experimentation made the following results evident: temperature, torque, speed, and wear plots are used to validate the data obtained from the newly developed multi-cam rig (MCR) with followers run against a cast iron camshaft; camshaft lobe wear is measured at seven different locations on the cam profile; and the tribofilm formed using 5W-30 oil is evaluated and correlated with the standard test results.
Keywords: ASTM D6891, multi-cam rig (MCR), 5W-30, cam profile
Procedia PDF Downloads 178
794 Strengths and Weaknesses of Tally, an LCA Tool for Comparative Analysis
Authors: Jacob Seddlemeyer, Tahar Messadi, Hongmei Gu, Mahboobeh Hemmati
Abstract:
The main purpose of the first tier of this study is to quantify and compare the embodied environmental impacts associated with alternative materials applied to Adohi Hall, a residence building on the University of Arkansas campus, Fayetteville, AR. This 200,000-square-foot, 5-story building was built with mass timber and is compared to another scenario where the same edifice is built with a steel frame. Based on the defined goal and scope of the project, the materials for the two building options are compared in terms of Global Warming Potential (GWP) from cradle to construction site, which includes the material manufacturing stage (raw material extraction, processing, supply, transport, and manufacture) plus transportation to the site (modules A1-A4, based on the standard EN 15804 definition). The fossil fuels consumed and CO2 emitted in association with the buildings are the major source of their climate change impacts. In this study, GWP is primarily assessed, to the exclusion of other environmental factors. The second tier of this work evaluates Tally's performance in the decision-making process through the design phases, and determines its strengths and weaknesses. Tally is a Life Cycle Assessment (LCA) tool capable of conducting a cradle-to-grave analysis. As opposed to other software applications, Tally is specifically targeted at building LCA. As a peripheral application, this software tool runs directly within the core modeling platform, Revit. This unique functionality causes Tally to stand out from other similar tools for LCA analysis in the building sector. The results of this study also provide insights for making more environmentally efficient decisions in the built environment and help the move toward reducing greenhouse gas (GHG) emissions and mitigating GWP.
Keywords: comparison, GWP, LCA, materials, tally
Procedia PDF Downloads 232
793 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products
Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry
Abstract:
The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. Because of this obstacle, wide diffusion of eco-design and LCA methods in the manufacturing sectors may be impossible. This article addresses the research question: how can the LCA method be adapted to generalize it massively and improve its performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, we analyzed the literature to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated model construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating the process of data mapping and hence product modeling. This method is also able to complete the LCA process on its own within minutes; the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, the case study also demonstrates the potential of the proposed method for a wide range of applications.
Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively
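The data-reconciliation and automatic report generation steps described above might look, in spirit, like the following minimal sketch: bill-of-materials entries are mapped to impact factors through an alias table, and a short report is produced without manual intervention. The materials, impact factors, and alias table are invented placeholders, not the tool's real data.

```python
# Hypothetical sketch of automated LCI mapping: reconcile BOM names to impact
# factors, then generate the report. All factor values are placeholders.

FACTORS = {  # kg CO2e per kg, illustrative values only
    "copper": 4.0,
    "pvc": 2.5,
    "steel": 1.9,
}

ALIASES = {"cu wire": "copper", "pvc sheath": "pvc"}  # reconciliation table

def reconcile(name):
    """Normalize a BOM material name and resolve known aliases."""
    key = name.strip().lower()
    return ALIASES.get(key, key)

def lca_report(bom):
    """bom: list of (material_name, mass_kg). Returns (total impact, report text)."""
    lines, total = [], 0.0
    for name, mass in bom:
        factor = FACTORS[reconcile(name)]
        impact = mass * factor
        total += impact
        lines.append(f"{name}: {mass} kg x {factor} = {impact:.2f} kg CO2e")
    lines.append(f"TOTAL: {total:.2f} kg CO2e")
    return total, "\n".join(lines)
```

A production tool would draw the factors from an LCI database and handle unmapped names, but the reconcile-then-aggregate loop is the core of the automation.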
Procedia PDF Downloads 93
792 Analysis of Cycling Accessibility on Chengdu Tianfu Greenway Based on Improved Two-Step Floating Catchment Area Method: A Case Study of Jincheng Greenway
Authors: Qin Zhu
Abstract:
Against the background of Chengdu's accelerating construction of a Beautiful and Livable Park City, the Tianfu greenway system is an important support system for park construction across the whole region, and its accessibility is one of the key indicators for measuring the effectiveness of greenway construction. In recent years, cycling has become an important transportation mode for residents traveling to greenways because of its low-carbon, healthy, and convenient characteristics, and the study of greenway accessibility under the cycling mode can provide reference suggestions for greenway optimization and improvement. Taking Jincheng Greenway in Chengdu City as an example, the Baidu Map Application Programming Interface (API) and a questionnaire survey were used to improve the two-step floating catchment area (2SFCA) method along the three dimensions of search threshold, supply side, and demand side, to calculate the cycling accessibility of the greenway and to explore its spatial matching relationship with population density, the number of entrances, and comprehensive attractiveness. The results show that: 1) the distribution of greenway accessibility in Jincheng shows a pattern of "high in the south and low in the north, high in the west and low in the east"; 2) the spatial match between greenway accessibility and the population density of residential areas is imbalanced, and there is a significant positive correlation, with a high degree of match, between accessibility and both the number of selectable greenway access points in residential areas and the overall attractiveness of greenways.
On this basis, it is proposed to give priority to mismatch areas to alleviate the contradiction between supply and demand, optimize greenway access points to improve traffic connections, and enhance the comprehensive quality and service capacity of the greenway, in order to further improve the cycling accessibility of Jincheng Greenway and the spatial allocation of greenway resources.
Keywords: accessibility, Baidu Maps API, cycling, greenway, 2SFCA
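For reference, the baseline (unimproved) 2SFCA computation that the study builds on can be sketched as below; the sites, populations, and distances are hypothetical, and the study's actual method further refines the search threshold and the supply and demand sides.

```python
# Minimal sketch of the classic two-step floating catchment area (2SFCA) method.
# Supply sites stand in for greenway entrances, demand sites for residential
# areas; distances stand in for cycling travel cost. All values are illustrative.

THRESHOLD = 3.0  # catchment radius (e.g., km), illustrative

supplies = {"g1": 100.0, "g2": 50.0}          # capacity of each greenway entrance
demands = {"r1": 500, "r2": 300, "r3": 200}   # population of each residential area
dist = {  # travel distance between each demand and supply site
    ("r1", "g1"): 1.0, ("r1", "g2"): 4.0,
    ("r2", "g1"): 2.0, ("r2", "g2"): 2.5,
    ("r3", "g1"): 5.0, ("r3", "g2"): 1.5,
}

def two_step_fca(supplies, demands, dist, d0):
    # Step 1: supply-to-demand ratio R_j for each supply site, over the
    # population that can reach it within the threshold d0.
    ratio = {}
    for j, s in supplies.items():
        pop = sum(p for i, p in demands.items() if dist[(i, j)] <= d0)
        ratio[j] = s / pop if pop else 0.0
    # Step 2: accessibility A_i = sum of R_j over supply sites within reach of i.
    return {i: sum(ratio[j] for j in supplies if dist[(i, j)] <= d0)
            for i in demands}
```

Improved variants typically replace the hard threshold with a distance-decay weight and adjust both steps, which is the direction the study takes.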
Procedia PDF Downloads 90
791 Homogenization of Cocoa Beans Fermentation to Upgrade Quality Using an Original Improved Fermenter
Authors: Aka S. Koffi, N’Goran Yao, Philippe Bastide, Denis Bruneau, Diby Kadjo
Abstract:
Cocoa beans (Theobroma cacao L.) are the main component of chocolate manufacturing, and the beans must first be correctly fermented. The traditional process for performing the first (lactic) fermentation often consists of confining the cacao beans using banana leaves or a fermentation basket, both of which lead to poor thermal insulation of the product and an inability to mix it. A box fermenter reduces this heat loss by using wood of large thickness (e > 3 cm), but mixing to homogenize the product is still hard to perform. Automatic fermenters are not cost-effective for most producers. The heat (T > 45 °C) and acidity produced during fermentation by the microbiological activity of yeasts and bacteria enable the emergence of the potential flavor and taste of the future chocolate. In this study, a cylindro-rotative fermenter (FCR-V1) was built, with coconut fibers used in its structure to confine heat. An axis of rotation (360°) was integrated to facilitate the turning and homogenization of the beans in the fermenter. This axis makes it possible to put the fermenter in a vertical position during the anaerobic alcoholic phase of fermentation, and in a horizontal position during the acetic phase to take advantage of the mid-height filling. For the circulation of air flow during turning in the acetic phase, two woven rattan grids were made, one for the top and one for the bottom of the fermenter. In order to reduce air flow outside the acetic phase, two airtight covers can be placed over the grids. The efficiency of turning by this kind of rotation, coupled with the homogenization of temperature afforded by the horizontal position during the acetic phase, contributes to a good proportion of well-fermented beans (83.23%). In addition, the beans' pH values ranged between 4.5 and 5.5. These values are ideal for the enzymatic activity that produces the aromatic compounds inside the beans.
The regularity of mass loss throughout fermentation makes it possible to predict the drying surface corresponding to the quantity being fermented.
Keywords: cocoa fermentation, fermenter, microbial activity, temperature, turning
Procedia PDF Downloads 264
790 Mechanical Properties of Waste Clay Brick Based Geopolymer Cured at Various Temperatures
Authors: Shihab Ibrahim
Abstract:
Geopolymer binders, as an alternative binder system to ordinary Portland cement, have been the focus of the past two decades of research. In order to eliminate the CO2 emissions of cement manufacturing and to utilize construction waste as a source material, clean waste clay bricks from the Levent Brick factory were activated with a mixture of sodium hydroxide and sodium silicate solution. A 12-molar sodium hydroxide solution was used, and the ratio of sodium silicate to sodium hydroxide was 2.5. Alkaline-solution-to-clay-brick-powder ratios of 0.35, 0.4, 0.45, and 0.5 were studied. An alkaline-solution-to-powder ratio of 0.4 was found to be the optimum ratio for obtaining the same workability as ordinary Portland cement paste. The compressive strength of the clay brick based geopolymer paste samples was evaluated under different curing temperatures and curing durations. A one-day compressive strength of 57.3 MPa was obtained after curing at 85 °C for 24 hours, which is higher than the 7-day compressive strength of ordinary Portland cement paste. The highest compressive strength, 71.4 MPa, was achieved at the seventh day for geopolymer paste samples cured at 85 °C for 24 hours. It was found that 8 hours of curing at an elevated temperature of 85 °C is sufficient to reach 96% of total strength. A strength of 37.4 MPa at the seventh day was achieved for clay brick based geopolymer samples cured at room temperature. Water absorption of around 10% was found for clay brick based geopolymer samples cured at different temperatures, compared with the 9.14% water absorption of ordinary Portland cement paste. The clay brick based geopolymer binder has the potential to be used as an alternative binder to Portland cement, provided that heat treatment is applied. Further studies are needed in order to produce a binder that can harden and gain strength without any elevated-temperature curing.
Keywords: construction and demolition waste, geopolymer, clay brick, compressive strength
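The reported mix proportions can be turned into batch quantities with a small helper, assuming a solution-to-powder ratio of 0.4 and a silicate-to-hydroxide ratio of 2.5 as stated, and treating both as mass ratios (an assumption, since the abstract does not specify the basis).

```python
# Illustrative mix-proportioning for the reported ratios:
#   alkaline solution / powder = 0.40, sodium silicate / sodium hydroxide = 2.5.
# Both ratios are assumed to be by mass; quantities are per batch of powder.

def mix_quantities(powder_kg, sol_to_powder=0.40, ss_to_sh=2.5):
    """Return (total solution, sodium silicate, sodium hydroxide) masses in kg."""
    solution = powder_kg * sol_to_powder
    sh = solution / (1 + ss_to_sh)  # sodium hydroxide solution share
    ss = solution - sh              # sodium silicate solution share
    return solution, ss, sh
```

For a hypothetical 1 kg batch of clay brick powder this gives 0.4 kg of total activator, split roughly 0.286 kg silicate to 0.114 kg hydroxide solution.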
Procedia PDF Downloads 262
789 Study and Fine Characterization of the SS 316L Microstructures Obtained by Laser Beam Melting Process
Authors: Sebastien Relave, Christophe Desrayaud, Aurelien Vilani, Alexey Sova
Abstract:
Laser beam melting (LBM) is an additive manufacturing process that enables complex 3D parts to be produced. This process is now commonly employed for various applications, such as chemistry or energy, requiring the use of stainless steel grades. LBM can offer mechanical properties comparable, and sometimes superior, to those of wrought materials. However, we observed an anisotropic microstructure resulting from the process, caused by the very high thermal gradients along the building axis. This microstructure can be harmful depending on the application. For this reason, control and prediction of the microstructure are important to ensure the improvement and reproducibility of the mechanical properties. This study focuses on the 316L SS grade and aims at understanding the solidification and transformation mechanisms during the process. Experiments were performed to analyse the nucleation and growth of the microstructure obtained by the LBM process under several conditions. The samples were built on different types of supports, bulk and lattice, and produced on a ProX DMP 200 LBM device. For the two conditions, analysis of the microstructures by SEM and EBSD revealed a single-phase austenite with preferential crystallite growth along the (100) plane. The microstructure presented a hierarchical structure consisting of columnar grains with sizes in the range of 20-100 µm and a sub-grain structure of size 0.5 µm. These sub-grains were found in different shapes (columnar and cellular). This difference can be explained by a variation of the thermal gradient and cooling rate or by element segregation, although no sign of element segregation was found at the sub-grain boundaries. A high dislocation concentration was observed at the sub-grain boundaries. These sub-grains are separated by very low misorientation walls (< 2°), which causes lattice curvature inside the large grains.
A discussion is proposed on the formation of these microstructures with regard to the LBM process conditions.
Keywords: selective laser melting, stainless steel, microstructure
Procedia PDF Downloads 162
788 Life Cycle Assessment of Mass Timber Structure, Construction Process as System Boundary
Authors: Mahboobeh Hemmati, Tahar Messadi, Hongmei Gu
Abstract:
Today, life cycle assessment (LCA) is a leading method for mitigating the environmental impacts of the building sector. In this paper, LCA is used to quantify the greenhouse gas (GHG) emissions during the construction phase of the largest mass timber residential structure in the United States, Adohi Hall, a 200,000-square-foot, 708-bed complex located on the campus of the University of Arkansas. The energy used for building operation is the most dominant source of emissions in the building industry. Lately, however, efforts have been successful at increasing the emissions efficiency of building operation. As a result, attention has now shifted to embodied carbon, which has become more prominent in the building life cycle. Unfortunately, most studies have focused on the manufacturing stage, and only a few have addressed the construction process to date. In particular, little data is available about the environmental impacts associated with the construction of mass timber. This study therefore presents an assessment of the environmental impact of the construction processes of the real, newly built mass timber building mentioned above. The system boundary of this study covers modules A4 and A5 based on the building LCA standard EN 15978: module A4 includes material and equipment transportation, and module A5 covers the construction and installation process. This research evolves through two stages: first, quantifying the materials and equipment deployed in the building, and second, determining the embodied carbon associated with running equipment for the construction materials, both transported to, and installed on, the site where the edifice is built. The Global Warming Potential (GWP) of the building is the primary metric considered in this research. The outcomes of this study bring to the fore a better understanding of emission hotspots during the construction process.
Moreover, the comparative analysis of the mass timber construction process with that of a theoretically similar steel building will enable an effective assessment of the environmental efficiency of mass timber.
Keywords: construction process, GWP, LCA, mass timber
Procedia PDF Downloads 171
787 The Role of Financial and Non-Financial Institutions in Promoting Entrepreneurship in Micro, Small and Medium Enterprises
Authors: Lemuel David
Abstract:
The importance of the Micro, Small, and Medium Enterprises sector is well recognized for its legitimate contribution to the macroeconomic objectives of the Republic of Liberia, such as the generation of employment, output, exports, and entrepreneurship. At present, medium and small enterprises account for about 99 percent of the industrial units in the country, contributing 60 percent of the manufacturing sector output and approximately one-third of the nation’s exports. Various financial institutions, like ECO bank, and non-financial institutions, like Bearch Limited, play a unique role in promoting the growth of micro, small, and medium enterprises: a small enterprise or entrepreneur receives many types of assistance from different institutions, for varied purposes, over the course of the entrepreneurial journey. This paper focuses on the factors of financial and non-financial institutional support to entrepreneurs that contribute to the growth of medium and small enterprises in the Republic of Liberia. Its significance lies in informing policy and institutional support for medium and small enterprises by capturing the views of entrepreneurs on the financial and non-financial support systems in the country. The study was carried out through a survey method using questionnaires. The population consisted of all medium and small enterprises registered during the years 2004-2014 in the Republic of Liberia. A simple random sampling technique was employed, with a determined sample size of 400. Data was collected using a standard questionnaire consisting of two parts: the first covered the profile of the respondents; the second covered (1) financial promotional factors and (2) non-financial promotional factors.
The results of the study are based on the financial and non-financial supporting activities provided by institutions to medium and small enterprises. The investigation found no difference between the support given by financial institutions and that given by non-financial institutions. Entrepreneurs perceived collateral-free schemes and physical infrastructure support as the factors contributing most to the entry and growth of medium and small enterprises.
Keywords: micro, small, and medium enterprises; financial institutions; entrepreneurship
Procedia PDF Downloads 105
786 La₀.₈Ba₀.₂FeO₃ Perovskite as an Additive in the Three-Way Catalyst (TWCs) for Reduction of PGMs Loading
Authors: Mahshid Davoodpoor, Zahra Shamohammadi Ghahsareh, Saeid Razfar, Alaleh Dabbaghi
Abstract:
Nowadays, air pollution has become a topic of great concern all over the world. One of its main sources is automobile exhaust gas, which releases large quantities of toxic gases, including CO, unburned hydrocarbons (HCs), NOx, and non-methane hydrocarbons (NMHCs), into the air. The application of three-way catalysts (TWCs) is still the most effective strategy to mitigate the emission of these pollutants. Because environmental regulations continuously become stricter, studies on TWCs are ongoing despite several years of research and development; this persistence arises from the complexity of the washcoat and the large number of parameters involved in the redox reactions. The main objectives of these studies are the optimization of the washcoat formulation and the investigation of different coating modes. Perovskites (ABO₃), as a promising class of materials, have unique features that make them versatile alternatives to the mixed oxides commonly used in washcoats: high catalytic activity for oxidation reactions and relatively high oxygen storage capacity are their most important properties in catalytic applications. Herein, La₀.₈Ba₀.₂FeO₃ perovskite material was synthesized using the co-precipitation method and characterized by XRD, ICP, and BET analysis. The effects of the synthesis conditions, including the B-site metal (Fe or Co), the metal precursor concentration, and the dopant (Ba), on the phase purity of the products were examined. The selected perovskite sample was then used as one of the components in a TWC formulation, and its catalytic performance was evaluated through light-off, oxygen storage capacity, and emission analyses. The results showed a remarkable increase in oxygen storage capacity and revealed that T50 and the emissions of CO, HC, and NOx were reduced in the presence of the perovskite structure, confirming the enhanced catalytic performance of the new washcoat formulation.
This study shows the bright future of advanced oxide structures in TWCs.
Keywords: perovskite, three-way catalyst, PGMs, PGM reduction
Procedia PDF Downloads 69
785 Optimization of Marine Waste Collection Considering Dynamic Transport and Ship’s Wake Impact
Authors: Guillaume Richard, Sarra Zaied
Abstract:
Marine waste quantities are increasing steadily: about 5 million tons of plastic waste enter the ocean every year. Their spatiotemporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment, as well as on the size and location of the waste. As part of optimizing the collection of marine plastic waste, it is important to measure and monitor its evolution over time. In this context, diverse studies have described waste behavior in order to identify its accumulation in ocean areas. However, none of the existing tools that track objects at sea were designed to track a slick of waste, and applications related to marine waste are in the minority compared with rescue applications or oil-slick tracking. These approaches can accurately simulate an object's behavior over time, but not during the collection mission of a waste slick. This paper presents a numerical model of the impact of a boat’s wake on the behavior of floating marine waste during a collection mission. The aim is to predict the trajectory of a marine waste slick in order to optimize its collection, using meteorological data on ocean currents, wind, and possibly waves. We chose Ocean Parcels, a Python library suited to simulating particle trajectories in the ocean. The modeling results showed the important role of advection and diffusion processes in the spatiotemporal distribution of floating plastic litter. The performance of the proposed method was evaluated on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). The results of the evaluation at the Cape of Good Hope (South Africa) show that the proposed approach can effectively predict the position and velocity of marine litter during collection, optimizing collection time and recovering more than 90% of the waste.
Keywords: marine litter, advection-diffusion equation, sea current, numerical model
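The advection and diffusion processes named in the abstract can be sketched as a Lagrangian particle step of the kind libraries such as Ocean Parcels perform internally. This is a minimal illustration, not the authors' model: the constant current components, the diffusivity, and the planar geometry are all simplifying assumptions (a real run would interpolate a CMEMS velocity field).

```python
import numpy as np

def advect_diffuse(positions, u, v, K, dt, rng):
    """Advance particle positions one time step.

    positions : (N, 2) array of planar coordinates (m)
    u, v      : constant current components (m/s) -- a stand-in for an interpolated field
    K         : horizontal eddy diffusivity (m^2/s)
    dt        : time step (s)
    """
    drift = np.array([u, v]) * dt                                    # deterministic advection
    noise = rng.normal(0.0, np.sqrt(2.0 * K * dt), positions.shape)  # random-walk diffusion
    return positions + drift + noise

rng = np.random.default_rng(0)
pos = np.zeros((1000, 2))            # a slick of 1000 particles released at the origin
for _ in range(100):
    pos = advect_diffuse(pos, u=0.2, v=0.05, K=10.0, dt=60.0, rng=rng)
# After 100 steps of 60 s, the mean displacement approaches (u*T, v*T) = (1200, 300) m,
# while diffusion spreads the slick around that centroid.
```

The spread of the cloud around its advected centroid is what makes a slick grow and why a collection plan benefits from predicting both.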
Procedia PDF Downloads 92
784 Machine Learning for Exoplanetary Habitability Assessment
Authors: King Kumire, Amos Kubeka
Abstract:
The synergy of machine learning and advances in astronomical technology is giving rise to a new space age, marked by better habitability assessments. To frame this discussion, the symbiotic relationship between astronomy and improved computing has been code-named here the Cis-Astro gateway concept. The phrase is openly borrowed from the cis-lunar gateway template and its associated Lagrange points, which act as an orbital bridge from Earth to the Moon; for this study, however, the scientific audience is invited to bridge toward the discovery of new habitable planets. Cosmic probes of this magnitude can serve as the starting nodes of the astrobiological search for galactic life. This research can also act as a navigation system for future space telescope launches by delimiting target exoplanets, and its findings and platforms can be harnessed as building blocks for modeling climate change on Earth. Should humanity exhaust the resources of Earth, or should some catastrophe render the planet uninhabitable for humans, an alternative planet to inhabit would be needed. Through interdisciplinary discussions of the International Astronautical Federation, the scientific community so far holds the common position that engineers can reduce space mission costs by constructing a stable cis-lunar orbit infrastructure for refilling and other in-orbit servicing activities. Similarly, the Cis-Astro gateway can be envisaged as a budget-optimization technique that models extra-solar bodies and can facilitate the scoping of future mission rendezvous. The broad and voluminous catalog of exoplanets will be narrowed along the way using machine learning filters.
The gist of this topic revolves around the indirect economic rationale of establishing a habitability scoping platform.
Keywords: machine-learning, habitability, exoplanets, supercomputing
Procedia PDF Downloads 93
783 Machine Learning for Exoplanetary Habitability Assessment
Authors: King Kumire, Amos Kubeka
Abstract:
The synergy of machine learning and advances in astronomical technology is giving rise to a new space age, marked by better habitability assessments. To frame this discussion, the symbiotic relationship between astronomy and improved computing has been code-named here the Cis-Astro gateway concept. The phrase is openly borrowed from the cis-lunar gateway template and its associated Lagrange points, which act as an orbital bridge from Earth to the Moon; for this study, however, the scientific audience is invited to bridge toward the discovery of new habitable planets. Cosmic probes of this magnitude can serve as the starting nodes of the astrobiological search for galactic life. This research can also act as a navigation system for future space telescope launches by delimiting target exoplanets, and its findings and platforms can be harnessed as building blocks for modeling climate change on Earth. Should humanity exhaust the resources of Earth, or should some catastrophe render the planet uninhabitable for humans, an alternative planet to inhabit would be needed. Through interdisciplinary discussions of the International Astronautical Federation, the scientific community so far holds the common position that engineers can reduce space mission costs by constructing a stable cis-lunar orbit infrastructure for refilling and other in-orbit servicing activities. Similarly, the Cis-Astro gateway can be envisaged as a budget-optimization technique that models extra-solar bodies and can facilitate the scoping of future mission rendezvous. The broad and voluminous catalog of exoplanets will be narrowed along the way using machine learning filters.
The gist of this topic revolves around the indirect economic rationale of establishing a habitability scoping platform.
Keywords: exoplanets, habitability, machine-learning, supercomputing
Procedia PDF Downloads 120
782 Modelling of Exothermic Reactions during Carbon Fibre Manufacturing and Coupling to Surrounding Airflow
Authors: Musa Akdere, Gunnar Seide, Thomas Gries
Abstract:
Carbon fibres are fibrous materials with a carbon content of more than 90%. They combine excellent mechanical properties with very low density; thus, carbon-fibre-reinforced plastics (CFRP) are widely used in lightweight design and construction. The precursor material is usually polyacrylonitrile (PAN) based and wet-spun. During production, the precursor has to be stabilized thermally to withstand the temperatures of up to 1500 °C that occur during carbonization. Even though carbon fibre has been used in aerospace applications since the late 1970s, there is still no general method for finding the optimal production parameters, and trial and error is most often the only recourse. To gain better insight into the process, the chemical reactions during stabilization have to be analyzed in detail. Therefore, a model of the chemical reactions (cyclization, dehydration, and oxidation) based on the research of Dunham and Edie has been developed. With the presented model, it is possible to perform a complete simulation of the fibre passing through all zones of stabilization. The fibre bundle is modeled as several circular fibres with a layer of air in between. Two thermal mechanisms are considered the most important: the exothermic reactions inside the fibre and the convective heat transfer between the fibre and the air. The exothermic reactions inside the fibres are modeled as a heat source, and differential scanning calorimetry measurements have been performed to estimate the heat of the reactions. To shorten the simulation time, the number of fibres is reduced by similitude theory. Experiments were conducted on a pilot-scale stabilization oven to validate the simulated fibre temperature during stabilization, and a new method was developed to measure the fibre bundle temperature.
The comparison of the results shows that the developed simulation model gives good approximations for the temperature profile of the fibre bundle during the stabilization process.
Keywords: carbon fibre, coupled simulation, exothermic reactions, fibre-air-interface
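The two thermal mechanisms named in the abstract — an exothermic heat source inside the fibre and convective exchange with the surrounding air — can be sketched as a lumped energy balance. This is a schematic stand-in for the authors' coupled model: all material constants below are illustrative placeholders, not PAN data from the paper.

```python
def fibre_temperature(T0, T_air, q_exo, hA, m_cp, dt, steps):
    """Explicit-Euler integration of the lumped balance
    m*cp * dT/dt = q_exo - h*A*(T - T_air),
    i.e. reaction heat release minus convective loss to the air layer."""
    T = T0
    history = [T]
    for _ in range(steps):
        T += dt * (q_exo - hA * (T - T_air)) / m_cp
        history.append(T)
    return history

# With a constant heat release the fibre settles at T_air + q_exo/hA:
# here 230 C air + 2 W / 0.1 W/K = 250 C (illustrative numbers).
temps = fibre_temperature(T0=230.0, T_air=230.0, q_exo=2.0,
                          hA=0.1, m_cp=5.0, dt=1.0, steps=2000)
```

The steady overshoot above the oven air temperature is exactly the quantity the stabilization zones must keep bounded, which is why the reaction heat (here a constant `q_exo`) is estimated by calorimetry in the paper.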
Procedia PDF Downloads 279
781 Evaluation of Bucket Utility Truck In-Use Driving Performance and Electrified Power Take-Off Operation
Authors: Robert Prohaska, Arnaud Konan, Kenneth Kelly, Adam Ragatz, Adam Duran
Abstract:
In an effort to evaluate the in-use performance of electrified power take-off (PTO) usage on bucket utility trucks operating under real-world conditions, data from 20 medium- and heavy-duty vehicles operating in California, USA, were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team. In this paper, duty-cycle statistical analyses of class 5 medium-duty quick-response trucks and class 8 heavy-duty material-handler trucks are performed to examine and characterize vehicle dynamics trends and relationships based on collected in-use field data. With more than 100,000 kilometers of driving data collected over 880+ operating days, the researchers have developed a robust methodology for identifying PTO operation from in-field vehicle data and apply it to evaluate the performance and utilization of the conventional and electric PTO systems. They also created custom representative drive cycles for each vehicle configuration and performed modeling and simulation to evaluate the potential fuel and emissions savings from hybridizing the tractive driveline of these vehicles. The results of these analyses statistically and objectively define the vehicle dynamic and kinematic requirements of each vehicle configuration and show the potential for further system optimization through driveline hybridization. Results are presented in both graphical and tabular formats, illustrating a number of key relationships between parameters observed within the data set that relate specifically to medium- and heavy-duty utility vehicles operating under real-world conditions.
Keywords: drive cycle, heavy-duty (HD), hybrid, medium-duty (MD), PTO, utility
Procedia PDF Downloads 400
780 A Review of Current Research and Future Directions on Foodborne Illness and Food Safety: Understanding the Risks and Mitigation Strategies
Authors: Tuji Jemal Ahmed
Abstract:
This paper provides a comprehensive review of current research on foodborne illness and food safety, including the risks associated with foodborne illnesses, the latest research on food safety, and the mitigation strategies used to prevent and control foodborne illnesses. Foodborne illness is a major public health concern that affects millions of people every year. As foodborne illnesses have grown more common and dangerous in recent years, it is vital to research and build upon methods to keep food safe through to consumption. The paper also discusses future directions for food safety research, including emerging technologies, changes in regulations and standards, and collaborative efforts to improve food safety. The first section provides an overview of the risks of foodborne illness, including a definition of foodborne illness, its causes, the types of foodborne illnesses, high-risk foods, and the health consequences of foodborne illness. The second section focuses on current research on food safety, including the role of regulatory agencies, food safety standards and guidelines, emerging food safety concerns, and advances in food safety technology. The third section explores mitigation strategies for foodborne illness, including preventative measures, hazard analysis and critical control points (HACCP), good manufacturing practices (GMPs), and training and education. Finally, the paper examines future directions for food safety research, including hurdle technologies and their impact on food safety, changes in food safety regulations and standards, collaborative efforts to improve food safety, and research gaps and areas for further exploration. Overall, this work provides a comprehensive review of current research and future directions in food safety and an understanding of the risks associated with foodborne illness.
The implications of the assessment for food safety and public health are discussed, along with recommendations for researchers.
Keywords: food safety, foodborne illness, technologies, mitigation
Procedia PDF Downloads 111
779 Efficient Energy Extraction Circuit for Impact Harvesting from High Impedance Sources
Authors: Sherif Keddis, Mohamed Azzam, Norbert Schwesinger
Abstract:
Harvesting mechanical energy from footsteps or other impacts is one way to enable wireless autonomous sensor nodes. These can be used for highly efficient control of connected devices such as lights, security systems, air conditioning systems, or other smart home applications, as well as for accurate location or occupancy monitoring. Converting the mechanical energy into useful electrical energy can be achieved using the piezoelectric effect, which offers simple harvesting setups and low deflections. The challenges facing piezoelectric transducers are the small achievable energy per impact, in the lower mJ range, and the management of such low energies. Simple extraction setups, such as a full-wave bridge connected directly to a capacitor, are problematic due to the mismatch between high-impedance sources and low-impedance storage elements. Efficient energy circuits for piezoelectric harvesters are commonly designed for vibration harvesters and require periodic input energies with predictable frequencies; due to the sporadic nature of impact harvesters, such circuits are not well suited. This paper presents a self-powered circuit that avoids the impedance mismatch during energy extraction by disconnecting the load until the source reaches its charge peak. The switch is implemented with passive components and works independently of the input frequency, so the circuit is suited to impact harvesting and sporadic inputs. For the same input energy, this circuit stores 150% of the energy stored by a capacitor connected directly to a bridge rectifier. The total efficiency, defined as the ratio of the energy stored on a capacitor to the available energy measured across a matched resistive load, is 63%.
Although the resulting energy is already sufficient to power certain autonomous applications, further optimization of the circuit is still under investigation to improve the overall efficiency.
Keywords: autonomous sensors, circuit design, energy harvesting, energy management, impact harvester, piezoelectricity
Procedia PDF Downloads 157
778 Study of University Course Scheduling for Crowd Gathering Risk Prevention and Control in the Context of Routine Epidemic Prevention
Authors: Yuzhen Hu, Sirui Wang
Abstract:
As training bases for intellectual talent, universities host large numbers of students. Teaching is the primary activity of a university, and during the teaching process, large numbers of people gather both inside and outside the teaching buildings, posing a strong risk of close contact. The class schedule is the fundamental basis for teaching activities and plays a crucial role in the management of teaching order; different class schedules lead to varying degrees of indoor gathering and different trajectories of class attendees. In recent years, highly contagious diseases have occurred frequently worldwide, and how to reduce the risk of infection has always been a hot issue of public safety. "Reducing gatherings" is one of the core measures in epidemic prevention and control, and it can be supported by scientific scheduling in specific environments. The goal of scientific prevention and control can therefore be pursued by reducing the risk of excessive gathering when arranging the course schedule. First, we address personnel gathering along the various pathways on campus and, with the goals of minimizing congestion and maximizing teaching effectiveness, establish a nonlinear mathematical model. Next, we design an improved genetic algorithm, incorporating real-time evacuation operations based on tracking search and multidimensional positive-gradient cross-mutation operations, considering the characteristics of outdoor crowd evacuation. Finally, we apply undergraduate course data from a university in Harbin to conduct a case study, comparing and analyzing the effects of the algorithmic improvements on gathering, and exploring the impact of path blocking on the degree of gathering along other pathways.
Keywords: university timetabling problem, risk prevention, genetic algorithm, risk control
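The genetic-algorithm core described above can be sketched in a few lines. This is a deliberately simplified stand-in, not the authors' algorithm: the enrolments, slot count, and gathering threshold are invented, and the paper's evacuation-aware and positive-gradient cross-mutation operators are replaced here by plain one-point crossover and random-reset mutation.

```python
import random

ENROLMENTS = [60, 45, 80, 30, 55, 70, 40, 65]   # students per course section (illustrative)
SLOTS, THRESHOLD = 4, 150                        # time slots and per-slot gathering limit

def fitness(chrom):
    """Penalize time slots whose total enrolment exceeds the gathering threshold."""
    load = [0] * SLOTS
    for section, slot in enumerate(chrom):
        load[slot] += ENROLMENTS[section]
    return -sum(max(0, l - THRESHOLD) for l in load)  # 0 means no overcrowded slot

def evolve(pop_size=40, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(SLOTS) for _ in ENROLMENTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # elitism: keep the better half
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(a))
            child = a[:cut] + b[cut:]                # one-point crossover
            if rng.random() < 0.2:                   # random-reset mutation
                child[rng.randrange(len(child))] = rng.randrange(SLOTS)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()   # a slot assignment with fitness 0, i.e. no slot over the threshold
```

The paper's multi-objective version would add teaching-effectiveness and pathway-congestion terms to this fitness; the chromosome encoding (one slot gene per section) carries over unchanged.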
Procedia PDF Downloads 97
777 Screening and Optimization of Conditions for Pectinase Production by Aspergillus Flavus
Authors: Rumaisa Shahid, Saad Aziz Durrani, Shameel Pervez, Ibatsam Khokhar
Abstract:
Food waste is a prevalent issue in Pakistan, with over 40 percent of food discarded annually. Despite their decay, rotting fruits retain residual nutritional value that is consumed by microorganisms, notably fungi and bacteria. Fungi, preferred for their release of extracellular enzymes, are gaining prominence, particularly for pectinase production. This enzyme offers several advantages, including the clarification of juices by breaking down pectic compounds. In this study, three Aspergillus flavus isolates derived from decomposed fruits and manure were selected for pectinase production. The primary aim was to isolate fungi from diverse waste sources, identify the isolates, and assess their capacity for pectinase production. Identification relied on morphological characteristics, observed by light microscopy and scanning electron microscopy (SEM). Pectinolytic potential was screened on pectin minimal salt agar (PMSA) medium by comparing clear-zone diameters among the isolates. Substrate (lemon and orange peel powder) concentrations, pH, temperature, and incubation period were optimized to enhance pectinase yield, with quantitative analysis by spectrophotometry. The temperature was set at room temperature (28 °C). The optimal conditions for Aspergillus flavus strain AF1 (isolated from mango) were pH 5, an incubation period of 120 hours, and substrate concentrations of 3.3% for orange peels and 6.6% for lemon peels. For AF2 and AF3 (both isolated from soil), the ideal pH and incubation period were the same as for AF1, i.e., pH 5 and 120 hours; however, their optimized substrate concentrations varied, with AF2 showing maximum activity at 3.3% for orange peels and 6.6% for lemon peels, while AF3 peaked at 6.6% for orange peels and 8.3% for lemon peels.
Among the isolates, AF1 demonstrated the best performance under these conditions.
Keywords: pectinase, lemon peel, orange peel, Aspergillus flavus
Procedia PDF Downloads 77
776 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education
Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue
Abstract:
In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in educational data mining techniques to find new, hidden information in students' learning behavior, particularly to uncover early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying the features of student data with the potential to improve prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data was gathered from two systems, a student information system and an e-learning system, for undergraduate students in the College of Computer Science of a Saudi Arabian state university; the records of 4413 students were used. The process includes data collection, data integration, data preprocessing (cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting (AdaBoost and XGBoost) are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. The hyperparameters of the ensemble learners were fine-tuned for enhanced performance and optimal output. The findings imply that combining features of student behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education
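Hard majority voting, the combiner the abstract finds best, is simple to state: for each student, each base model casts one vote and the most common label wins. A minimal sketch, with the three base learners' predictions stubbed out as fixed lists (the real models and student records are of course not public):

```python
from collections import Counter

def majority_vote(*prediction_lists):
    """Combine per-model predictions: for each sample, the most common label wins."""
    combined = []
    for sample_preds in zip(*prediction_lists):
        combined.append(Counter(sample_preds).most_common(1)[0][0])
    return combined

# Illustrative per-model predictions for six students ("pass"/"fail"):
dt  = ["pass", "fail", "pass", "pass", "fail", "pass"]   # decision tree
svm = ["pass", "fail", "fail", "pass", "fail", "fail"]   # support vector machine
ann = ["fail", "fail", "pass", "pass", "pass", "fail"]   # artificial neural network
ensemble = majority_vote(dt, svm, ann)
# → ['pass', 'fail', 'pass', 'pass', 'fail', 'fail']
```

With an odd number of binary voters there are no ties, and a sample is misclassified only when a majority of the base models err on it, which is why the vote tends to beat its weakest member.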
Procedia PDF Downloads 112
775 Achieving Process Stability through Automation and Process Optimization at H Blast Furnace, Tata Steel, Jamshedpur
Authors: Krishnendu Mukhopadhyay, Subhashis Kundu, Mayank Tiwari, Sameeran Pani, Padmapal, Uttam Singh
Abstract:
The blast furnace is a counter-current process in which burden descends from the top while hot gases ascend from the bottom and chemically reduce iron oxides into liquid hot metal. One of the major problems of blast furnace operation is erratic burden descent inside the furnace. Sometimes this problem is so acute that burden descent stops, resulting in hanging and instability of the furnace. This problem is very frequent in blast furnaces worldwide and results in huge production losses. The situation becomes more adverse when blast furnaces are operated at a low coke rate and a high coal injection rate with adverse raw materials such as high-alumina ore and high-ash coke. Over the last three years, H Blast Furnace at Tata Steel was able to reduce the coke rate from 450 kg/thm to 350 kg/thm with an increase in coal injection to 200 kg/thm, figures close to world benchmarks, and to expand profitability. To sustain this regime, irregularities such as hanging, channeling, and scaffolding must be eliminated. This paper illustrates how a zero-hanging spell was sustained for three consecutive years under low coke rate operation through improvements in burden characteristics, burden distribution, the slag regime, and casting practices, together with adequate automation of furnace operation. Models have been created to better understand and improve the blast furnace process: one model predicts how to maintain slag viscosity in the desired range to attain proper burden permeability, and a channeling prediction model detects channeling symptoms so that early action can be taken. These models have helped greatly in standardizing the control decisions of operators at H Blast Furnace, Tata Steel, Jamshedpur, and thus in achieving process stability over the last three years.
Keywords: hanging, channeling, blast furnace, coke
Procedia PDF Downloads 197
774 Identification of Vehicle Dynamic Parameters by Using Optimized Exciting Trajectory on 3-DOF Parallel Manipulator
Authors: Di Yao, Gunther Prokop, Kay Buttner
Abstract:
Dynamic parameters, including the center of gravity, mass, and moments of inertia of a vehicle, play an essential role in vehicle simulation, collision testing, and the real-time control of vehicle active systems. To identify these important vehicle dynamic parameters, a systematic identification procedure is studied in this work. In the first step, a conceptual parallel manipulator (virtual test rig) possessing three rotational degrees of freedom is proposed. To realize its kinematic characteristics, a kinematic analysis consisting of inverse kinematics and singularity architecture is carried out. Based on Euler's rotation equations for rigid-body dynamics, the dynamic model of the parallel manipulator and the derivation of the measurement matrix for parameter identification are presented subsequently. To reduce the sensitivity of the identification to measurement noise and other unexpected disturbances, an optimization process searching for the optimal exciting trajectory of the parallel manipulator is then conducted. For this purpose, 321 Euler angles defined by a parameterized finite Fourier series are used to describe the general exciting trajectory. To minimize the condition number of the measurement matrix and thereby achieve better identification accuracy, the unknown coefficients of the finite Fourier series are estimated with an iterative algorithm implemented in MATLAB®; meanwhile, the algorithm ensures that the parallel manipulator remains in an achievable working state during execution of the optimal exciting trajectory.
It is shown that the proposed procedure and methods can effectively identify the vehicle dynamic parameters and could be an important application of parallel manipulators in the fields of parameter identification and test rig development.
Keywords: parameter identification, parallel manipulator, singularity architecture, dynamic modelling, exciting trajectory
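The trajectory-optimization idea above — write the excitation as a finite Fourier series, stack a measurement matrix from samples along the trajectory, and tune the coefficients to lower its condition number — can be illustrated with a scalar toy problem. This sketch replaces the 321 Euler angles and the full manipulator dynamics with a single angle and a made-up three-column regressor, and the paper's iterative algorithm with a crude random search; only the principle carries over.

```python
import numpy as np

T, N, H = 10.0, 200, 3                 # trajectory period (s), samples, Fourier harmonics
t = np.linspace(0.0, T, N)
w = 2.0 * np.pi / T

def measurement_matrix(coeffs):
    """Toy regressor stacked along the trajectory q(t) defined by Fourier coefficients."""
    a, b = coeffs[:H], coeffs[H:]
    q = sum(a[k] * np.sin((k + 1) * w * t) + b[k] * np.cos((k + 1) * w * t)
            for k in range(H))
    dq = sum((k + 1) * w * (a[k] * np.cos((k + 1) * w * t) - b[k] * np.sin((k + 1) * w * t))
             for k in range(H))
    ddq = np.gradient(dq, t)
    # Columns multiply the unknown parameters (e.g. an inertia, damping, gravity term).
    return np.column_stack([ddq, dq, np.sin(q)])

rng = np.random.default_rng(0)
best = rng.normal(size=2 * H)
init_cond = np.linalg.cond(measurement_matrix(best))
best_cond = init_cond
for _ in range(500):                   # random search over the Fourier coefficients
    cand = best + 0.1 * rng.normal(size=2 * H)
    c = np.linalg.cond(measurement_matrix(cand))
    if c < best_cond:
        best, best_cond = cand, c
```

A lower condition number means the least-squares parameter estimate amplifies measurement noise less, which is exactly why the paper optimizes the exciting trajectory before running the identification.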
Procedia PDF Downloads 270
773 Research on Evaluation of Renewable Energy Technology Innovation Strategy Based on PMC Index Model
Abstract:
Renewable energy technology innovation is an important way to realize the energy transformation. The government has issued a series of policies to guide and support the development of renewable energy, and their implementation affects the further development, utilization, and technological innovation of renewable energy. In this context, systematically reviewing and evaluating renewable energy technology innovation policies is of great significance for improving the existing policy system. Taking the 190 renewable energy technology innovation policies issued during 2005-2021 as a sample, this study uses text mining and content analysis to examine the current state of the policies from the perspectives of issuing departments and policy keywords, and conducts a semantic network analysis to identify the core issuing departments and core policy topic words. A PMC (Policy Modeling Consistency) index model is then built to quantitatively evaluate the selected policies: the PMC index reflects the overall strengths and weaknesses of each policy, while the values of the model's secondary indices reflect the performance of the core departments' policies along each dimension related to the core topic words. The results show that renewable energy technology innovation policies emphasize synergy among multiple departments, while the distribution of issuers is uneven over the promulgation period; policies on different topics have their own emphases in terms of policy types, fields, functions, and support measures, but room for improvement remains, such as the lack of policy forecasting and supervision functions, insufficient attention to product promotion, and relatively uniform support measures.
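The quantitative evaluation step can be sketched as follows: in a PMC index model, each primary dimension is scored as the mean of its binary secondary indicators, and the PMC index of a policy is the sum of the dimension scores. This is a minimal illustrative Python sketch; the dimension names, indicator counts, and 0/1 scores below are hypothetical placeholders, not values from the study.

```python
def pmc_index(policy: dict) -> float:
    """Score each primary dimension as the mean of its binary
    secondary indicators, then sum the dimension scores."""
    return sum(sum(sec) / len(sec) for sec in policy.values())

# Hypothetical scoring of one policy text: 1 = the policy text
# satisfies the indicator, 0 = it does not.
policy_x = {
    "policy_nature":     [1, 0, 1, 1],  # e.g. forecast, supervision, guidance, description
    "policy_timeliness": [1, 0, 1],     # long-, medium-, short-term
    "issuing_level":     [1, 1],
    "policy_field":      [1, 0, 1, 1],
    "policy_function":   [0, 1, 1],
    "support_measures":  [1, 0, 0, 1],
}

score = pmc_index(policy_x)
print(round(score, 2))  # total over 6 dimensions, each dimension in [0, 1]
```

Comparing each policy's index against the maximum attainable value (here 6, one point per dimension) is what allows the kind of pros-and-cons ranking the abstract describes; low-scoring dimensions point directly at the missing functions or support measures.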
Finally, this research puts forward policy optimization suggestions: promoting joint policy releases, strengthening policy coherence and timeliness, enhancing the comprehensiveness of policy functions, and enriching incentive measures for renewable energy technology innovation. Keywords: renewable energy technology innovation, content analysis, policy evaluation, PMC index model
Procedia PDF Downloads 68