Search results for: dissipative technologies
2767 DNA Fingerprinting of Some Major Genera of Subterranean Termites (Isoptera) (Anacanthotermes, Psammotermes and Microtermes) from Western Saudi Arabia
Authors: AbdelRahman A. Faragalla, Mohamed H. Alqhtani, Mohamed M. M. Ahmed
Abstract:
Saudi Arabia is currently beset by a diverse assemblage of subterranean termite fauna inflicting heavy damage on valued property in homes, storage facilities, warehouses, agricultural and horticultural crops including okra, sweet pepper, tomatoes, sorghum, date palm trees and citrus, as well as many forested areas and lush desert oases. The most pressing priority is to use modern technologies to ease the painstaking task of taxonomic identification of these noxious pests, which could lead to effective pest control in both infested agricultural commodities and field crops. Our study employed DNA fingerprinting technologies to generate baseline information on the genetic similarity between three predominant families containing the most destructive termite species. The methodology included DNA extraction and isolation from members of the major families and PCR amplification with randomly selected primers of known nucleotide sequence. GC content and annealing temperatures were determined for all primers, and PCR amplification and agarose gel electrophoresis were conducted, followed by scoring and analysis of Random Amplified Polymorphic DNA (RAPD-PCR) profiles. A phylogenetic analysis of the different species was performed with statistical software on the basis of the RAPD-DNA results and represented as a dendrogram built from the average band-sharing ratio between species. Our study aims to shed more light on this subject, which may expedite the resolution of kinship and relatedness among species, support a correct taxonomic classification of termite species, and help discover sibling species, so that a rational pest management strategy can be delineated.
Keywords: DNA fingerprinting, Western Saudi Arabia, DNA primers, RAPD
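The band-sharing ratio underlying the dendrogram described above is a standard similarity measure, F = 2·n_AB / (n_A + n_B). The sketch below illustrates it on invented presence/absence band profiles for the three genera; the profiles and resulting values are hypothetical placeholders, not the study's data:

```python
# Illustrative sketch: pairwise band-sharing ratios from RAPD banding
# patterns. The profiles below are hypothetical presence/absence
# vectors, not results from this study.

def band_sharing(a, b):
    """Nei & Li / Dice band-sharing ratio: F = 2*n_ab / (n_a + n_b)."""
    n_ab = sum(1 for x, y in zip(a, b) if x and y)  # shared bands
    return 2 * n_ab / (sum(a) + sum(b))

# Hypothetical RAPD profiles (1 = band present at a given position).
profiles = {
    "Anacanthotermes": [1, 1, 0, 1, 0, 1],
    "Psammotermes":    [1, 0, 0, 1, 1, 1],
    "Microtermes":     [0, 1, 1, 0, 1, 0],
}

# All pairwise similarities; a dendrogram would be built by
# clustering these values (e.g. UPGMA).
pairs = {}
names = list(profiles)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        pairs[(names[i], names[j])] = band_sharing(
            profiles[names[i]], profiles[names[j]])

for pair, f in sorted(pairs.items(), key=lambda kv: -kv[1]):
    print(pair, round(f, 3))
```

With these toy profiles the two halophilic genera share the most bands, so they would join first in the dendrogram.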
Procedia PDF Downloads 430
2766 Sustainable Membranes Based on 2D Materials for H₂ Separation and Purification
Authors: Juan A. G. Carrio, Prasad Talluri, Sergio G. Echeverrigaray, Antonio H. Castro Neto
Abstract:
Hydrogen, as a fuel and environmentally friendly energy carrier, is part of the transition towards low-carbon energy systems. The extensive deployment of hydrogen production, purification and transport infrastructure still represents a significant challenge. Independent of the production process, hydrogen is generally mixed with light hydrocarbons and other undesirable gases that need to be removed to obtain H₂ with the purity required for end applications. In this context, membranes are one of the simplest, most attractive, sustainable, and performant technologies enabling hydrogen separation and purification. They demonstrate high separation efficiencies and low energy consumption in operation, a significant leap compared to current energy-intensive alternatives. The unique characteristics of 2D laminates have given rise to a diversity of research on their potential applications in separation systems. Specifically, it is already known in the scientific literature that graphene oxide-based membranes present the highest reported selectivity of H₂ over other gases. This work explores the potential of a new type of 2D-materials-based membrane in separating H₂ from CO₂ and CH₄. We have developed nanostructured composites based on 2D materials and applied them in the fabrication of membranes to maximise H₂ selectivity and permeability for different gas mixtures by adjusting the membranes' characteristics. Our proprietary technology does not depend on specific porous substrates, which allows its integration into diverse separation modules with different geometries and configurations, addressing both the technical performance required for industrial applications and economic viability. Precise control of the processing parameters allowed us to keep membrane thicknesses below 100 nanometres to provide high permeabilities.
Our results for the selectivity of the new nanostructured 2D-materials-based membranes are in the range of performance reported in the literature for 2D materials (such as graphene oxide) applied to hydrogen purification, which validates their use as one of the most promising next-generation hydrogen separation and purification solutions.
Keywords: membranes, 2D materials, hydrogen purification, nanocomposites
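A common first-pass figure of merit behind selectivity claims like those above is the ideal selectivity, i.e. the ratio of single-gas permeances. The sketch below uses invented permeance values purely for illustration; they are not measurements from this work:

```python
# Hedged sketch of how membrane performance is commonly screened:
# ideal selectivity as the ratio of single-gas permeances. The
# permeance values are hypothetical placeholders.

def ideal_selectivity(permeance_h2, permeance_other):
    """alpha(H2/X) = permeance of H2 / permeance of gas X."""
    return permeance_h2 / permeance_other

# Hypothetical single-gas permeances in GPU (gas permeation units).
permeance = {"H2": 1200.0, "CO2": 40.0, "CH4": 5.0}

for gas in ("CO2", "CH4"):
    alpha = ideal_selectivity(permeance["H2"], permeance[gas])
    print(f"alpha(H2/{gas}) = {alpha:.0f}")
```

Real mixed-gas selectivities can differ from these ideal values because of competitive sorption, which is why mixture measurements are reported alongside single-gas ones.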
Procedia PDF Downloads 134
2765 Domestic LED Lighting Designs Using Internet of Things
Authors: Gouresh Singhal, Rajib Kumar Panigrahi
Abstract:
In this paper, we examine historical and technological changes in the lighting industry. We propose a prototype technical solution at the block-diagram and circuit level. Untapped and upcoming technologies such as the Cloud and 6LoWPAN are further explored. The paper presents a robust, realistic hardware design. A mobile application is also developed to provide a last-mile user interface. The paper highlights the current challenges to be faced and concludes with a pragmatic view of the lighting industry.
Keywords: 6LoWPAN, Internet of Things, mobile application, LED
Procedia PDF Downloads 571
2764 Efficient Backup Protection for Hybrid WDM/TDM GPON System
Authors: Elmahdi Mohammadine, Ahouzi Esmail, Najid Abdellah
Abstract:
This contribution presents a new protected hybrid WDM/TDM PON architecture using Wavelength Selective Switches and Optical Line Protection devices. The objective of using these technologies is to improve flexibility and enhance the protection of GPON networks.
Keywords: Wavelength Division Multiplexed Passive Optical Network (WDM-PON), Time Division Multiplexed PON (TDM-PON), architecture, protection, Wavelength Selective Switches (WSS), Optical Line Protection (OLP)
Procedia PDF Downloads 542
2763 The Attitudinal Effects of Dental Hygiene Students When Changing Conventional Practices of Preventive Therapy in the Dental Hygiene Curriculum
Authors: Shawna Staud, Mary Kaye Scaramucci
Abstract:
Objective: Rubber cup polishing has been a traditional method of preventive therapy in dental hygiene treatment. Newer methods such as air polishing have changed the way dental hygiene care is provided, yet this technique has not been embraced by students in the program nor by practitioners in the workforce. Students entering the workforce tend to follow office protocol and lack the confidence to introduce technologies learned in the curriculum. This project was designed to help students gain confidence in newer skills and encourage private practice settings to adopt newer technologies for patient care. Our program recently introduced air polishing earlier in the program, before the rubber cup technique, to determine whether students would embrace the technology and become leading-edge professionals when they enter the marketplace. Methods: The class of 2022 was taught the traditional method of polishing in the first-year curriculum and air polishing in the second-year curriculum. The class of 2023 will be taught the air polishing method in the first-year curriculum and the traditional method of polishing in the second-year curriculum. Pre- and post-graduation survey data will be collected from both cohorts. Descriptive statistics and paired t-tests (alpha = .05) comparing pre- and post-survey results will be used to assess the data. Results: This study is currently in progress, with a completion date of October 2023. The class of 2022 completed the pre-graduation survey in the spring of 2022; the post-graduation survey will be sent out in October 2022. The class of 2023 cohort will be surveyed in the spring of 2023 and in October 2023. Conclusion: Our hypothesis is that students who are taught air polishing first will be more inclined to adopt that skill in private practice, thereby embracing newer technology and improving oral health care.
Keywords: dental hygiene, air polishing, preventive therapy, curriculum
Procedia PDF Downloads 103
2762 A 500 MWₑ Coal-Fired Power Plant Operated under Partial Oxy-Combustion: Methodology and Economic Evaluation
Authors: Fernando Vega, Esmeralda Portillo, Sara Camino, Benito Navarrete, Elena Montavez
Abstract:
The European Union aims to strongly reduce its CO₂ emissions from the energy and industrial sectors by 2030. The energy sector contributes more than two-thirds of the CO₂ emission share derived from anthropogenic activities. Although efforts are mainly focused on the use of renewables in the energy production sector, carbon capture and storage (CCS) remains a frontline option for reducing CO₂ emissions from industrial processes, particularly from fossil-fuel power plants and cement production. Among the most feasible and near-to-market CCS technologies, namely post-combustion and oxy-combustion, partial oxy-combustion is a novel concept that can potentially reduce the overall energy requirements of the CO₂ capture process. This technology consists of using a higher oxygen content in the oxidizer, which should increase the CO₂ concentration of the flue gas once the fuel is burnt. The CO₂ is then separated from the flue gas downstream by means of a conventional CO₂ chemical absorption process. The production of a more CO₂-concentrated flue gas should enhance CO₂ absorption into the solvent, leading to further improvements in CO₂ separation performance in terms of solvent flow-rate, equipment size, and the energy penalty related to solvent regeneration. This work evaluates a portfolio of CCS technologies applied to fossil-fuel power plants. For this purpose, an economic evaluation methodology was developed in detail to determine the main economic parameters for CO₂ emission removal, such as the levelized cost of electricity (LCOE) and the CO₂ captured and avoided costs. ASPEN Plus™ software was used to simulate the main units of the power plant and solve the energy and mass balances. Capital and investment costs were determined from the purchased cost of equipment, together with engineering costs and project and process contingencies. The annual capital cost and operating and maintenance costs were then obtained.
A complete energy balance was performed to determine the net power produced in each case. The baseline case is a supercritical 500 MWₑ coal-fired power plant using anthracite as fuel, without any CO₂ capture system. Four cases were proposed: conventional post-combustion capture, oxy-combustion, and partial oxy-combustion using two levels of oxygen-enriched air (40% v/v and 75% v/v). A CO₂ chemical absorption process using monoethanolamine (MEA) was used for CO₂ separation, whereas the O₂ requirement was met using a conventional air separation unit (ASU) based on Linde's cryogenic process. Results showed a 15% reduction in the total investment cost of the CO₂ separation process when partial oxy-combustion was used. Oxygen-enriched air production also almost halved the investment costs required for the ASU in comparison with the oxy-combustion cases. Partial oxy-combustion has a significant impact on the performance of both CO₂ separation and O₂ production technologies, and it can lead to further energy reductions through new developments in both CO₂ and O₂ separation processes.
Keywords: carbon capture, cost methodology, economic evaluation, partial oxy-combustion
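As a rough illustration of the two economic metrics mentioned above (LCOE and the cost of CO₂ avoided), the sketch below applies their standard definitions to invented plant figures; the capex, O&M, fuel, output and emission numbers are placeholders, not results of the study:

```python
# Sketch of the headline economic metrics used in CCS evaluations.
# All inputs are hypothetical placeholders, not the study's results.

def annuity_factor(rate, years):
    """Capital recovery factor used to annualise the investment cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex, fixed_om, fuel_cost, net_mwh, rate=0.08, years=25):
    """Levelised cost of electricity in $/MWh."""
    annual_capital = capex * annuity_factor(rate, years)
    return (annual_capital + fixed_om + fuel_cost) / net_mwh

def co2_avoided_cost(lcoe_cap, lcoe_ref, emis_cap, emis_ref):
    """(LCOE_capture - LCOE_ref) / (tCO2/MWh_ref - tCO2/MWh_capture)."""
    return (lcoe_cap - lcoe_ref) / (emis_ref - emis_cap)

# Hypothetical inputs: reference plant vs. plant with capture
# (higher capex, lower net output due to the energy penalty).
lcoe_ref = lcoe(capex=1.2e9, fixed_om=4.0e7, fuel_cost=9.0e7, net_mwh=3.5e6)
lcoe_cap = lcoe(capex=1.8e9, fixed_om=6.0e7, fuel_cost=1.0e8, net_mwh=2.9e6)
avoided = co2_avoided_cost(lcoe_cap, lcoe_ref, emis_cap=0.12, emis_ref=0.80)

print(f"LCOE ref: {lcoe_ref:.1f} $/MWh, LCOE capture: {lcoe_cap:.1f} $/MWh")
print(f"cost of CO2 avoided: {avoided:.1f} $/tCO2")
```

The avoided cost is computed against the reference plant rather than per tonne captured, which is why it always exceeds the captured cost when the capture plant produces less net power.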
Procedia PDF Downloads 147
2761 Good Practices for Model Structure Development and Managing Structural Uncertainty in Decision Making
Authors: Hossein Afzali
Abstract:
Increasingly, decision analytic models are used to inform decisions about whether or not to publicly fund new health technologies. It is well noted that the accuracy of model predictions is strongly influenced by the appropriateness of the model structure. However, there is relatively little methodological guidance on this issue in the guidelines developed by national funding bodies such as the Australian Pharmaceutical Benefits Advisory Committee (PBAC) and the National Institute for Health and Care Excellence (NICE) in the UK. This presentation aims to discuss issues around model structuring within decision making, with a focus on (1) the need for a transparent and evidence-based model structuring process to inform the most appropriate set of structural aspects for the base case analysis; and (2) the need to appropriately characterise structural uncertainty wherever alternative plausible structural assumptions or judgements exist. The presentation will provide an opportunity to share ideas and experiences on how the guidelines developed by national funding bodies address the above issues and to identify areas for further improvement. First, a review and analysis of the literature and of the guidelines developed by PBAC and NICE will be provided. It will then be discussed how issues around model structuring (including structural uncertainty) are often not handled or justified in a systematic way within the decision-making process, the potential impact of this on the quality of public funding decisions, and how model structuring should be presented in submissions to national funding bodies. This presentation represents a contribution to good modelling practice within the decision-making process.
Although the presentation focuses on the PBAC and NICE guidelines, the discussion can be applied more widely to many other national funding bodies that use economic evaluation to inform funding decisions but do not transparently address model structuring issues, e.g. the Medical Services Advisory Committee (MSAC) in Australia or the Canadian Agency for Drugs and Technologies in Health.
Keywords: decision-making process, economic evaluation, good modelling practice, structural uncertainty
Procedia PDF Downloads 186
2760 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines
Authors: Alexander Guzman Urbina, Atsushi Aoyama
Abstract:
The sustainability of traditional technologies employed in energy and chemical infrastructure poses a major challenge for society. When making decisions related to the safety of industrial infrastructure, accidental risk values become relevant points for discussion. However, the challenge is the reliability of the models employed to obtain the risk data. Such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome these problems are built using Artificial Intelligence (AI), and more specifically hybrid systems such as Neuro-Fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. As mentioned above, the sustainability of traditional technologies related to energy and chemical infrastructure constitutes one of the major challenges that today's societies and firms are facing. Besides that, the adaptation of those technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, it can be argued that the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by a lack of knowledge about the risks. In addition to the social consequences described above, and considering the industrial sector as critical infrastructure because of the large economic impact of a failure, industrial safety has become a critical issue for society. In response, pipeline operators and regulators have been performing risk assessments in an attempt to accurately evaluate the probabilities of failure of the infrastructure and the consequences associated with those failures.
However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, capable of dealing efficiently with this complexity and uncertainty. The advantage of deep learning on near-miss accident data is that it can be employed in risk assessment as an efficient engineering tool to treat the uncertainty of risk values in complex environments. The basic idea of using a near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating human expertise in scoring risks and setting tolerance levels. In summary, the method involves a regression analysis called the group method of data handling (GMDH), which consists of determining the optimal configuration of the risk assessment model and its parameters using polynomial theory.
Keywords: deep learning, risk assessment, neuro fuzzy, pipelines
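A minimal, hedged sketch of the GMDH idea referenced above: quadratic "partial descriptions" (Ivakhnenko polynomials) are fitted for every pair of inputs and selected by an external criterion on held-out data. The synthetic "risk" data and the single-layer simplification are illustrative assumptions, not the authors' implementation:

```python
# Single-layer GMDH sketch: fit a quadratic partial description for
# each input pair on training data, keep the pair with the lowest
# error on a held-out split (the "external criterion"). Data are
# synthetic, not pipeline risk records.
import numpy as np

rng = np.random.default_rng(0)

def partial_description(x1, x2):
    """Design matrix of the Ivakhnenko polynomial for one input pair."""
    return np.column_stack(
        [np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# Synthetic "risk" target driven only by features 0 and 2.
X = rng.normal(size=(200, 4))
y = 0.5 + 1.5 * X[:, 0] - 0.8 * X[:, 2] + 0.3 * X[:, 0] * X[:, 2]

train, valid = slice(0, 150), slice(150, 200)
best = None
for i in range(X.shape[1]):
    for j in range(i + 1, X.shape[1]):
        A = partial_description(X[train, i], X[train, j])
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        Av = partial_description(X[valid, i], X[valid, j])
        err = np.mean((Av @ coef - y[valid]) ** 2)  # external criterion
        if best is None or err < best[0]:
            best = (err, (i, j))

print("selected input pair:", best[1], "validation MSE:", best[0])
```

A full GMDH stacks such layers, feeding the surviving partial descriptions of one layer as inputs to the next until the external criterion stops improving.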
Procedia PDF Downloads 292
2759 Personalizing Human Physical Life Routines Recognition over Cloud-Based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) based on body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. On the other hand, personalizing human life routines using numerous machine-learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns; however, they still need improvement in anticipating the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting those challenging activities from multi-fused sensors. Furthermore, numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmarked datasets. First, the acquired raw data is filtered with z-normalization and denoising methods. Then, statistical, local binary pattern, auto-regressive model, and intrinsic time-scale decomposition features are extracted from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, we attained a 90.27% recognition rate for the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
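To make the front end of the pipeline concrete, the sketch below shows z-normalization of one synthetic sensor window followed by a few of the statistical features mentioned; the later stages (LBP, AR model, ITD, mRMR, the ANN classifier) are omitted, and the signal is invented:

```python
# Hedged sketch of the pipeline's first stages: z-normalize a raw
# sensor window, then compute simple window-level statistical
# features. The signal is synthetic, not MEMS data from the paper.
import math
import random

random.seed(7)
# Synthetic one-axis sensor window: slow oscillation plus noise.
window = [math.sin(0.1 * t) + random.gauss(0, 0.05) for t in range(128)]

def z_normalize(x):
    """Zero-mean, unit-variance scaling of one sensor window."""
    mu = sum(x) / len(x)
    sd = math.sqrt(sum((v - mu) ** 2 for v in x) / len(x))
    return [(v - mu) / sd for v in x]

def statistical_features(x):
    """A few window-level descriptors often fed to a classifier."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    return {
        "mean": mu,
        "variance": var,
        "rms": math.sqrt(sum(v * v for v in x) / n),
        "zero_crossings": sum(1 for a, b in zip(x, x[1:]) if a * b < 0),
    }

features = statistical_features(z_normalize(window))
print({k: round(v, 3) for k, v in features.items()})
```

After normalization the mean and variance features are trivially 0 and 1, which is exactly why feature selection steps such as mRMR are needed to discard redundant descriptors before classification.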
Procedia PDF Downloads 20
2758 Fault Tolerant and Testable Designs of Reversible Sequential Building Blocks
Authors: Vishal Pareek, Shubham Gupta, Sushil Chandra Jain
Abstract:
With increasing demand for high-speed computation, power consumption, heat dissipation and chip size are posing challenges for logic design with conventional technologies. Recovery from bit loss and bit errors is another issue, requiring reversibility and fault tolerance in the computation. Reversible computing is emerging as an alternative to conventional technologies to overcome the above problems and is helpful in diverse areas such as low-power design, nanotechnology, and quantum computing. The bit-loss issue can be solved through a unique input-output mapping, which requires reversibility, while the bit-error issue requires fault tolerance in the design. To incorporate reversibility, a number of combinational reversible-logic-based circuits have been developed. However, very few sequential reversible circuits have been reported in the literature. To make circuits fault tolerant, a number of fault models and test approaches have been proposed for reversible logic. In this paper, we have attempted to incorporate fault tolerance in sequential reversible building blocks such as the D flip-flop, T flip-flop, JK flip-flop, R-S flip-flop, master-slave D flip-flop, and double edge-triggered D flip-flop by making them parity preserving. The importance of this proposed work lies in the fact that it provides designs of reversible sequential circuits that are completely testable for any stuck-at fault and single-bit fault. In our opinion, our designs of reversible building blocks are superior to existing designs in terms of quantum cost, hardware complexity, constant inputs, garbage outputs, and number of gates, and a design for an online-testable D flip-flop is proposed for the first time. We hope our work can be extended to building complex reversible sequential circuits.
Keywords: parity preserving gate, quantum computing, fault tolerance, flip-flop, sequential reversible logic
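The two properties these designs rely on can be checked mechanically: reversibility (the gate's truth table is a bijection) and parity preservation (the XOR of the inputs equals the XOR of the outputs). The sketch below verifies both for the standard Fredkin (controlled-swap) gate, used here as a textbook example rather than one of the paper's proposed flip-flop designs:

```python
# Check reversibility and parity preservation of a 3-bit gate by
# enumerating its truth table. The Fredkin gate is a well-known
# parity-preserving reversible gate.
from itertools import product

def fredkin(a, b, c):
    """Controlled swap: if a == 1, outputs b and c are exchanged."""
    return (a, c, b) if a else (a, b, c)

def parity(bits):
    p = 0
    for bit in bits:
        p ^= bit
    return p

table = {inp: fredkin(*inp) for inp in product((0, 1), repeat=3)}

# Reversible: every output pattern occurs exactly once (bijection).
is_reversible = len(set(table.values())) == len(table)
# Parity preserving: input parity equals output parity for all rows.
is_parity_preserving = all(
    parity(i) == parity(o) for i, o in table.items())

print("reversible:", is_reversible,
      "parity preserving:", is_parity_preserving)
```

Parity preservation is what makes any single-bit fault observable: a flipped bit changes the output parity, so comparing input and output parity detects the fault online.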
Procedia PDF Downloads 545
2757 Opportunities and Challenges for Decarbonizing Steel Production by Creating Markets for ‘Green Steel’ Products
Authors: Hasan Muslemani, Xi Liang, Kathi Kaesehage, Francisco Ascui, Jeffrey Wilson
Abstract:
The creation of a market for lower-carbon steel products, here called ‘green steel’, has been identified as an important means of supporting the introduction of breakthrough emission reduction technologies into the steel sector. However, the definition of what ‘green’ entails in the context of steel production, the implications for the competitiveness of green steel products in local and international markets, and the market mechanisms necessary to support their successful market penetration remain poorly explored. This paper addresses this gap through semi-structured interviews with international sustainability experts and commercial managers from leading steel trade associations, research institutes and steelmakers. Our findings show that there is an urgent need to establish a set of standards to define what ‘greenness’ means in the steelmaking context; standards that avoid market disruptions, unintended consequences, and opportunities for greenwashing. We also highlight that the introduction of green steel products will have implications for product competitiveness on three different levels: 1) between primary and secondary steelmaking routes, 2) with traditional, less green steel, and 3) with other substitutable materials (e.g. cement and plastics). This paper emphasises the need for steelmakers to adopt a transitional approach in deploying different low-carbon technologies, based on their stage of technological maturity, applicability in certain country contexts, capacity to reduce emissions over time, and the ability of the investment community to support their deployment. We further identify market mechanisms to support green steel production, including carbon border adjustments and public procurement, highlighting the need to implement a combination of complementary policies to ensure the products’ roll-out.
The study further shows that the auto industry is a likely candidate for green steel consumption, where a market would be supported by price premiums paid by willing consumers, such as buyers of high-end luxury vehicles.
Keywords: green steel, decarbonisation, business model innovation, market analysis
Procedia PDF Downloads 133
2756 Human Factors Interventions for Risk and Reliability Management of Defence Systems
Authors: Chitra Rajagopal, Indra Deo Kumar, Ila Chauhan, Ruchi Joshi, Binoy Bhargavan
Abstract:
Reliability and safety are essential for the success of mission-critical and safety-critical defence systems. Humans are part of the entire life cycle of defence systems development and deployment, and the majority of industrial accidents or disasters are attributed to human error. Considerations of human performance and human reliability are therefore critical in all complex systems, including defence systems. Defence systems operate from ground, naval and aerial platforms, and their diverse operating conditions impose unique physical and psychological challenges on human operators. Some of the safety- and mission-critical defence systems with human-machine interaction are fighter planes, submarines, warships, combat vehicles, and missiles launched from aerial and naval platforms. Human roles and responsibilities are also going through a transition due to the infusion of artificial intelligence and cyber technologies. Human operators not accustomed to such challenges are more likely to commit errors, which may lead to accidents or loss events. In such a scenario, it is imperative to understand the human factors in defence systems for better system performance, safety, and cost-effectiveness. A case study using a Task Analysis (TA) based methodology for the assessment and reduction of human errors in an air and missile defence system, in the context of emerging technologies, was presented. Action-oriented task analysis techniques such as Hierarchical Task Analysis (HTA) and the Operator Action Event Tree (OAET), along with the Critical Action and Decision Event Tree (CADET) for cognitive task analysis, were used. Human factors assessment based on task analysis helps in realizing safe and reliable defence systems. These techniques helped in identifying human errors during different phases of air and missile defence operations, helping to meet the requirements of a safe, reliable and cost-effective mission.
Keywords: defence systems, reliability, risk, safety
Procedia PDF Downloads 135
2755 Digital Transformation in Education: Artificial Intelligence Awareness of Preschool Teachers
Authors: Cansu Bozer, Saadet İrem Turgut
Abstract:
Artificial intelligence (AI) has become one of the most important technologies of the digital age and is transforming many sectors, including education. The advantages offered by AI, such as automation, personalised learning, and data analytics, create new opportunities for both teachers and students in education systems. Preschool education plays a fundamental role in the cognitive, social, and emotional development of children. In this period, the foundations of children's creative thinking, problem-solving, and critical thinking skills are laid. Educational technologies, especially artificial intelligence-based applications, are thought to contribute to the development of these skills. For example, artificial intelligence-supported digital learning tools can support learning processes by offering activities that can be customised according to the individual needs of each child. However, the successful use of artificial intelligence-based applications in preschool education can be realised under the guidance of teachers who have the right knowledge about this technology. Therefore, it is of great importance to measure preschool teachers' awareness levels of artificial intelligence and to understand which variables affect this awareness. The aim of this study is to measure preschool teachers' awareness levels of artificial intelligence and to determine which factors are related to this awareness. In line with this purpose, teachers' level of knowledge about artificial intelligence, their thoughts about the role of artificial intelligence in education, and their attitudes towards artificial intelligence will be evaluated. The study will be conducted with 100 teachers working in Turkey using a descriptive survey model. In this context, ‘Artificial Intelligence Awareness Level Scale for Teachers’ developed by Ferikoğlu and Akgün (2022) will be used. The collected data will be analysed using SPSS (Statistical Package for the Social Sciences) software. 
Descriptive statistics (frequency, percentage, mean, standard deviation) and relationship analyses (correlation and regression) will be used in the data analysis. As a result of the study, the level of artificial intelligence awareness of preschool teachers will be determined, and the factors affecting this awareness will be identified. The findings will contribute to identifying measures that can be taken to increase artificial intelligence awareness in preschool education.
Keywords: education, child development, artificial intelligence, preschool teachers
Procedia PDF Downloads 19
2754 Investigation of Delivery of Triple Play Data in GE-PON Fiber to the Home Network
Authors: Ashima Anurag Sharma
Abstract:
Optical fiber based networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This research paper aims to show the simultaneous delivery of triple-play services (data, voice, and video). A comparison between various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be served decreases due to an increase in the bit error rate.
Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT
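The reported trend (higher data rate, higher BER, fewer supportable users) can be illustrated with the Gaussian-noise BER approximation BER = ½·erfc(Q/√2) and a toy model in which receiver noise grows with bit rate; the constants below are illustrative assumptions, not the paper's simulation parameters:

```python
# Back-of-the-envelope sketch of the BER-vs-bit-rate trend: for a
# fixed received power budget, a higher bit rate widens the receiver
# noise bandwidth, lowering Q and raising the BER. Constants are
# illustrative, not simulation parameters from this work.
import math

def ber_from_q(q):
    """Gaussian-noise approximation: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

def q_factor(bit_rate_gbps, snr_at_1gbps=400.0):
    """Toy model: noise power grows linearly with bit rate."""
    return math.sqrt(snr_at_1gbps / bit_rate_gbps)

for rate in (1.25, 2.5, 10.0):
    q = q_factor(rate)
    print(f"{rate:5.2f} Gb/s  Q = {q:5.2f}  BER ~ {ber_from_q(q):.2e}")
```

Once the BER at a given rate exceeds the service threshold (e.g. 1e-9 pre-FEC), the split ratio, and hence the number of users per PON, must be reduced.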
Procedia PDF Downloads 527
2753 Investigation of Delivery of Triple Play Services
Authors: Paramjit Mahey, Monica Sharma, Jasbinder Singh
Abstract:
Fiber based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper aims to show the simultaneous delivery of triple-play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to an increase in the bit error rate.
Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT
Procedia PDF Downloads 541
2752 Safe and Scalable Framework for Participation of Nodes in Smart Grid Networks in a P2P Exchange of Short-Term Products
Authors: Maciej Jedrzejczyk, Karolina Marzantowicz
Abstract:
The traditional utility value chain has been transformed over the last few years into unbundled markets. Increased distributed generation of energy is one of the considerable challenges faced by Smart Grid networks. New sources of energy introduce a volatile demand response, which has a considerable impact on traditional middlemen in the E&U market. The purpose of this research is to search for ways to allow near-real-time electricity markets to transact surplus energy based on accurate, time-synchronous measurements. The proposed framework evaluates the use of secure peer-to-peer (P2P) communication and distributed transaction ledgers to provide a flat hierarchy and to allow real-time insights into present and forecasted grid operations, as well as into the state and health of the network. The objective is to achieve dynamic grid operations with more efficient resource usage, higher security of supply and a longer grid infrastructure life cycle. The methods used in this study are based on a comparative analysis of different distributed ledger technologies in terms of scalability, transaction performance, pluggability with external data sources, data transparency, privacy, end-to-end security and adaptability to various market topologies. The intended output of this research is the design of a framework for a safer, more efficient and scalable Smart Grid network, bridging the gap between traditional components of the energy network and individual energy producers. The results of this study are ready for detailed measurement testing, a likely follow-up in separate studies. New Smart Grid platforms achieving measurable efficiencies will allow the development of new types of grid KPIs, multi-smart-grid branches, markets, and businesses.
Keywords: autonomous agents, distributed computing, distributed ledger technologies, large-scale systems, microgrids, peer-to-peer networks, self-organization, self-stabilization, smart grids
Procedia PDF Downloads 300
2751 Policy Views of Sustainable Integrated Solution for Increased Synergy between Light Railways and Electrical Distribution Network
Authors: Mansoureh Zangiabadi, Shamil Velji, Rajendra Kelkar, Neal Wade, Volker Pickert
Abstract:
The EU has set itself a long-term goal of reducing greenhouse gas emissions to 80-95% below 1990 levels by 2050, as set out in the Energy Roadmap 2050. This paper reports on the European Union H2020-funded E-Lobster project, which demonstrates tools and technologies, software and hardware, for integrating the grid distribution and railway power systems with power electronics technologies (Smart Soft Open Point - sSOP) and local energy storage. In this context, this paper describes the existing policies and regulatory frameworks of the energy market at the European level, with a special focus at the national level on the countries where the members of the consortium are located and where the demonstration activities will be implemented. Taking into account the disciplinary approach of E-Lobster, the main policy areas investigated include electricity, the energy market, energy efficiency, transport and smart cities. Energy storage will play a key role in enabling the EU to develop a low-carbon electricity system. In recent years, Energy Storage Systems (ESSs) have been gaining importance due to emerging applications, especially the electrification of the transportation sector and the grid integration of volatile renewables. The need for storage systems has led to performance improvements and a significant price decline in ESS technologies. This opens a new market where ESSs can be a reliable and economical solution. One such emerging market for ESSs is R+G management, which will be investigated and demonstrated within the E-Lobster project. The surplus of energy in one type of power system (e.g., due to metro braking) might be directly transferred to the other power system (or vice versa). However, this would usually happen at unfavourable instants, when the recipient does not need additional power. Thus, the role of the ESS is to enhance the advantages coming from the interconnection of railway power systems and distribution grids by offering an additional energy buffer.
Consequently, the surplus/deficit of energy in, e.g., railway power systems does not have to be immediately transferred to/from the distribution grid; it can be stored and used when it is really needed. This will assure better management of the energy exchange between the railway power systems and distribution grids and lead to more efficient loss reduction. In this framework, identifying the existing policies and regulatory frameworks is crucial for the project activities and for the future development of business models for the E-Lobster solutions. The projections carried out by the European Commission, the Member States and stakeholders, and their analysis, indicate some of the trends, challenges, opportunities and structural changes needed to design the policy measures that provide the appropriate framework for investors. This study will be used as a reference for the discussion in the envisaged workshops with stakeholders (DSOs and Transport Managers) in the E-Lobster project.
Keywords: light railway, electrical distribution network, electrical energy storage, policy
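The buffering principle described above, absorbing railway surplus when the grid does not need it and releasing it on deficit, can be sketched as a toy model. This is an illustrative sketch only, not the E-Lobster implementation; the `EnergyBuffer` class and all figures are hypothetical.

```python
# Illustrative sketch (not from the E-Lobster project): a simple energy buffer
# that absorbs surplus from a railway power system (e.g., metro braking) and
# releases it when the distribution grid actually needs it.

class EnergyBuffer:
    """Minimal ESS model: capacity and current charge in kWh."""
    def __init__(self, capacity_kwh):
        self.capacity = capacity_kwh
        self.charge = 0.0

    def store(self, surplus_kwh):
        """Store surplus energy; return the amount that could not be absorbed."""
        absorbed = min(surplus_kwh, self.capacity - self.charge)
        self.charge += absorbed
        return surplus_kwh - absorbed

    def release(self, demand_kwh):
        """Release energy toward a demand; return the amount actually supplied."""
        supplied = min(demand_kwh, self.charge)
        self.charge -= supplied
        return supplied

# Surplus from braking is buffered instead of being pushed to the grid at an
# unfavourable instant, then drawn down later when the grid reports a deficit.
buffer = EnergyBuffer(capacity_kwh=100.0)
spilled = buffer.store(60.0)      # metro braking surplus
supplied = buffer.release(40.0)   # later grid deficit
```

In this sketch the 20 kWh left in the buffer represents energy that would otherwise have been exported at an unfavourable instant or dissipated.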
Procedia PDF Downloads 135
2750 Human-Automation Interaction in Law: Mapping Legal Decisions and Judgments, Cognitive Processes, and Automation Levels
Authors: Dovile Petkeviciute-Barysiene
Abstract:
Legal technologies not only create new ways of accessing and providing legal services but also transform the role of legal practitioners. Both lawyers and users of legal services expect automated solutions to outperform people in objectivity and impartiality. Although the fairness of automated decisions is crucial, research on assessing the various characteristics of automated processes related to perceived fairness has only begun. One of the major obstacles to this research is the lack of a comprehensive understanding of which legal actions are automated or could be meaningfully automated, and to what extent. Oftentimes, neither the public nor legal practitioners can envision technological input, due to a lack of general understanding and illustrative examples. The aim of this study is to map the decision-making stages and automation levels which are and/or could be achieved in legal actions related to pre-trial and trial processes. Major legal decisions and judgments were identified during consultations with legal practitioners. The dual-process model of information processing is used to describe the cognitive processes taking place while making legal decisions and judgments during pre-trial and trial actions. Some of the existing legal technologies are incorporated into the analysis as well. Several published automation level taxonomies were considered, but none of them fit well into the legal context, as they were all created for avionics, teleoperation, unmanned aerial vehicles, etc. From the information processing perspective, the analysis of legal decisions and judgments exposes the situations that are most sensitive to cognitive bias and, among other things, helps to identify the areas that would benefit from automation the most. Automation level analysis, in turn, provides a systematic approach to the interaction and cooperation between humans and algorithms.
Moreover, an integrated map of legal decisions and judgments, information processing characteristics, and automation levels all together provides some groundwork for research on the perceived fairness and acceptance of legal technology. Acknowledgment: This project has received funding from the European Social Fund (project No 09.3.3-LMT-K-712-19-0116) under a grant agreement with the Research Council of Lithuania (LMTLT).
Keywords: automation levels, information processing, legal judgment and decision making, legal technology
Procedia PDF Downloads 142
2749 Microbial Bioproduction with Design of Metabolism and Enzyme Engineering
Authors: Tomokazu Shirai, Akihiko Kondo
Abstract:
Technologies of metabolic engineering and synthetic biology are essential for effective microbial bioproduction. It is especially important to develop in silico tools for designing metabolic pathways that produce unnatural and valuable chemicals, such as the fuels or plastics currently derived from fossil resources. We here demonstrate two in silico tools for designing novel metabolic pathways: BioProV and HyMeP. Furthermore, we succeeded in creating an artificial metabolic pathway by enzyme engineering.
Keywords: bioinformatics, metabolic engineering, synthetic biology, genome scale model
Procedia PDF Downloads 339
2748 Metagenomic Analysis of Irish Cattle Faecal Samples Using Oxford Nanopore MinION Next Generation Sequencing
Authors: Niamh Higgins, Dawn Howard
Abstract:
The Irish agri-food sector is of major importance to Ireland’s manufacturing sector and to the Irish economy, through employment and the export of animal products worldwide. Infectious diseases and parasites have an impact on farm animal health, affecting profitability and productivity. For the sustainability of Irish dairy farming, there must be the highest standard of animal health. Culture-based methods of microbial identification can leave more than 99% of the complete microbial diversity in an environment unaccounted for, and they tend to overestimate the prevalence of species which grow easily on an agar surface. New technologies are needed to address these issues and assist with animal health. Metagenomic approaches provide information on both the whole genome and the transcriptome present, through DNA sequencing of total DNA from environmental samples, yielding detailed functional and taxonomic information. Nanopore platforms are powerful Next Generation Sequencing technologies: they provide high throughput and low material requirements, and produce ultra-long reads, simplifying the experimental process. The aim of this study is to use a metagenomics approach to analyze dairy cattle faecal samples using the Oxford Nanopore MinION Next Generation Sequencer and to establish an in-house pipeline for the metagenomic characterization of complex samples. Faecal samples will be obtained from Irish dairy farms, DNA extracted, and the MinION used for sequencing, followed by bioinformatics analysis. Of particular interest will be the parasite Buxtonella sulcata, on which there has been little research, and none on its presence on Irish dairy farms. Preliminary results have shown the ability of the MinION to produce hundreds of reads in a relatively short time frame of eight hours. The faecal samples were obtained from 90 dairy cows on a Galway farm.
The results from Oxford Nanopore’s ‘What’s in my pot’ (WIMP) analysis, using the EPI2ME workflow, show that of a total of 926 classified reads, 87% were from the Kingdom Bacteria, 10% from the Kingdom Eukaryota, 3% from the Kingdom Archaea and < 1% from the Kingdom Viruses. The most prevalent bacteria were those from the genera Acholeplasma (71 reads), Bacteroides (35 reads), Clostridium (33 reads) and Acinetobacter (20 reads). The most prevalent species were those from the genus Acholeplasma, including Acholeplasma laidlawii (39 reads) and Acholeplasma brassicae (26 reads). The preliminary results show the ability of the MinION to identify microorganisms to species level from a complex sample. With ongoing optimization of the pipeline, the number of classified reads is likely to increase. Metagenomics has potential in animal health for the diagnostics of microorganisms present on farms. This would support a prevention-rather-than-cure approach, as outlined in the DAFM’s National Farmed Animal Health Strategy 2017-2022.
Keywords: animal health, Buxtonella sulcata, infectious disease, Irish dairy cattle, metagenomics, MinION, next generation sequencing
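The kingdom-level breakdown reported above can be reproduced from classified read counts with a few lines of post-processing. This is a hypothetical sketch, not part of the EPI2ME workflow itself; the per-kingdom read counts below are back-calculated from the reported percentages of the 926 classified reads and are illustrative only.

```python
# Hypothetical post-processing of a WIMP/EPI2ME-style classification table:
# compute kingdom-level percentage shares from classified read counts.
# Counts are back-calculated from the abstract's percentages (926 reads total).
classified_reads = {"Bacteria": 806, "Eukaryota": 93, "Archaea": 25, "Viruses": 2}

total = sum(classified_reads.values())                      # 926 classified reads
shares = {k: round(100 * n / total) for k, n in classified_reads.items()}
# shares: Bacteria 87, Eukaryota 10, Archaea 3, Viruses 0 (i.e. < 1%)
```

The rounded shares match the abstract's reported 87% / 10% / 3% / < 1% split.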
Procedia PDF Downloads 150
2747 Humanizing Industrial Architecture: When Form Meets Function and Emotion
Authors: Sahar Majed Asad
Abstract:
Industrial structures have historically focused on functionality and efficiency, often disregarding aesthetics and human experience. However, a new approach is emerging that prioritizes humanizing industrial architecture and creating spaces that promote well-being, sustainability, and social responsibility. This study explores the motivations and design strategies behind this shift towards more human-centered industrial environments, providing practical guidance for architects, designers, and other stakeholders interested in incorporating these principles into their work. Through in-depth interviews with architects, designers, and industry experts, as well as a review of relevant literature, this study uncovers the reasons for this change in industrial design. The findings reveal that this shift is driven by a desire to create environments that prioritize the needs and experiences of the people who use them. The study identifies strategies such as incorporating natural elements, flexible design, and advanced technologies as key to achieving human-centric industrial design. It also emphasizes that effective communication and collaboration among stakeholders are crucial for successful human-centered design outcomes. This paper provides a comprehensive analysis of the motivations and design strategies behind the humanization of industrial architecture. It begins by examining the history of industrial architecture, highlighting its focus on functionality and efficiency. The paper then explores the emergence of human-centered design principles in industrial architecture, discussing the benefits of this approach, including creating more sustainable and socially responsible environments. The paper explains specific design strategies that prioritize the human experience of industrial spaces. It outlines how incorporating natural elements like greenery and natural lighting can create more visually appealing and comfortable environments for industrial workers.
Flexible design solutions, such as movable walls and modular furniture, can make spaces more adaptable to changing needs and promote a sense of ownership and creativity among workers. Advanced technologies, such as sensors and automation, can improve the efficiency and safety of industrial spaces while also enhancing the human experience. To provide practical guidance, the paper offers recommendations for incorporating human-centered design principles into industrial structures. It emphasizes the importance of understanding the needs and experiences of the people who use these spaces and provides specific examples of how natural elements, flexible design, and advanced technologies can be incorporated into industrial structures to promote human well-being. In conclusion, this study demonstrates that the humanization of industrial architecture is a growing trend that offers tremendous potential for creating more sustainable and socially responsible built environments. By prioritizing the human experience of industrial spaces, designers can create environments that promote well-being, sustainability, and social responsibility. This research study provides practical guidance for architects, designers, and other stakeholders interested in incorporating human-centered design principles into their work, demonstrating that a human-centered approach can lead to functional and aesthetically pleasing industrial spaces that promote human well-being and contribute to a better future for all.
Keywords: human-centered design, industrial architecture, sustainability, social responsibility
Procedia PDF Downloads 161
2746 Technical Games Using ICT as a Preparation for Teaching about Technology in Pre-School Age
Authors: Pavlína Částková, Jiří Kropáč, Jan Kubrický
Abstract:
The paper deals with the current issue of Information and Communication Technologies and their implementation in the educational activities of preschool children. The issue is addressed in the context of technical education and the specifics of its implementation in a kindergarten. One of the main topics of this paper is the technical game activity of a preschool child and its possibilities, benefits and risks. The paper presents games/toys as one of the means of exploring and understanding technology as an essential part of human culture.
Keywords: ICT, technical education, pre-school age, technical games
Procedia PDF Downloads 434
2745 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method
Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek
Abstract:
Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., Kolmogorov’s scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases on the order of billions of solution points. Running big simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially with a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It extracts from the mesh file the minimum amount of information required for the DSEM code to start in parallel and stores it in text files (pre-files). It packs integer-type information in a stream binary format in pre-files that are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in such a way that each MPI rank acquires its information from the file in parallel.
In the case of GPFS, on each computational node a single MPI rank reads data from the file, which is generated specifically for that node, and sends them to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory’s Mira (GPFS), the National Center for Supercomputing Applications’ Blue Waters (Lustre), the San Diego Supercomputer Center’s Comet (Lustre), and UIC’s Extreme (Lustre). The tests showed that one file per node is suited to GPFS, and that parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for the calculation of the solution in every time step. For this, the code can make use of either its own matrix math library, BLAS, Intel MKL, or ATLAS. This fact and the discontinuous nature of the method make the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed scalable and efficient performance of the code in parallel computing.
Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow
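The linear algebra kernel mentioned above, a dense matrix product applied to the solution points at every time step, can be illustrated with a minimal sketch. This is not the DSEM code: NumPy stands in for the BLAS/MKL/ATLAS back ends, and the operator and solution values are placeholders.

```python
# Illustrative only: the heavy kernel in spectral element solvers is a dense
# matrix-matrix product, e.g. applying a derivative operator D to solution
# values U stored column-per-element. NumPy dispatches `@` to a BLAS gemm,
# standing in for the BLAS/Intel MKL/ATLAS options mentioned in the abstract.
import numpy as np

p = 4                     # polynomial order -> p + 1 solution points per direction
n_elem = 10               # number of elements in this rank's partition
rng = np.random.default_rng(0)

D = rng.standard_normal((p + 1, p + 1))   # 1D derivative operator (placeholder values)
U = rng.standard_normal((p + 1, n_elem))  # solution points, one column per element

dU = D @ U                # one BLAS-level matrix-matrix product per time step
```

Because each element's columns are independent, such products parallelize naturally across ranks, which is part of why the discontinuous formulation scales well.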
Procedia PDF Downloads 133
2744 Technologies of Factory Farming: An Exploration of Ongoing Confrontations with Farm Animal Sanctuaries
Authors: Chetna Khandelwal
Abstract:
This research aims to study the contentions that Farm Animal Sanctuaries pose to human-animal relationships in modernity, which have developed as a result of the globalisation of the meat industry and advancements in technology. The sociological history of human-animal relationships in farming is contextualised in order to set a foundation for the follow-up examination of the challenges that Farm Animal Sanctuaries pose to existing human-(farm)animal relationships. The methodology was influenced by relativism, and the method involved three semi-structured small-group interviews conducted at sanctuary locations. The sample was chosen through purposive sampling and varied by the location and size of the sanctuary. The data collected were transcribed and qualitatively coded to generate themes. The findings revealed that sanctuary contentions to the human-animal relationships established by factory farming could be divided into four broad categories: revealing the horrors of factory farming (uncovering power relations in agribusiness); transforming relationships with animals (letting them emotionally heal in accordance with their individual personalities and treating them as partial pets); educating the public about welfare conditions on factory farms as well as animal sentience, through practical experience or positive imagery of farm animals; and addressing the retaliation made by agribusiness in the form of technologies or discursive strategies. Hence, this research concludes that the human-animal relationship in current times has been characterised by (ideological and physical) distance from farm animals, commodification due to the increased chasing of profits over welfare, and exploitation using technological advancements, creating unequal power dynamics that rid animals of any agency. Challenges to this relationship can be influenced by the local population around the sanctuary, but are not so dependent upon its size.
This research can benefit from further academic exploration into farm animal sanctuaries and their role in feminist animal rights activism, to enrich the ongoing fight against intensive farming.
Keywords: animal rights, factory farming, farm animal sanctuaries, human-animal relationships
Procedia PDF Downloads 137
2743 Investment Development Path and Motivations for Foreign Direct Investment in Georgia
Authors: Vakhtang Charaia, Mariam Lashkhi
Abstract:
Foreign direct investment (FDI) plays a vital role in global business. It provides firms with new markets and advertising channels, cheaper production facilities, and access to new technology, products, skills and financing. FDI can provide a recipient country or company with a source of new technologies, capital, practices, products, and management skills, and as such can be a powerful driver of economic development. It is one of the key elements of stable economic development in many countries, especially developing ones; therefore, the size of FDI inflow is one of the most crucial factors for the economic performance of small-economy countries (like Georgia), while most developed countries are net exporters of FDI. Because FDI plays such a significant role in Georgian economic development, the increase in FDI inflows from all over the world to Georgia in the last decade was achieved through the outstanding reforms managed by the Georgian government. However, such important events as the world financial crisis and the Georgian-Russian war left their mark on the overall amount of FDI inflow into Georgia in recent years. It is important to mention that the biggest investor region for Georgia is the EU, which is interested in Georgia not only from an economic point of view but also from a political one. Case studies from the main EU investor countries show that Georgia has large investment potential in different areas, such as the financial sector, energy, construction, the tourism industry, and transport and communications. Moreover, the signing of the Association Agreement between Georgia and the EU will further boost all fields of the economy in Georgia in both the short and long terms. It will attract more investment from different countries, and especially from the EU.
The last but not least important issue is the calculation of the annual FDI inflow to Georgia, which is calculated differently by different organizations based on different methodologies. What is more important is that all of them show a significant increase in FDI in the last decade, which gives a positive signal to investors and underlines the necessity of further improving the investment climate in the same direction.
Keywords: foreign direct investment (FDI), Georgia, investment development path, investment climate
Procedia PDF Downloads 280
2742 An Ethnographic Study of Commercial Surrogacy Industry in India
Authors: Dalia Bhattacharjee
Abstract:
Motherhood as an institution is considered sacred. Reproduction and motherhood have always been a concern of the private space of the home. However, with the emergence of technologies like Assisted Reproductive Technologies (ARTs), this intimate area has moved into the public sphere. A woman can now become a mother through artificial insemination performed by expert medical professionals in a hospital. With this development, the meanings of motherhood and childrearing have altered. Mothers have been divided into ‘ovarian mothers’ (those who provide the eggs), ‘uterine mothers’ (those who carry out the pregnancy and give birth), and ‘social mothers’ (those who raise the child). Thus, the ART business deconstructs motherhood by defining who the biological mother is, who the social mother is, and who, despite contributing parts or processes of her body to the life of the child, is not a mother but merely the donor of a product, be it the egg or the womb, which is owned by those who are favoured by the contract. The commercial surrogacy industry in India was estimated at $2.3 billion as of 2012. Many women work as surrogate mothers in this industry in exchange for money, in what runs like a full-fledged business guided by a highly profit-oriented capitalist market. The reproductive labourers are identified as mere womb-renters or victims and not as active agents in such arrangements. Such a discourse undercuts the agency exercised by the women. The present study is an ethnography of the commercial surrogacy industry in India. This journey furthers the understanding of the dilemmas faced by the reproductive labourers. The paper emphasizes the experiences of reproduction and motherhood outside the private space of the home in the commercial surrogacy industry in India, and argues that this multiplicity of experiences needs much focus and attention, where the consumer becomes ‘the’ citizen and the women workers continue to be victims.
The study draws on the narratives of the reproductive labourers, who remain at the center, and yet at the periphery, of such arrangements. This feminist ethnography is informed by feminist standpoint theory to account for and analyse these varied experiences, which further the understanding of the dilemmas faced by the reproductive labourers.
Keywords: commercial surrogacy, ethnography, motherhood, standpoint theory
Procedia PDF Downloads 240
2741 Experimental Study on Thermomechanical Properties of New-Generation ODS Alloys
Authors: O. Khalaj, B. Mašek, H. Jirková, J. Svoboda
Abstract:
By combining new technologies with an unconventional use of different types of materials, specific mechanical properties and structures of the material can be achieved. Some possibilities are enabled by a combination of powder metallurgy in the preparation of a metal matrix with dispersed stable particles, achieved by mechanical alloying and hot consolidation. This paper examines the thermomechanical properties of a new generation of Oxide Dispersion Strengthened (ODS) alloys within three temperature ranges with specified deformation profiles. The results show that the mechanical properties of the new ODS alloys are significantly affected by thermomechanical treatment.
Keywords: hot forming, ODS alloys, thermomechanical, Fe-Al, Al2O3
Procedia PDF Downloads 280
2740 Nanotechnology in Construction as a Building Security
Authors: Hanan Fayez Hussein
Abstract:
Due to increasing environmental challenges and security problems in the world, such as global warming, storms, and terrorism, humans have discovered new technologies and new materials in order to program daily life. Providing physical and psychological security is one of the primary functions of architecture, so in order to provide security, a building must prevent unauthorized entry and harm to occupants, and reduce the threat of attack by becoming a less attractive target through new technologies such as nanotechnology, which has emerged as a major science and technology focus of the 21st century and will be the next industrial revolution. Nanotechnology is the control of the properties of matter; it deals with structures of a size of 100 nanometers or smaller in at least one dimension and has wide application in various fields. The construction and architecture sectors were among the first to be identified as promising application areas for nanotechnology. The advantages of using nanomaterials in construction are enormous: they promise heightened building security by utilizing the strength of building materials to make our buildings more secure, and they enable the smart home. Access barriers such as walls and windows could incorporate stronger materials benefiting from nano-reinforcement, utilizing nanotubes and nanocomposites to act as a protective cover. Carbon nanotubes, as one nanotechnology application, can be designed to be up to 250 times stronger than steel. Nano-enabled devices and materials offer both enhanced and, in some cases, completely new defence systems. In addition, adding small amounts of carbon nanoparticles to construction materials such as cement, concrete, wood, glass, gypsum, and steel can make these materials act as defence elements. This paper highlights the fact that nanotechnology can impact future global security, and that a building’s envelope can act as a defensive cover for the building, resistant to any threats that attack it.
It then focuses on the effect on construction materials: with nano-additives, concrete can obtain excellent mechanical, chemical, and physical properties with less material, acting as a precautionary shield for the building.
Keywords: nanomaterial, global warming, building security, smart homes
Procedia PDF Downloads 82
2739 Energy Trading for Cooperative Microgrids with Renewable Energy Resources
Authors: Ziaullah, Shah Wahab Ali
Abstract:
Micro-grids equipped with heterogeneous energy resources present the idea of small-scale distributed energy management (DEM). DEM helps in minimizing transmission and operation costs and in managing power flows and peak load demands. Micro-grids are collections of small, independently controllable power-generating units and renewable energy resources. Micro-grids also enable active customer participation by giving customers access to real-time information and control. The capability of fast restoration after faults, the integration of renewable energy resources, and Information and Communication Technologies (ICT) make the micro-grid an ideal system for distributed power systems. Micro-grids can have a bank of energy storage devices. The energy management system of a micro-grid can perform real-time energy forecasting of renewable resources, energy storage elements and controllable loads, making proper short-term schedules to minimize total operating costs. We present a review of existing micro-grid optimization objectives/goals, constraints, solution approaches and tools used for energy management in micro-grids. A cost-benefit analysis of micro-grids reveals that cooperation among different micro-grids can play a vital role in reducing energy import costs and improving system stability. Cooperative micro-grid energy trading is an approach to distributed electrical energy resources that gives local energy demands more control over the optimization of power resources and uses. Cooperation among different micro-grids brings interconnectivity and power trading issues. The literature shows that cooperative micro-grid energy trading remains an open area of research. In this paper, we propose and formulate an efficient energy management/trading module for interconnected micro-grids.
It is believed that this research will open new directions for future energy trading in cooperative/interconnected micro-grids.
Keywords: distributed energy management, information and communication technologies, microgrid, energy management
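As a toy illustration of the trading idea (not the formulation proposed in the paper), micro-grids with surplus can be matched greedily against micro-grids with deficit before anything is imported from the main grid. The `match_trades` helper and the micro-grid names below are hypothetical.

```python
# Toy sketch of cooperative micro-grid trading: surplus micro-grids export to
# deficit micro-grids first, reducing costly imports from the main grid.
# Greedy first-fit matching; not the paper's optimization formulation.

def match_trades(balances):
    """balances: dict micro-grid name -> net energy in kWh (+surplus / -deficit).
    Returns a list of (seller, buyer, kWh) trades."""
    sellers = [(g, e) for g, e in balances.items() if e > 0]
    buyers = [(g, -e) for g, e in balances.items() if e < 0]
    trades = []
    i = j = 0
    while i < len(sellers) and j < len(buyers):
        seller, avail = sellers[i]
        buyer, need = buyers[j]
        amount = min(avail, need)          # trade as much as both sides allow
        trades.append((seller, buyer, amount))
        avail -= amount
        need -= amount
        sellers[i] = (seller, avail)
        buyers[j] = (buyer, need)
        if avail == 0:
            i += 1                          # seller exhausted
        if need == 0:
            j += 1                          # buyer satisfied
    return trades

trades = match_trades({"MG1": 30.0, "MG2": -20.0, "MG3": -15.0})
# MG1 covers MG2 fully and MG3 partially; the remaining 5 kWh of MG3's
# deficit would be imported from the main grid.
```

A real formulation would add prices, line constraints, and storage, typically as an optimization problem rather than a greedy pass.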
Procedia PDF Downloads 375
2738 Technico-Economical Study of a Rapeseed Based Biorefinery Using High Voltage Electrical Discharges and Ultrasounds as Pretreatment Technologies
Authors: Marwa Brahim, Nicolas Brosse, Nadia Boussetta, Nabil Grimi, Eugene Vorobiev
Abstract:
Rapeseed is an established crop in France, mainly dedicated to oil production. However, the economic potential of residues from this industry (rapeseed hulls, rapeseed cake, rapeseed straw, etc.) has not been fully exploited; currently, only low-grade applications are found in the market. As a consequence, it was deemed of interest to develop a technological platform aiming to convert rapeseed residues into value-added products. Specifically, the focus is on the conversion of rapeseed straw into valuable molecules (e.g., lignin, glucose). Existing pretreatment technologies have many drawbacks, mainly the production of sugar degradation products that limit the effectiveness of the saccharification and fermentation steps in the overall scheme of the lignocellulosic biorefinery. In addition, the viability of fractionation strategies is a challenge in an increasingly regulated environmental context. Hence the need to find cleaner alternatives of comparable efficiency, by exploiting physical phenomena that can destabilize the structural integrity of biomass without necessarily using chemical solvents. To meet increasingly stringent environmental standards, the present work studies new pretreatment strategies involving a lower consumption of chemicals and an attenuation of the severity of the treatment. These strategies consist of coupling physical treatments, either high voltage electrical discharges or ultrasounds, to conventional chemical pretreatments (soda and organosolv). Ultrasound treatment is based on the cavitation phenomenon, and high voltage electrical discharges cause an electrical breakdown accompanied by many secondary phenomena. The choice of process was based on a technological feasibility study taking into account the economic profitability of the whole chain after product valorization.
Priority was given to the valorization of sugars into bioethanol and to the sale of lignin.
Keywords: high voltage electrical discharges, organosolv, pretreatment strategies, rapeseed straw, soda, ultrasounds
Procedia PDF Downloads 362