Search results for: 5G technology
1615 The Impact of Agricultural Product Export on Income and Employment in Thai Economy
Authors: Anucha Wittayakorn-Puripunpinyoo
Abstract:
The research objectives were 1) to study the situation and trend of agricultural product exports of Thailand, 2) to study the impact of agricultural product exports on income in the Thai economy, 3) to study their impact on employment in the Thai economy, and 4) to develop recommendations for Thailand's agricultural product export policy. In this research, secondary data were collected as yearly time series from 1990 to 2016, a span of 27 years, from the Bank of Thailand database. Primary data were collected from stakeholders in Thailand's agricultural product export policy. Data were analyzed with descriptive statistics such as the arithmetic mean and standard deviation. Agricultural product exports were forecast with the Monte Carlo simulation technique as well as time trend analysis. In addition, the impact of agricultural product exports on income and employment was estimated with an econometric model whose parameters were fitted by the ordinary least squares technique. The research results revealed that 1) the agricultural product export value of Thailand from 1990 to 2016 was 338,959.5 million Thai baht, with a yearly growth rate of 4.984 percent; the forecast export value continues to increase, but its growth rate declines; 2) agricultural product exports have a positive impact on income in the Thai economy: a 1 percent increase in exports would increase income by 0.0051 percent; 3) agricultural product exports have a positive impact on employment in the Thai economy: a 1 percent increase in exports would increase employment by 0.079 percent; and 4) future agricultural product export policy should focus on finished or semi-finished agricultural products instead of raw materials, applying technology and innovation to add value to agricultural product exports.
The public agricultural product export policy should support exporters in the private sector in order to encourage them as agricultural exporters in Thailand.
Keywords: agricultural product export, income, employment, Thai economy
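The elasticity estimates above (a 1 percent export increase raising income by 0.0051 percent) come from a log-log model fitted by ordinary least squares. A minimal sketch of that estimation step, using synthetic data in place of the Bank of Thailand series, which is not reproduced here:

```python
import numpy as np

# Synthetic stand-in for 27 yearly observations (1990-2016); the coefficient
# 0.0051 is taken from the abstract, everything else is invented.
rng = np.random.default_rng(0)
n = 27
ln_exports = np.linspace(12.0, 13.3, n) + rng.normal(0, 0.05, n)
true_elasticity = 0.0051
ln_income = 10.0 + true_elasticity * ln_exports + rng.normal(0, 0.001, n)

# OLS on ln(income) = a + b * ln(exports); the slope b is the elasticity
X = np.column_stack([np.ones(n), ln_exports])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, ln_income, rcond=None)

print(f"estimated elasticity: {b_hat:.4f}")
```

With real data, the same fit would be run on the logged export and income series.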
Procedia PDF Downloads 310
1614 A Convolutional Neural Network Approach to Predict Pes-Planus Using Plantar Pressure Mapping Images
Authors: Adel Khorramrouz, Monireh Ahmadi Bani, Ehsan Norouzi, Morvarid Lalenoor
Abstract:
Background: Plantar pressure distribution measurement has long been used to assess foot disorders. Plantar pressure is an important component affecting foot and ankle function, and changes in its distribution can indicate various foot and ankle disorders. Morphologic and mechanical properties of the foot may be important factors affecting the plantar pressure distribution. Accurate and early measurement may help to reduce the prevalence of pes planus. With recent developments in technology, new techniques such as machine learning have been used to assist clinicians in identifying patients with foot disorders. Significance of the study: This study proposes a neural-network-based flat foot classification methodology using static foot pressure distribution. Methodologies: Data were collected from 895 patients who were referred to a foot clinic due to foot disorders. Patients with pes planus were labeled by an experienced physician based on clinical examination. Then all subjects (with and without pes planus) were evaluated for static plantar pressure distribution. Patients who were diagnosed with flat foot in both feet were included in the study. In the next step, the leg length was normalized and the network was trained on the plantar pressure mapping images. Findings: From a total of 895 images, 581 were labeled as pes planus. A convolutional neural network (CNN) was run to evaluate the performance of the proposed model. The prediction accuracy of the basic CNN-based model was measured, and the prediction model was derived through the proposed methodology. In the basic CNN model, the training accuracy was 79.14%, and the test accuracy was 72.09%. Conclusion: This model can be easily and simply used by patients with pes planus and by doctors to predict the classification of pes planus and to prescreen for possible musculoskeletal disorders related to this condition.
However, more models need to be considered and compared for higher accuracy.
Keywords: foot disorder, machine learning, neural network, pes planus
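The CNN described above learns from plantar pressure mapping images through stacked convolution layers. A minimal sketch of the core 2D convolution operation on a toy pressure map; the map and kernel values are invented for illustration and are not the study's data:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Plain 'valid' 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 6x4 "plantar pressure map" (made-up values) and a 3x3 edge-like kernel
pressure = np.array([
    [0, 1, 1, 0],
    [1, 3, 3, 1],
    [1, 4, 4, 1],
    [0, 2, 2, 0],
    [1, 3, 3, 1],
    [0, 1, 1, 0],
], dtype=float)
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)

feature_map = conv2d_valid(pressure, kernel)
print(feature_map.shape)  # (4, 2)
```

A real classifier would stack many such filters with nonlinearities and pooling before a final classification layer.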
Procedia PDF Downloads 364
1613 Size Optimization of Microfluidic Polymerase Chain Reaction Devices Using COMSOL
Authors: Foteini Zagklavara, Peter Jimack, Nikil Kapur, Ozz Querin, Harvey Thompson
Abstract:
The invention and development of Polymerase Chain Reaction (PCR) technology have revolutionised molecular biology and molecular diagnostics. There is an urgent need to optimise the performance of these devices while reducing total construction and operation costs. The present study proposes a CFD-enabled optimisation methodology for continuous-flow (CF) PCR devices with a serpentine-channel structure, which enables the trade-offs between the competing objectives of DNA amplification efficiency and pressure drop to be explored. This is achieved with a surrogate-enabled optimisation approach that accounts for the geometrical features of a CF μPCR device by performing a series of simulations at a relatively small number of Design of Experiments (DoE) points, using COMSOL Multiphysics 5.4. The values of the objectives are extracted from the CFD solutions, and response surfaces are created using polyharmonic splines and neural networks. After creating the respective response surfaces, a genetic algorithm and a multi-level coordinate search optimisation function are used to locate the optimum design parameters. Both optimisation methods produced similar results for both the neural network and the polyharmonic spline response surfaces. The results indicate the possibility of improving the DNA amplification efficiency by ∼2% in one PCR cycle by doubling the width of the microchannel to 400 μm while maintaining the height at the value of the original design (50 μm). Moreover, the increase in the width of the serpentine microchannel is combined with a decrease in its total length in order to obtain the same residence times in all the simulations, resulting in a smaller total substrate volume (a 32.94% decrease). A multi-objective optimisation is also performed with the use of a Pareto front plot.
Such knowledge will enable designers to maximise the amount of DNA amplified or to minimise the time taken throughout thermal cycling in such devices.
Keywords: PCR, optimisation, microfluidics, COMSOL
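The surrogate step described above fits a response surface through objective values sampled at the DoE points, which an optimiser can then query cheaply. A hedged sketch using SciPy's thin-plate radial basis interpolator (thin-plate splines are the k=2 member of the polyharmonic family); the DoE points and objective values below are hypothetical, not taken from the study:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical DoE: (channel width, channel height) in micrometres vs. a
# made-up objective value (e.g. DNA amplification efficiency).
doe = np.array([[200.0, 50.0], [300.0, 50.0], [400.0, 50.0],
                [200.0, 75.0], [300.0, 75.0], [400.0, 75.0]])
obj = np.array([0.90, 0.93, 0.95, 0.88, 0.91, 0.92])

surrogate = RBFInterpolator(doe, obj, kernel="thin_plate_spline")

# A true interpolant reproduces the sampled objective values exactly...
recovered = surrogate(doe)
# ...and can then be evaluated cheaply at unsampled designs by an optimiser
guess = surrogate(np.array([[350.0, 60.0]]))
print(float(guess[0]))
```

In the study, each `obj` value would come from a full COMSOL CFD run, which is exactly what the surrogate avoids repeating inside the optimisation loop.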
Procedia PDF Downloads 162
1612 Semantic Search Engine Based on Query Expansion with Google Ranking and Similarity Measures
Authors: Ahmad Shahin, Fadi Chakik, Walid Moudani
Abstract:
Our study elaborates a potential solution for a search engine that involves semantic technology to retrieve information and display it meaningfully. Semantic search engines are not used widely over the web, as the majority are still in beta stage or under construction. Current applications in semantic search face many problems; the major one is analyzing and calculating the meaning of a query in order to retrieve relevant information. Another is the ontology-based index and its updates. Ranking results according to concept meaning and its relation to the query is yet another challenge. In this paper, we offer a light meta-engine (QESM) which uses Google search, and therefore Google's index, with some adaptations to its returned results by adding multi-query expansion. The mission was to find a reliable ranking algorithm that involves semantics and uses concepts and meanings to rank results. First, the engine finds synonyms of each query term entered by the user, based on a lexical database. Then, query expansion is applied to generate different semantically analogous sentences. These are generated randomly by combining the found synonyms and the original query terms. Our model suggests the use of semantic similarity measures between two sentences. Practically, we used this method to calculate the semantic similarity between each query and the description of each page's content generated by Google. The generated sentences are sent to the Google engine one by one, and the results are ranked again, all together, with the adapted ranking method (QESM). Finally, our system places Google pages with higher similarities at the top of the results. We conducted experiments with 6 different queries and observed that the QESM ranking altered the order of Google's originally generated pages in most cases. On our experimental queries, QESM frequently achieves better accuracy than Google.
In the worst cases, it behaves like Google.
Keywords: semantic search engine, Google indexing, query expansion, similarity measures
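The re-ranking step described above hinges on a sentence-to-sentence similarity score between the (expanded) query and each page description. A minimal stand-in using cosine similarity over term-frequency vectors; the real QESM measure is semantic rather than purely lexical, and the page snippets below are invented:

```python
from collections import Counter
import math

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity of term-frequency vectors -- a lexical stand-in
    for the semantic sentence-similarity measure QESM applies."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical query and page descriptions (snippets are made up)
query = "semantic search engine ranking"
pages = {
    "pageA": "a search engine that ranks results using semantic similarity",
    "pageB": "cooking recipes for the weekend",
}

# Re-rank pages by their similarity to the query, highest first
reranked = sorted(pages, key=lambda p: cosine_similarity(query, pages[p]),
                  reverse=True)
print(reranked)  # ['pageA', 'pageB']
```

In QESM the same re-ranking would be applied to the descriptions Google returns, scored against each expanded query sentence.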
Procedia PDF Downloads 426
1611 Mobile Application Interventions in Positive Psychology: Current Status and Recommendations for Effective App Design
Authors: Gus Salazar, Jeremy Bekker, Lauren Linford, Jared Warren
Abstract:
Positive psychology practices allow for its principles to be applied to all people, regardless of their current level of functioning. To increase the dissemination of these practices, interventions are being adapted for use with digital technology, such as mobile apps. However, the research regarding positive psychology mobile app interventions is still in its infancy. In an effort to facilitate progress in this important area, we 1) conducted a qualitative review to summarize the current state of the positive psychology mobile app literature and 2) developed research-supported recommendations for positive psychology app development to maximize behavior change. In our literature review, we found that while positive psychology apps varied widely in content and purpose, there was a near-complete lack of research supporting their effectiveness. Most apps provided no rationale for the behavioral change techniques (BCTs) they employed in their app, and most did not develop their app with specific theoretical frameworks or design models in mind. Given this problem, we recommended four steps for effective positive psychology app design. First, developers must establish their app in a research-supported theory of change. Second, researchers must select appropriate behavioral change techniques which are consistent with their app’s goals. Third, researchers must leverage effective design principles. These steps will help mobile applications use data-driven methods for encouraging behavior change in their users. Lastly, we discuss directions for future research. In particular, researchers must investigate the effectiveness of various BCTs in positive psychology interventions. Although there is some research on this point, we do not yet clearly understand the mechanisms within the apps that lead to behavior change. Additionally, app developers must also provide data on the effectiveness of their mobile apps. 
As developers follow these steps for effective app development and as researchers continue to investigate what makes these apps most effective, we will provide millions of people in need with access to research-based mental health resources.
Keywords: behavioral change techniques, mobile app, mobile intervention, positive psychology
Procedia PDF Downloads 227
1610 Impact of Combined Heat and Power (CHP) Generation Technology on Distribution Network Development
Authors: Sreto Boljevic
Abstract:
In the absence of considerable investment in electricity generation, transmission and distribution network (DN) capacity, the demand for electrical energy will quickly strain the capacity of the existing electrical power network. With the anticipated growth and proliferation of electric vehicles (EVs) and heat pumps (HPs), it is likely that the additional load from EV charging and HP operation will require capital investment in the DN. While an area-wide implementation of EVs and HPs will contribute to the decarbonization of the energy system, they represent new challenges for the existing low-voltage (LV) network. Distributed energy resources (DER), operating both as part of the DN and in off-network mode, have been offered as a means to meet growing electricity demand while maintaining and ever-improving DN reliability, resiliency and power quality. DN planning has traditionally been done by forecasting future growth in demand and estimating the peak load that the network should meet. However, new problems are arising. These problems are associated with a high degree of proliferation of EVs and HPs as loads imposed on the DN, in addition to the promotion of electricity generation from renewable energy sources (RES). High distributed generation (DG) penetration and a large increase in load at low-voltage DNs may have numerous impacts that create issues including energy losses, voltage control, fault levels, reliability, resiliency and power quality. To mitigate the negative impacts and at the same time enhance the positive ones in the new operational state of the DN, CHP system integration can be seen as the best action to postpone or reduce the capital investment needed to facilitate the promotion and maximize the benefits of EV, HP and RES integration in the low-voltage DN. The aim of this paper is to generate an algorithm by using an analytical approach.
Algorithm implementation will provide a way for optimal placement of the CHP system in the DN in order to maximize the integration of RES and support the growing proliferation of EVs and HPs.
Keywords: combined heat & power (CHP), distribution networks, EVs, HPs, RES
Procedia PDF Downloads 203
1609 The Impact of Oxytetracycline on the Aquaponic System, Biofilter, and Plants
Authors: Hassan Alhoujeiri, Angele Matrat, Sandra Beaufort, Claire joaniss Cassan, Jerome Silvester
Abstract:
Aquaponics is a sustainable food production technology, and its transition to industrial-scale systems has created several challenges that require further investigation in order to make it a robust process. One of the critical concerns is the potential accumulation of compounds from veterinary treatments, phytosanitary agents, fish feed, or simply from contaminated water sources. The accumulation of these substances could negatively impact fish health, microbial biofilters, and plant growth, thereby disrupting the system's overall balance and functionality. The lack of legislation and knowledge regarding the presence of such compounds in aquaponic systems raises concerns about their potential impact on both system balance and food safety. In this study, we focused on the effects of oxytetracycline (OTC), an antibiotic commonly used in aquaculture, on both the microbial biofilter and plant growth. Although OTC is rarely applied in aquaponics today, the fish compartment may need to be isolated from the system during treatment, as OTC inhibits specific bacterial populations, which could affect the microbial biofilter's efficiency. However, questions remain about the aquaponic system's tolerance threshold, particularly in cases of treatment or residual OTC traces post-treatment. The results of this study indicated a decline in microbial biofilter activity to 20% of the control, potentially corresponding to treatments of 41 mg/L of OTC. Analysis of microbial populations in the biofilter, using flow cytometry and microscopy (confocal and scanning electron microscopy), revealed an increase in bacterial mortality without disruption of the microbial biofilm. Additionally, OTC exposure led to noticeable changes in plant morphology (e.g., color) and growth, though it did not fully inhibit development.
However, no significant effects were observed on seed germination at the tested concentrations, despite a measurable impact on subsequent plant growth.
Keywords: aquaponic, oxytetracycline, nitrifying biofilter, plant, micropollutants, sustainability
Procedia PDF Downloads 24
1608 Cooperative Agents to Prevent and Mitigate Distributed Denial of Service Attacks of Internet of Things Devices in Transportation Systems
Authors: Borhan Marzougui
Abstract:
Road and Transport Authority (RTA) is moving ahead with the implementation of the leader's vision, exploring all avenues that may bring better security and safety services to the community. Smart transport means using smart technologies such as the IoT (Internet of Things). This technology continues to affirm its important role in the context of information and transportation systems. In fact, the IoT is a network of Internet-connected objects able to collect and exchange data using embedded sensors. With the growth of the IoT, Distributed Denial of Service (DDoS) attacks are also growing exponentially. DDoS attacks are a major and real threat to various transportation services. Current defense mechanisms are mainly passive in nature, and there is a need to develop a smart technique to handle them. In fact, new IoT devices are being recruited into botnets that DDoS attackers accumulate for their purposes. The aim of this paper is to provide a relevant understanding of the dangerous types of DDoS attack related to the IoT and to provide valuable guidance for future IoT security methods. Our methodology is based on the development of a distributed algorithm. This algorithm employs dedicated intelligent and cooperative agents to prevent and mitigate DDoS attacks. The proposed technique ensures a preventive action when malicious packets start to be distributed through the connected nodes (the network of IoT devices). In addition, devices such as cameras and radio frequency identification (RFID) readers are connected within the secured network, and the data they generate are analyzed in real time by the intelligent and cooperative agents. The proposed security system is based on a multi-agent system. The obtained results show a significant reduction in the number of infected devices and enhanced capabilities of the different security devices.
Keywords: IoT, DDoS, attacks, botnet, security, agents
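The preventive action described above, where agents react as soon as malicious packets start to flow from a node, can be illustrated with a much-simplified single agent that flags sources exceeding a packet-rate threshold. This is a hypothetical sketch, not the paper's algorithm, whose agents cooperate and are far richer:

```python
from collections import deque

class RateLimitAgent:
    """Toy version of one agent: flag a source as suspicious when its packet
    rate within a sliding time window exceeds a threshold."""
    def __init__(self, window_s: float = 1.0, max_packets: int = 100):
        self.window_s = window_s
        self.max_packets = max_packets
        self.seen: dict[str, deque] = {}

    def observe(self, src: str, t: float) -> bool:
        """Record one packet from `src` at time `t`; True means block `src`."""
        q = self.seen.setdefault(src, deque())
        q.append(t)
        while q and t - q[0] > self.window_s:  # drop packets outside the window
            q.popleft()
        return len(q) > self.max_packets

agent = RateLimitAgent(window_s=1.0, max_packets=5)
# A normal device sends a few packets per second; a bot floods 20 in half a second
normal_flagged = any(agent.observe("camera-1", t) for t in [0.0, 0.4, 0.8])
bot_flagged = any(agent.observe("bot-7", i * 0.025) for i in range(20))
print(normal_flagged, bot_flagged)  # False True
```

In the paper's multi-agent setting, such verdicts would be shared among cooperating agents rather than decided by one node in isolation.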
Procedia PDF Downloads 145
1607 A Comprehensive Comparative Study on Seasonal Variation of Parameters Involved in Site Characterization and Site Response Analysis by Using Microtremor Data
Authors: Yehya Rasool, Mohit Agrawal
Abstract:
Site characterization and site response analysis are crucial steps for reliable seismic microzonation of an area, so the basic parameters involved in these fundamental steps must be chosen properly in order to efficiently characterize the vulnerable sites of the study region. In this study, efforts are made to delineate the variations in the physical parameters of the soil between the summer and monsoon seasons of the year 2021 by using Horizontal-to-Vertical Spectral Ratios (HVSRs) recorded at five sites of the Indian Institute of Technology (Indian School of Mines), Dhanbad, Jharkhand, India. The recording at each site was done in such a way that little anthropogenic noise was captured. The analysis has been carried out for six seismic parameters, namely the predominant frequency, the H/V ratio, the phase velocity of Rayleigh waves, shear wave velocity (Vs), compressional wave velocity (Vp), and Poisson's ratio, for both seasons of the year. From the results, it is observed that these parameters vary drastically for the upper layers of soil, which in turn may affect the amplification ratios and probabilities of exceedance obtained from seismic hazard studies. The HVSR peak is higher in the monsoon, with a shift in predominant frequency compared to the summer season of 2021. A drastic reduction in shear wave velocity (up to ~10 m) of approximately 7%-15% is also observed during the monsoon period, with a slight decrease in compressional wave velocity. Poisson's ratios are generally found to be higher during the monsoon than in the summer period. Our study may be very beneficial to various agricultural and geotechnical engineering projects.
Keywords: HVSR, shear wave velocity profile, Poisson ratio, microtremor data
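The predominant frequency and H/V ratio mentioned above are read off the HVSR curve. A hedged sketch of the basic computation on a synthetic three-component record; real HVSR processing adds windowing, spectral smoothing and window averaging, and the planted 2 Hz "site frequency" below is artificial:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0                    # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)  # 60 s of synthetic ambient noise
f0 = 2.0                      # planted site frequency on the horizontal components

noise = lambda: rng.normal(0, 1, t.size)
ns = noise() + 3 * np.sin(2 * np.pi * f0 * t)  # north-south component
ew = noise() + 3 * np.sin(2 * np.pi * f0 * t)  # east-west component
ud = noise()                                   # vertical component (no resonance)

freqs = np.fft.rfftfreq(t.size, 1 / fs)
amp = lambda x: np.abs(np.fft.rfft(x))

# H/V: quadratic-mean horizontal amplitude spectrum over the vertical spectrum
h = np.sqrt((amp(ns) ** 2 + amp(ew) ** 2) / 2)
hv = h / np.maximum(amp(ud), 1e-12)

band = (freqs > 0.5) & (freqs < 20)           # search band for the peak
f_peak = freqs[band][np.argmax(hv[band])]
print(f"predominant frequency ≈ {f_peak:.2f} Hz")
```

The seasonal comparison in the study amounts to repeating this estimate on summer and monsoon records and tracking how the peak frequency and amplitude shift.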
Procedia PDF Downloads 91
1606 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution
Authors: Masomeh Jamshid Nejad
Abstract:
Nowadays, statistical literacy is no longer merely a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with data following a normal distribution. Since technology is interconnected with education these days, it is important to teach statistics topics to undergraduate students in the context of Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared in this study. One group underwent Excel-based instruction, and the other relied only on traditional teaching methods. We analyzed the experiential data and the BBA participants’ responses to statistics-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports learners in comprehending statistical concepts more effectively than teaching with the traditional method.
In addition, students receiving Excel-based instruction showed greater ability in visualizing and interpreting data concentrated on the normal distribution.
Keywords: statistics, Excel-based instruction, data visualization, pedagogy
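The key attributes the study focuses on, the mean, the standard deviation, and the behaviour of normally distributed data, can be checked numerically. A small sketch in Python; in Excel the same check would use AVERAGE, STDEV.S and NORM.DIST, and the sample parameters below are arbitrary:

```python
import math
import random

random.seed(42)
mu, sigma = 100.0, 15.0
sample = [random.gauss(mu, sigma) for _ in range(100_000)]

# Sample mean and sample standard deviation (n - 1 denominator, like STDEV.S)
mean = sum(sample) / len(sample)
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (len(sample) - 1))

# Empirical "68% rule": share of observations within one sigma of the mean
within_1sd = sum(mu - sigma <= x <= mu + sigma for x in sample) / len(sample)

print(round(mean, 1), round(sd, 1), round(within_1sd, 3))  # roughly 100, 15, 0.68
```

Having students reproduce this check in Excel with a histogram and NORM.DIST overlay is exactly the kind of visualization the abstract credits for the improved comprehension.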
Procedia PDF Downloads 55
1605 Dairy Value Chain: Assessing the Inter Linkage of Dairy Farm and Small-Scale Dairy Processing in Tigray: Case Study of Mekelle City
Authors: Weldeabrha Kiros Kidanemaryam, DepaTesfay Kelali Gidey, Yikaalo Welu Kidanemariam
Abstract:
Dairy services are considered sources of income, employment, nutrition and health for smallholder rural and urban farmers. The main objective of this study is to assess the interlinkage of dairy farms and small-scale dairy processing in Mekelle, Tigray. To achieve the stated objective, a descriptive research approach was employed, with data collected from 45 dairy farmers and 40 small-scale processors and analyzed by calculating mean values and percentages. Findings show that the dairy business in the study area is characterized by a shortage of feed and water for the farms. The dairy farms are dominated by breeds of hybrid type, followed by the so-called ‘Begait’. Though the farms have access to medication and vaccination for the cattle, they fall short on hygiene practices, reliable shade for the cattle and separate space for the calves. The value chain at the milk production stage is characterized by a low production rate, the selling of raw milk without adding value and very meager traditional processing practices. Furthermore, small-scale milk processors are characterized by collecting milk from farmers and producing cheese, butter, ghee and sour milk. They do not engage in modern milk processing such as pasteurized milk, yogurt and table butter; most are engaged in traditional production systems. Additionally, the milk consumption and marketing part of the chain is dominated by the informal market (channel), where market problems, lack of skill and technology, shortage of loans and weak policy support are the main challenges. Based on the findings, recommendations and future research areas are forwarded.
Keywords: value-chain, dairy, milk production, milk processing
Procedia PDF Downloads 37
1604 Study of Evaluation Model Based on Information System Success Model and Flow Theory Using Web-scale Discovery System
Authors: June-Jei Kuo, Yi-Chuan Hsieh
Abstract:
Because of the rapid growth of information technology, more and more libraries are introducing new information retrieval systems to enhance the user experience, improve retrieval efficiency, and increase the applicability of library resources. Nevertheless, few of these systems have been examined for usability from the users’ perspective. The aims of this study are to understand the scenario of information retrieval system utilization and to learn why users are willing to continue using the web-scale discovery system, in order to improve the system and promote the use of university libraries. Besides questionnaires, observations and interviews, this study employs both the Information System Success Model introduced by DeLone and McLean in 2003 and flow theory to evaluate the system quality, information quality, service quality, use, user satisfaction, flow, and continued use of the web-scale discovery system by students of National Chung Hsing University. The results are analyzed through descriptive statistics and structural equation modeling using AMOS. The results reveal that users’ evaluations of system quality and information quality are positively related to use and satisfaction, whereas service quality affects only user satisfaction. User satisfaction and flow show a significant impact on continued use. Moreover, user satisfaction has a significant impact on user flow. According to the results of this study, academic libraries are recommended to maintain the stability of the information retrieval system, improve the quality of the information content, and strengthen the relationship between subject librarians and students.
Meanwhile, the system provider is required to improve the system user interface, minimize the number of layers at the system level, strengthen data accuracy and relevance, modify the sorting criteria of the data, and support an auto-correct function. Finally, establishing better communication with librarians is recommended for all users.
Keywords: web-scale discovery system, discovery system, information system success model, flow theory, academic library
Procedia PDF Downloads 104
1603 A Review of Current Research and Future Directions on Foodborne Illness and Food Safety: Understanding the Risks and Mitigation Strategies
Authors: Tuji Jemal Ahmed
Abstract:
This paper provides a comprehensive review of current research on foodborne illness and food safety, including the risks associated with foodborne illnesses, the latest research on food safety, and the mitigation strategies used to prevent and control foodborne illnesses. Foodborne illness is a major public health concern that affects millions of people every year. As foodborne illnesses have grown more common and dangerous in recent years, it is vital that we research and build upon methods to ensure food remains safe through consumption. Additionally, this paper discusses future directions for food safety research, including emerging technologies, changes in regulations and standards, and collaborative efforts to improve food safety. The first section of the paper provides an overview of the risks of foodborne illness, including a definition of foodborne illness, its causes, the types of foodborne illnesses, high-risk foods, and the health consequences of foodborne illness. The second section focuses on current research on food safety, including the role of regulatory agencies, food safety standards and guidelines, emerging food safety concerns, and advances in food safety technology. The third section explores mitigation strategies for foodborne illness, including preventative measures, hazard analysis and critical control points (HACCP), good manufacturing practices (GMPs), and training and education. Finally, the paper examines future directions for food safety research, including hurdle technologies and their impact on food safety, changes in food safety regulations and standards, collaborative efforts to improve food safety, and research gaps and areas for further exploration. In general, this work provides a comprehensive review of current research and future directions in food safety and an understanding of the risks associated with foodborne illness.
The implications of the assessment for food safety and public health are discussed, and recommendations are offered for research scholars.
Keywords: food safety, foodborne illness, technologies, mitigation
Procedia PDF Downloads 109
1602 Analyzing the Performance of Different Cost-Based Methods for the Corrective Maintenance of a System in Thermal Power Plants
Authors: Demet Ozgur-Unluakin, Busenur Turkali, S. Caglar Aksezer
Abstract:
Since the age of industrialization, maintenance has always been a crucial element for all kinds of factories and plants. With today’s increasingly developing technology, the system structure of such facilities has become more complicated, and even a small operational disruption may cause huge losses in profits for the companies. In order to reduce these costs, effective maintenance planning is crucial, but at the same time it is a difficult task because of the complexity of systems. The most important aspect of correct maintenance planning is to understand the structure of the system, not to ignore the dependencies among the components, and, as a result, to model the system correctly. In this way, it is easier to understand which component improves the system most when it is maintained. Undoubtedly, proactive maintenance at a scheduled time reduces costs because scheduled maintenance prevents large losses in profits. But the necessity of corrective maintenance, which directly affects the state of the system and provides direct intervention when the system fails, should not be ignored. When a fault occurs in the system, if the problem is not solved immediately and the scheduled proactive maintenance time is awaited, costs may increase. This study proposes various maintenance methods with different efficiency measures under a corrective maintenance strategy on a subsystem of a thermal power plant. To model the dependencies between the components, a dynamic Bayesian network approach is employed. The proposed maintenance methods aim to minimize the total maintenance cost over a planning horizon, as well as to find the most appropriate component to act on, the one that improves system reliability the most. The performances of the methods are compared under the corrective maintenance strategy. Furthermore, a sensitivity analysis is also applied under different cost values.
Results show that all fault-effect methods perform better than the replacement-effect methods, and this conclusion is also valid under different downtime cost values.
Keywords: dynamic Bayesian networks, maintenance, multi-component systems, reliability
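The goal stated above, finding the component whose maintenance improves system reliability the most, can be illustrated with a deliberately simplified example. The sketch below assumes a series system of independent components with made-up reliabilities; the paper itself captures component dependencies with dynamic Bayesian networks, which this toy model does not:

```python
# Hypothetical component reliabilities for a thermal-plant subsystem
reliabilities = {"pump": 0.95, "valve": 0.80, "boiler_tube": 0.90}

def system_reliability(rels):
    """Series system: the system works only if every component works."""
    p = 1.0
    for r in rels.values():
        p *= r
    return p

def gain_if_restored(name, rels):
    """Reliability gain from restoring one component to as-good-as-new (1.0)."""
    restored = dict(rels, **{name: 1.0})
    return system_reliability(restored) - system_reliability(rels)

best = max(reliabilities, key=lambda c: gain_if_restored(c, reliabilities))
print(best)  # valve -- the least reliable component of a series system
```

With dependencies in play, the same question is answered by inference in the dynamic Bayesian network rather than by this independent-series shortcut.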
Procedia PDF Downloads 130
1601 Comparison of Fuel Properties from Species of Microalgae and Selected Second-Generation Oil Feedstocks
Authors: Andrew C. Eloka Eboka, Freddie L. Inambao
Abstract:
Comparative investigation and assessment of microalgal technology as a biodiesel production option was carried out alongside other second-generation feedstocks, by comparing the fuel properties of species of Chlorella vulgaris, Dunaliella spp., Synechococcus spp. and Scenedesmus spp. with the feedstocks Jatropha (ex-basirika variety), Hura crepitans, rubber and Natal mahogany seed oils. The micro-algae were cultivated in an open pond and in a photobioreactor (New Brunswick BioFlo/CelliGen model BF-115 set-up, made in the US) with operating parameters: 14 L capacity, working volume of 7.5 L of media including 10% inoculum, optical density of 3.144 at 540 nm and light intensity of 200 lux, for 23 and 16 days respectively. The biomasses produced were harvested by draining, flocculation, centrifugation and drying, and then subjected to lipid extraction. The oils extracted from the algae and the other feedstocks were characterised and used to produce biodiesel fuels by the transesterification method, using a modified optimization protocol. The final biodiesel products were evaluated for chemo-physical and fuel properties. Results revealed Chlorella vulgaris as the best strain for biomass cultivation, having the highest lipid productivity (5.2 mg L-1 h-1) and the highest rate of CO2 absorption (17.85 mg L-1 min-1); the average carbon sequestration in the form of CO2 was 76.6%. The highest biomass productivity was 35.1 mg L-1 h-1 (Chlorella), while Scenedesmus had the lowest output (3.75 mg L-1 h-1, 11.73 mg L-1 min-1). All species adapted well to pH values ranging from 6.5 to 8.5. The fuel properties of the micro-algal biodiesel, in comparison with Jatropha, rubber, Hura and Natal mahogany, were within ASTM specifications, with AGO used as the control.
Fuel cultivation from microalgae is feasible and will revolutionise the biodiesel industry.
Keywords: biodiesel, fuel properties, microalgae, second generation, seed oils, feedstock, photo-bioreactor, open pond
Procedia PDF Downloads 363
1600 Brain Connectome of Glia, Axons, and Neurons: Cognitive Model of Analogy
Authors: Ozgu Hafizoglu
Abstract:
An analogy is an essential tool of human cognition that connects diffuse and diverse systems through physical, behavioral, and principal relations that are essential to learning, discovery, and innovation. The Cognitive Model of Analogy (CMA) leads and creates patterns of pathways to transfer information within and between domains in science, just as happens in the brain. The connectome of the brain shows how the brain operates with mental leaps between domains and mental hops within domains, and how the analogical reasoning mechanism operates. This paper presents the CMA as an evolutionary approach to science, technology, and life. The model addresses the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system so that the reasoning methodology can adapt to changing conditions in the new era, especially post-pandemic. We show how to draw an analogy to scientific research to discover new systems that reveal the fractal schema of analogical reasoning within and between systems, much as within and between brain regions. The problem-solving process is divided into distinct phases: stimulus, encoding, mapping, inference, and response. Based on brain research to date, the system is shown to be relevant to brain activation in each of these phases, with an emphasis on better visualizing the brain's mechanism in the macro context (brain and spinal cord) and the micro context (glia and neurons), relative to the matching conditions of analogical reasoning and relational information, the encoding, mapping, inference, and response processes, and the verification of perceptual responses in four-term analogical reasoning.
Finally, we relate all these terminologies to mental leaps, mental maps, mental hops, and mental loops to make the mental model of CMA clear.
Keywords: analogy, analogical reasoning, brain connectome, cognitive model, neurons and glia, mental leaps, mental hops, mental loops
Procedia PDF Downloads 165
1599 Fault Prognostic and Prediction Based on the Importance Degree of Test Point
Authors: Junfeng Yan, Wenkui Hou
Abstract:
Prognostics and Health Management (PHM) is a technology to monitor equipment status and predict impending faults. It is used to predict potential faults and to provide fault information and track trends of system degradation by capturing characteristic signals, so how these characteristic signals are detected is very important. The selection of test points plays a central role in detecting characteristic signals. Traditionally, a dependency model is used to select the test points containing the most detection information. However, for large, complicated systems, the dependency model is sometimes not easy to build, and the greater difficulty is computing the associated matrix. On this premise, the paper provides a highly effective method to select test points without a dependency model, based on the signal flow model: a diagnosis model built on failure modes, which focuses on the system's failure modes and the dependency relationships between test points and faults. In the signal flow model, fault information can flow from the beginning to the end. According to the signal flow model, we can find the location and structure information of every test point and module. We break the signal flow model up into serial and parallel parts to obtain the final relationship function between the system's testability or prediction metrics and the test points. Further, through partial derivative operations, we can obtain every test point's importance degree in determining the testability metrics, such as the undetected rate, false alarm rate, and untrusted rate. This makes it possible to install test points according to the real requirements and also provides a solid foundation for Prognostics and Health Management. Judging by its effect in practical engineering applications, the method is very efficient.
Keywords: false alarm rate, importance degree, signal flow model, undetected rate, untrusted rate
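The partial-derivative step can be illustrated with an assumed toy metric (not the paper's exact formulas): if a fault goes undetected only when every covering test point misses it, the undetected rate is the product of the miss probabilities, and a test point's importance degree is the metric's sensitivity to that point's detection probability:

```python
# Hypothetical toy metric: undetected rate = product of miss probabilities
# over the test points covering a fault. The importance degree of test point i
# is approximated by a central finite difference of the metric w.r.t. d_i.

def undetected_rate(detect_probs):
    rate = 1.0
    for d in detect_probs:
        rate *= (1.0 - d)
    return rate

def importance_degree(metric, detect_probs, i, h=1e-6):
    up = list(detect_probs); up[i] += h
    dn = list(detect_probs); dn[i] -= h
    return (metric(up) - metric(dn)) / (2.0 * h)

d = [0.9, 0.6, 0.7]  # detection probabilities of three test points
imps = [importance_degree(undetected_rate, d, i) for i in range(len(d))]
# Analytically d(rate)/d(d_i) = -prod_{j != i} (1 - d_j), so the magnitude is
# largest where the *other* points are weakest: here, test point 0.
```

The same finite-difference recipe applies to any differentiable testability metric once the serial/parallel relationship function has been written down.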
Procedia PDF Downloads 379
1598 A Case Study of Rainfall Derived Inflow/Infiltration in a Separate Sewer System in Gwangju, Korea
Authors: Bumjo Kim, Hyun Jin Kim, Joon Ha Kim
Abstract:
A separate sewer system collects wastewater in a sewer pipe and rainfall in a stormwater pipe; the sewage is then treated in the wastewater treatment plant, while the stormwater is discharged to rivers or lakes through stormwater drainage pipes. Unfortunately, even for separate sewer systems, it is not possible to completely prevent Rainfall Driven Inflow/Infiltration (RDII) into the sewer pipe. Even if the sewer line is renovated, some RDII is unavoidable due to combined sewer connections within houses or the difficulty of sewer maintenance in private areas. A basic statistical analysis was performed using environmental data including rainfall, sewage, water quality, and groundwater level in the district of Gwangju in South Korea. During rainfall in the target area, RDII showed an increase of 13.4 ~ 53.0% compared to that of a dry day and showed a rapid hydrograph response of 0.3 ~ 3.0 hr. Water quality analysis showed that during rainfall, the BOD5 concentration decreased by 17.3% and the salinity concentration decreased by 8.8% at the representative spot in the project area compared to a dry day. The seasonal fluctuation range of groundwater in the Gwangju area is 0.38 m ~ 0.55 m and the monthly fluctuation range is 0.58 m ~ 0.78 m, while the difference between the groundwater level and the depth at which the sewer pipe is laid is 2.70 m on average, larger than either fluctuation range. Overall, it can be concluded that the increase in flow rate in the sewer line is not due to infiltration water caused by groundwater level rise, construction failure, or cracking due to joint failure or conduit deterioration; rather, rainfall flowed directly and rapidly into the sewer line. Acknowledgements: This work was supported by the 'Climate Technology Development and Application' research project (#K07731) through a grant provided by GIST in 2017.
Keywords: ground water, rainfall, rainfall driven inflow/infiltration, separate sewer system
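The two headline quantities, the RDII increase rate relative to dry-weather flow and the lag of the hydrograph response, can be sketched with hypothetical numbers (not the study's data):

```python
# Hypothetical worked example of the two reported quantities. Flows are in
# arbitrary units per hour; the threshold factor for "response" is assumed.

def rdii_increase_rate(dry_flow, wet_flow):
    """Percent increase of wet-weather sewer flow over the dry-weather baseline."""
    return (wet_flow - dry_flow) / dry_flow * 100.0

def response_lag_hours(flows, baseline, rain_start, factor=1.1):
    """Hours from rainfall onset to the first reading above baseline * factor."""
    for t in range(rain_start, len(flows)):
        if flows[t] > baseline * factor:
            return t - rain_start
    return None

# Invented hourly flow series; rain begins at hour index 2.
flows = [100.0, 100.0, 102.0, 125.0, 140.0, 120.0, 105.0]
lag = response_lag_hours(flows, baseline=100.0, rain_start=2)
increase = rdii_increase_rate(100.0, 140.0)
```

With these placeholder values the flow peaks 40% above baseline with a one-hour response lag, i.e., the same kind of figures the abstract reports (13.4 ~ 53.0% and 0.3 ~ 3.0 hr).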
Procedia PDF Downloads 161
1597 The Importance of Previous Examination Results in Future Differential Diagnostic Procedures, Especially in the Era of Covid-19
Authors: Angelis P. Barlampas
Abstract:
Purpose or Learning Objective: It is well known that previous examinations play a major role in future diagnosis, avoiding unnecessary new exams that cost time and money both for the patient and for the health system. A case is presented in which the patient's past results, in combination with the fewest necessary new tests, gave an easy final diagnosis. Methods or Background: A middle-aged man visited the emergency department complaining of hard-to-control, persistent fever over the last few days. Laboratory tests showed an elevated number of white blood cells with a neutrophil shift and abnormal CRP. The patient had been admitted to hospital a month earlier for continuing lung symptomatology after a recent covid-19 infection. Results or Findings: Computed tomography scanning showed a solid mass with spiculated margins in the right lower lobe. After intravenous iodine contrast administration, there was mild peripheral enhancement and an eccentric non-enhancing area. Lung cancer was suspected. Comparison with the patient's latest computed tomography revealed no mass in the area of interest, only signs of recent post-covid-19 lung parenchyma abnormalities. A new mass that appears within a month's time span cannot be a cancer but rather a benign lesion, and it was obvious that an abscess was the most suitable explanation. The patient was admitted to hospital and antibiotic therapy was given, with very good results. After a few days, the patient was afebrile and in good condition. Conclusion: In this case, a PET scan or a biopsy was avoided thanks to the patient's medical history and the availability of previous examinations. It is worth encouraging patients to keep their medical records, and organizing the health system more efficiently with current technology for archiving medical examinations.
Keywords: covid-19, chest ct, cancer, abscess, fever
Procedia PDF Downloads 60
1596 A Framework for Building Information Modelling Execution Plan in the Construction Industry, Lagos State, Nigeria
Authors: Tosin Deborah Akanbi
Abstract:
The Building Information Modeling Execution Plan (BEP) is a document that sets out the specifications for the adoption and execution of building information modeling in the construction sector in an organized manner, so as to attain the listed goals. In this regard, the study examined the barriers to the adoption of building information modeling, evaluated the effect of building information modeling adoption characteristics on the key elements of a building information modeling execution plan, and developed a strategic framework for a BEP in the Lagos State construction industry. Data were gathered through a questionnaire survey of 332 construction professionals in the study area. Three online structured interviews were conducted to support and validate the findings of the quantitative analysis. The results showed significant relationships and connections between the variables in the framework: BIM usage and model quality control (aBIMskill -> dMQ, Beta = 0.121, T statistic = 1.829), BIM adoption characteristics and information exchange (bBIM_CH -> dIE, Beta = 0.128, T statistic = 1.727), BIM adoption characteristics and process design (bBIM_CH -> dPD, Beta = 0.170, T statistic = 2.754), BIM adoption characteristics and roles and responsibilities (bBIM_CH -> dRR, Beta = 0.131, T statistic = 2.181), interest BIM barriers and BIM adoption characteristics (cBBIM_INT -> bBIM_CH, Beta = 0.137, T statistic = 2.309), legal BIM barriers and BIM adoption characteristics (cBBIM_LEG -> bBIM_CH, Beta = 0.168, T statistic = 2.818), and professional BIM barriers and BIM adoption characteristics (cBBIM_PRO -> bBIM_CH, Beta = 0.152, T statistic = 2.645).
The results also revealed seven final themes: model structure and process design, BIM information exchange and collaboration procedures, project goals and deliverables, project model quality control, roles and responsibilities, reflections on the Lagos State construction industry, and validity of the BEP framework. Thus, policy makers need to direct interventions to promote, encourage and support the understanding and adoption of BIM by emphasizing the various benefits of using the technology in the Lagos State construction industry.
Keywords: building information modelling execution plan, BIM adoption characteristics, BEP framework, construction industry
Procedia PDF Downloads 20
1595 Transmission Line Protection Challenges under High Penetration of Renewable Energy Sources and Proposed Solutions: A Review
Authors: Melake Kuflom
Abstract:
European power networks use multiple overhead transmission lines to construct a highly duplicated system that delivers reliable and stable electrical energy to the distribution level. The transmission line protection applied in the existing GB transmission network normally consists of independent unit differential and time-stepped distance protection schemes, referred to as main-1 and main-2 respectively, with overcurrent protection as a backup. The increasing penetration of renewable energy sources, commonly referred to as "weak sources", into the power network has resulted in a decline of the fault level. Traditionally, the fault level of the GB transmission network has been strong, hence the fault current contribution is more than sufficient to ensure the correct operation of the protection schemes. However, numerous conventional coal and nuclear generators have been, or are about to be, shut down due to the societal requirement for CO2 emission reduction, and this has resulted in a reduction in the fault level on some transmission lines; an adaptive transmission line protection is therefore required. Generally, greater utilization of renewable energy sources generated from wind or direct solar energy reduces CO2 emissions and can increase system security and reliability, but it reduces the fault level, which has an adverse effect on protection. Consequently, the effectiveness of conventional protection schemes under low fault levels needs to be reviewed, particularly for future GB transmission network operating scenarios. The paper evaluates the transmission line challenges under high penetration of renewable energy sources and provides alternative, viable protection solutions based on the problems observed. The paper considers the assessment of renewable energy sources (RES) based on fully rated converter technology.
The DIgSILENT PowerFactory software tool will be used to model the network.
Keywords: fault level, protection schemes, relay settings, relay coordination, renewable energy sources
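The core concern, that a lower fault level slows conventional protection, can be illustrated with the IEC 60255 standard-inverse overcurrent characteristic; the settings and currents below are invented for illustration, not taken from the GB network:

```python
# IEC 60255 standard-inverse time-overcurrent characteristic:
#   t = TMS * 0.14 / ((I / Is)**0.02 - 1)
# where I is the fault current, Is the relay current setting, and TMS the
# time multiplier setting. All values below are illustrative only.

def iec_standard_inverse_time(fault_current_a, pickup_current_a, tms=0.1):
    m = fault_current_a / pickup_current_a
    if m <= 1.0:
        raise ValueError("relay does not operate below its current setting")
    return tms * 0.14 / (m ** 0.02 - 1.0)

# Strong (synchronous-dominated) vs weak (converter-dominated) infeed:
t_strong = iec_standard_inverse_time(8000.0, 400.0)  # high fault level
t_weak = iec_standard_inverse_time(1200.0, 400.0)    # reduced fault level
# The relay operates markedly more slowly when the fault level falls, which
# is the adverse effect on backup overcurrent protection discussed above.
```

For these numbers the operating time grows from roughly 0.23 s to 0.63 s as the fault current drops, and a converter-limited infeed near the pickup setting may not trip at all.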
Procedia PDF Downloads 208
1594 An Empirical Study on the Integration of Listening and Speaking Activities with Writing Instruction for Middle School English Language Learners
Authors: Xueyan Hu, Liwen Chen, Weilin He, Sujie Peng
Abstract:
Writing is an important but challenging skill for English language learners. Due to the small amount of time allocated for writing classes at school, students have relatively few opportunities to practice writing in the classroom. While the practice of integrating listening and speaking activities with writing instruction has been used for adult English language learners, its application for young English learners has seldom been examined, because listening and speaking activities are challenging for them. This study integrated listening and speaking activities with writing instruction for middle school English language learners so as to improve their writing achievement and their writing ability in terms of word use, coherence, and complexity. Guided by Gagne's information processing learning theory and memetics, this study conducted an 8-week writing instruction with an experimental class (n=44) and a control class (n=48). Students in the experimental class participated in a series of listening and retelling activities based on a writing sample the teacher used for writing instruction during each writing class period. Students in the control class were taught traditionally, with the teacher giving direct instruction using the writing sample. Using ANCOVA analysis of the scores of students' writing, word use, Chinese-English translation, and text structure, this study showed that the experimental writing instruction can significantly improve students' writing performance. Compared with the students in the control class, the students in the experimental class performed significantly better in word use and complexity in their essays. This study provides useful insights for the teaching of English writing to middle school English language learners.
Teachers can skillfully use information technology to integrate the teaching of listening, speaking, and writing, taking into account students' language input and output. Teachers need to select suitable, high-quality composition templates for students to ensure high-quality language input.
Keywords: writing instruction, retelling, English language learners, listening and speaking
Procedia PDF Downloads 86
1593 Formulation and Evaluation of Metformin Hydrochloride Microparticles via BÜCHI Nano-Spray Dryer B-90
Authors: Tamer Shehata
Abstract:
Recently, nanotechnology has attracted great interest in the field of pharmaceutical production. Several pieces of pharmaceutical equipment have been introduced into the research field for the production of nanoparticles, among them BÜCHI's fourth-generation nano-spray dryer B-90. The B-90 is distinguished by a single step of production and drying of nano- and microparticles. Currently, our research group is investigating several pharmaceutical formulations utilizing BÜCHI Nano-Spray Dryer B-90 technology. One of our projects is the formulation and evaluation of metformin hydrochloride mucoadhesive microparticles for the treatment of type 2 diabetes. Several polymers were investigated, among them gelatin and sodium alginate, both natural polymers with mucoadhesive properties. Preformulation studies such as atomization head mesh size, flow rate, head temperature, polymer solution viscosity and surface tension were performed. Postformulation characteristics such as particle size, flowability, surface scan and dissolution profile were evaluated. Finally, the pharmacological activity of a selected formula was evaluated in streptozotocin-induced diabetic rats. The B-90's spray head had a 7 µm hole heated to 120 °C with an air flow rate of 3.5 mL/min. The viscosity of the solution was less than 11.5 cP, with surface tension less than 70.1 dyne/cm. Discrete, non-aggregated particles and free-flowing powders with particle size less than 2000 nm were successfully obtained. A gelatin and sodium alginate combination in a 1:3 ratio successfully sustained the in vitro release profile of the drug. Hypoglycemic evaluation of this formula showed a significant reduction of blood glucose level over 24 h. In conclusion, mucoadhesive metformin hydrochloride microparticles obtained from the B-90 could offer a convenient dosage form with enhanced hypoglycemic activity.
Keywords: mucoadhesive, microparticles, metformin hydrochloride, nano-spray dryer
Procedia PDF Downloads 311
1592 Factors Promoting French-English Tweets in France
Authors: Taoues Hadour
Abstract:
Twitter has become a popular means of communication used in a variety of fields, such as politics, journalism, and academia. This widely used online platform has an impact on the way people express themselves and is changing language usage worldwide at an unprecedented pace. The language used online reflects the linguistic battle that has been going on for several decades in French society, and this study enables a deeper understanding of users' linguistic behavior online. The implications are important and raise awareness of intercultural and cross-language exchanges. This project investigates mixed French-English language usage among French users of Twitter using a topic analysis approach, drawing on Gumperz's theory of conversational switching. To collect tweets at a large scale, data were gathered in R with the rtweet package, which accesses and retrieves French tweet data through Twitter's REST and streaming APIs (Application Program Interfaces), using RStudio, the integrated development environment for R. The dataset was filtered manually, and certain recurring themes were observed. A total of nine topic categories were identified and analyzed in this study: entertainment, internet/social media, events/community, politics/news, sports, sex/pornography, innovation/technology, fashion/make-up, and business. The study reveals that entertainment is the most frequent topic discussed on Twitter; entertainment includes movies, music, games, and books. Anglicisms such as trailer, spoil, and live are identified in the data. Change in language usage is inevitable and is a natural result of linguistic interactions. The use of different languages online is just an example of what the real world would look like without linguistic regulations.
Social media reveals a multicultural and multilinguistic richness which can deepen and expand our understanding of contemporary human attitudes.
Keywords: code-switching, French, sociolinguistics, Twitter
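One step of such a topic analysis, assigning each tweet to a category and ranking categories by frequency, can be sketched as follows; the keyword lists and sample tweets are invented (the study's categories were derived from its actual dataset):

```python
# Toy topic tagger: match each tweet against hypothetical per-topic keyword
# sets (first match wins, in insertion order), then count topic frequencies.
from collections import Counter

TOPIC_KEYWORDS = {
    "entertainment": {"trailer", "spoil", "live", "film", "album"},
    "sports": {"match", "but", "score"},
    "politics/news": {"election", "loi", "president"},
}

def tag_topic(tweet):
    words = set(tweet.lower().split())
    for topic, keywords in TOPIC_KEYWORDS.items():
        if words & keywords:
            return topic
    return "other"

# Invented French tweets containing the anglicisms noted above:
tweets = [
    "le trailer du film est incroyable",
    "ne me spoil pas la série",
    "quel match hier soir",
    "nouvelle loi annoncée",
]
counts = Counter(tag_topic(t) for t in tweets)
```

Even this crude matcher reproduces the shape of the finding: the "entertainment" bucket, which collects the borrowed anglicisms, ends up the most frequent category.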
Procedia PDF Downloads 138
1591 Urinary Exosome miR-30c-5p as a Biomarker for Early-Stage Clear Cell Renal Cell Carcinoma
Authors: Shangqing Song, Bin Xu, Yajun Cheng, Zhong Wang
Abstract:
miRNAs derived from exosomes in body fluids such as urine are regarded as potential biomarkers for the diagnosis and prognosis of various human cancers, as mature miRNAs can be stably preserved by exosomes. However, their potential value in clear cell renal cell carcinoma (ccRCC) diagnosis and prognosis remains unclear. In the present study, differentially expressed miRNAs from urinary exosomes were identified by next-generation sequencing (NGS) technology. Sixteen differentially expressed miRNAs were identified between ccRCC patients and healthy donors. To explore a specific diagnostic biomarker of ccRCC, we validated these urinary exosomes by qRT-PCR in 70 early-stage renal cancer patients, 30 healthy people, and patients with other urinary system cancers, including 30 early-stage prostate cancer patients and 30 early-stage bladder cancer patients. The results showed that urinary exosome miR-30c-5p could be stably amplified; meanwhile, the expression of miR-30c-5p showed no significant difference between the other urinary system cancers and healthy controls. However, the expression level of miR-30c-5p in urinary exosomes of ccRCC patients was lower than in healthy people, and the receiver operating characteristic (ROC) curve showed that the area under the curve (AUC) value was 0.8192 (95% confidence interval 0.7388-0.8996, P = 0.0000). In addition, up-regulating miR-30c-5p expression could inhibit renal cell carcinoma cell growth. Lastly, HSPA5 was found to be a direct target gene of miR-30c-5p, and HSPA5 depletion reversed the promoting effect on ccRCC growth caused by the miR-30c-5p inhibitor. In conclusion, this study demonstrated that urinary exosomal miR-30c-5p is readily accessible as a diagnostic biomarker of early-stage ccRCC, and that miR-30c-5p might modulate the expression of HSPA5, which correlates with the progression of ccRCC.
Keywords: clear cell renal cell carcinoma, exosome, HSPA5, miR-30c-5p
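The reported AUC has a simple rank-based reading: for a marker that is lower in patients, the AUC equals the probability that a randomly chosen patient shows a lower level than a randomly chosen control. A sketch with synthetic expression values (not the study's data):

```python
# Rank-based AUC (Mann-Whitney U / (n1 * n2)) for a "low in cases" marker.
# Expression values below are synthetic placeholders.

def auc_low_marker(case_levels, control_levels):
    wins = 0.0
    for c in case_levels:
        for h in control_levels:
            if c < h:
                wins += 1.0
            elif c == h:
                wins += 0.5  # ties count half
    return wins / (len(case_levels) * len(control_levels))

cases = [0.8, 1.1, 0.9, 1.4, 0.7]   # lower exosomal miR-30c-5p in ccRCC
controls = [1.5, 1.2, 1.9, 1.6]
auc = auc_low_marker(cases, controls)
```

An AUC of 0.5 would mean the marker cannot separate the groups at all, while the study's 0.8192 indicates good, though not perfect, discrimination.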
Procedia PDF Downloads 270
1590 NiFe-Type Catalysts for Anion Exchange Membrane (AEM) Electrolyzers
Authors: Boldin Roman, Liliana Analía Diaz
Abstract:
As the hydrogen economy continues to expand, reducing energy consumption and emissions while stimulating economic growth, the development of efficient and cost-effective hydrogen production technologies is critical. Among the various methods, anion exchange membrane (AEM) water electrolysis stands out due to its potential for using non-noble metal catalysts. The exploration and enhancement of non-noble metal catalysts, such as NiFe-type catalysts, are pivotal for the advancement of AEM technology, ensuring its commercial viability and environmental sustainability. NiFe-type catalysts were synthesized through electrodeposition and characterized both electrochemically and physico-chemically. Various supports, including Ni foam and Ni mesh, were used as porous transport layers (PTLs) to evaluate the effective catalyst thickness and the influence of the PTL in a 5 cm² AEM electrolyzer. This methodological approach allows for a detailed assessment of catalyst performance under operational conditions typical of industrial hydrogen production. The study revealed that electrodeposited non-noble multi-metallic catalysts maintain stable performance as anodes in AEM water electrolysis. NiFe-type catalysts demonstrated superior activity, with the NiFeCoP alloy outperforming the others by delivering the lowest overpotential and the highest current density. Furthermore, the use of different PTLs had significant effects on the electrochemical behavior of the catalysts, indicating that PTL selection is crucial for optimizing performance and efficiency in AEM electrolyzers. Conclusion: The research underscores the potential of non-noble metal catalysts for enhancing efficiency and reducing the costs of AEM electrolyzers. The findings highlight the importance of catalyst and PTL optimization in developing scalable and economically viable hydrogen production technologies.
Continued innovation in this area is essential for supporting the growth of the hydrogen economy and achieving sustainable energy solutions.
Keywords: AEMWE, electrocatalyst, hydrogen production, water electrolysis
Procedia PDF Downloads 32
1589 Shape Management Method of Large Structure Based on Octree Space Partitioning
Authors: Gichun Cha, Changgil Lee, Seunghee Park
Abstract:
The objective of the study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology has only recently been attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures. TLS provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and the maintenance of civil infrastructure. TLS produces a huge amount of point cloud data, and registration, extraction, and visualization of the data require the processing of a massive amount of scan data. The octree can be applied to the shape management of large structures because the scan data are reduced in size while the data attributes are maintained. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University, the scanned structure is a steel-frame bridge, and the TLS used was a Leica ScanStation C10/C5. The scan data were condensed by 92%, and the octree model was constructed with a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, creating a basis for the shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning
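The voxel conversion and sampling step can be sketched as follows; the point grid and resolution are invented for illustration and happen to reproduce a condensation similar to the 92% reported (recursively splitting each occupied voxel into eight children would give the full octree hierarchy):

```python
# Minimal sketch of voxel-based point-cloud condensation: hash each point
# into a cubic voxel of edge length `resolution` and keep one centroid per
# occupied voxel, so the data size shrinks while spatial attributes survive.
from collections import defaultdict

def voxel_downsample(points, resolution):
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // resolution), int(y // resolution), int(z // resolution))
        buckets[key].append((x, y, z))
    return [tuple(sum(axis) / len(pts) for axis in zip(*pts))
            for pts in buckets.values()]

# A synthetic 100 x 10 planar grid of 1000 points:
cloud = [(float(i), float(j), 0.0) for i in range(100) for j in range(10)]
sampled = voxel_downsample(cloud, resolution=4.0)
# 1000 points collapse to 75 voxel centroids, a 92.5% condensation.
```

Floor division makes the voxel keys consistent for negative coordinates as well, which matters when the scanner origin sits inside the structure.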
Procedia PDF Downloads 297
1588 Dynamic Control Theory: A Behavioral Modeling Approach to Demand Forecasting amongst Office Workers Engaged in a Competition on Energy Shifting
Authors: Akaash Tawade, Manan Khattar, Lucas Spangher, Costas J. Spanos
Abstract:
Many grids are increasing the share of renewable energy in their generation mix, which is causing the energy generation to become less controllable. Buildings, which consume nearly 33% of all energy, are a key target for demand response: i.e., mechanisms for demand to meet supply. Understanding the behavior of office workers is a start towards developing demand response for one sector of building technology. The literature notes that dynamic computational modeling can be predictive of individual action, especially given that occupant behavior is traditionally abstracted from demand forecasting. Recent work founded on Social Cognitive Theory (SCT) has provided a promising conceptual basis for modeling behavior, personal states, and environment using control theoretic principles. Here, an adapted linear dynamical system of latent states and exogenous inputs is proposed to simulate energy demand amongst office workers engaged in a social energy shifting game. The energy shifting competition is implemented in an office in Singapore that is connected to a minigrid of buildings with a consistent 'price signal.' This signal is translated into a 'points signal' by a reinforcement learning (RL) algorithm to influence participant energy use. The dynamic model functions at the intersection of the points signals, baseline energy consumption trends, and SCT behavioral inputs to simulate future outcomes. This study endeavors to analyze how the dynamic model trains an RL agent and, subsequently, the degree of accuracy to which load deferability can be simulated. The results offer a generalizable behavioral model for energy competitions that provides the framework for further research on transfer learning for RL and, more broadly, transactive control.
Keywords: energy demand forecasting, social cognitive behavioral modeling, social game, transfer learning
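A stripped-down version of such a latent-state model can be sketched as follows; the matrices, signals, and the assumption that a higher points signal suppresses demand are invented for illustration, not fitted to the Singapore office data:

```python
# Toy linear dynamical system: x_{t+1} = A x_t + B u_t, demand_t = C x_t,
# with a two-dimensional latent behavioral state x and a scalar exogenous
# "points signal" u. All matrices below are illustrative placeholders.

def simulate(A, B, C, x0, inputs):
    x = list(x0)
    demands = []
    for u in inputs:
        demands.append(C[0] * x[0] + C[1] * x[1])  # linear demand readout
        x = [A[0][0] * x[0] + A[0][1] * x[1] + B[0] * u,
             A[1][0] * x[0] + A[1][1] * x[1] + B[1] * u]
    return demands

A = [[0.9, 0.05], [0.0, 0.8]]   # stable dynamics (eigenvalues below 1)
B = [-0.1, 0.02]                # assumed: points signal suppresses state 0
C = [1.0, 0.5]
flat = simulate(A, B, C, x0=[1.0, 0.5], inputs=[0.0] * 20)
incentive = simulate(A, B, C, x0=[1.0, 0.5], inputs=[1.0] * 20)
# With the incentive signal held on, predicted demand ends lower.
```

The same structure is what an RL agent would probe: choosing the input sequence u_t so that the predicted demand trajectory shifts load as intended.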
Procedia PDF Downloads 109
1587 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover
Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae
Abstract:
Supermodularity, or complementarity, is a popular concept in economics which can characterize many objective functions such as utility, social welfare, and production functions. Further, supermodular dominance captures a preference for greater interdependence among the inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. We therefore propose and justify a consistent test for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite-sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without selective recentering, and from the subsampling method. Under various parameter settings, we confirmed that our test has reasonably good size and power performance. Finally, we apply our test to compare geographic and distant knowledge spillovers in terms of their effects on social welfare, using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if the geographic knowledge spillover engenders greater social welfare than the distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we found industry-wise and patent subclass-wise differences in the pattern of supermodular dominance between localized and distant citing. We also compare the results from analyzing different time periods to see if the development of Internet and communication technology has changed the pattern of the dominance. In addition, to deal appropriately with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.
Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling
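The defining inequality behind supermodularity can be checked numerically on a toy grid; the two example functions are standard textbook cases, not the paper's welfare estimands:

```python
# f is supermodular when f(x v y) + f(x ^ y) >= f(x) + f(y), where v and ^
# are the componentwise max (join) and min (meet). The brute-force check
# below verifies the inequality over all pairs on a small 2-D grid.
import itertools

def is_supermodular_on_grid(f, grid):
    for x in itertools.product(grid, repeat=2):
        for y in itertools.product(grid, repeat=2):
            join = tuple(max(a, b) for a, b in zip(x, y))
            meet = tuple(min(a, b) for a, b in zip(x, y))
            if f(join) + f(meet) < f(x) + f(y) - 1e-12:
                return False
    return True

grid = [0.0, 0.5, 1.0, 1.5]
complements = lambda p: p[0] * p[1]      # inputs reinforce each other
substitutes = lambda p: -(p[0] * p[1])   # inputs offset each other
```

The product function passes (complementary inputs), while its negation fails, e.g., at x = (1, 0), y = (0, 1), which is exactly the kind of interdependence the dominance ordering in the abstract is built on.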
Procedia PDF Downloads 130
1586 Assessment of Hydrogen Demand for Different Technological Pathways to Decarbonise the Aviation Sector in Germany
Authors: Manish Khanra, Shashank Prabhu
Abstract:
The decarbonization of hard-to-abate sectors is currently high on the agenda in the EU and its member states, as these sectors account for substantial shares of overall GHG emissions yet face serious challenges in decarbonizing. In particular, the aviation sector accounts for 2.8% of global anthropogenic CO₂ emissions, and these emissions are anticipated to grow dramatically unless immediate mitigation efforts are implemented. Hydrogen and its derivatives based on renewable electricity can play a key role in the transition towards CO₂-neutral flight. Energy carriers in the form of drop-in fuels, direct hydrogen combustion, and hydrogen-to-electric propulsion hold substantial shares in most scenarios towards 2050. To create appropriate policies for ramping up the production and utilisation of hydrogen commodities in the German aviation sector, a detailed analysis of the spatial distribution of supply and demand sites is essential. The objective of this research is to assess the demand for hydrogen-based alternative fuels in the German aviation sector required to achieve the 'Net Zero' scenario by 2050. We analyse the technological pathways for producing these fuels and utilising them in various aircraft options to reach mitigation targets. Our method is a data-driven, bottom-up assessment that considers production and demand sites and their spatial distribution. The resulting energy demand and its spatial distribution, with consideration of technology diffusion, yield a possible transition pathway for the aviation sector to meet short- and long-term mitigation targets. Additionally, we discuss the costs and policy aspects of achieving these targets, which would support decision-makers in the airline industry, policymakers, and producers of energy commodities.
Keywords: aviation sector, hard-to-abate sectors, hydrogen demand, alternative fuels, technological pathways, data-driven approach
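A bottom-up demand assessment of this kind multiplies final energy demand by the assumed share of each technological pathway and by a pathway-specific hydrogen conversion factor. The sketch below illustrates that arithmetic only; every figure (fuel demand, shares, conversion factors) is a hypothetical placeholder, not data from the study.

```python
# Minimal bottom-up hydrogen-demand sketch; all numbers are illustrative assumptions.
jet_fuel_demand_twh = 100.0            # hypothetical annual German jet-fuel demand (TWh final energy)

pathway_shares = {                     # assumed 2050 shares of the three pathways
    "drop_in_synfuel": 0.55,
    "direct_h2_combustion": 0.30,
    "hydrogen_to_electric": 0.15,
}

h2_per_final_twh = {                   # assumed TWh of H2 input per TWh of final energy
    "drop_in_synfuel": 1.4,            # power-to-liquid synthesis losses
    "direct_h2_combustion": 1.0,
    "hydrogen_to_electric": 1.1,       # fuel-cell conversion overhead
}

h2_demand = {
    p: jet_fuel_demand_twh * share * h2_per_final_twh[p]
    for p, share in pathway_shares.items()
}

for p, d in h2_demand.items():
    print(f"{p}: {d:.1f} TWh H2")
print(f"total: {sum(h2_demand.values()):.1f} TWh H2")  # → total: 123.5 TWh H2
```

A spatially resolved version of this calculation would replace the single demand figure with airport-level fuel demand and map it against candidate hydrogen production sites, as the abstract describes.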
Procedia PDF Downloads 131