Search results for: energy efficiency measures
3104 Metal-Based Deep Eutectic Solvents for Extractive Desulfurization of Fuels: Analysis from Molecular Dynamics Simulations
Authors: Aibek Kukpayev, Dhawal Shah
Abstract:
Combustion of sour fuels containing a high amount of sulfur leads to the formation of sulfur oxides, which harm the environment and have a negative impact on human health. Considering this, several pieces of legislation have been imposed to bring the sulfur content in fuel down to less than 10 ppm. In recent years, novel deep eutectic solvents (DESs) have been developed to achieve deep desulfurization, particularly to extract thiophenic compounds from liquid fuels. These novel DESs, considered analogous to ionic liquids, are green, eco-friendly, inexpensive, and sustainable. We herein use molecular dynamics simulations to analyze the interactions of metal-based DESs with a model oil consisting of thiophenic compounds. The DESs used consist of polyethylene glycol (PEG-200) as a hydrogen bond donor, choline chloride (ChCl) or tetrabutylammonium chloride (TBAC) as a hydrogen bond acceptor, and cobalt chloride (CoCl₂) as a metal salt. In particular, the combination of ChCl:PEG-200:CoCl₂ at a ratio of 1:2:1 and the combination of TBAC:PEG-200:CoCl₂ at a ratio of 1:2:0.25 were simulated, separately, with a model oil consisting of octane and thiophenes at 25 °C and 1 bar. The results of the molecular dynamics simulations were analyzed in terms of the interaction energies between the different components. The simulations revealed a stronger DES/thiophene interaction compared with the octane/thiophene interaction, suggestive of an efficient desulfurization process. In addition, our analysis suggests that the choice of hydrogen bond acceptor strongly influences the efficiency of the desulfurization process. Taken together, the results also show the importance of the metal ion, although present in a small amount, and the role of the polymer in the desulfurization of the model fuel.
Keywords: deep eutectic solvents, desulfurization, molecular dynamics simulations, thiophenes
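The abstract's central comparison rests on component-to-component interaction energies extracted from the trajectories. As a rough illustration of the kind of post-processing involved (not the authors' workflow, which would use force-field-specific parameters per atom type, periodic boundary conditions, and averaging over many frames), a minimal Python sketch of a pairwise Lennard-Jones plus Coulomb energy between two atom groups in a single frame could look like this:

    import numpy as np

    COULOMB_K = 138.935458  # Coulomb constant in kJ mol^-1 nm e^-2 (GROMACS-style units)

    def pair_energy(pos_a, q_a, pos_b, q_b, sigma, epsilon):
        """Sum of Lennard-Jones + Coulomb energies between every atom in group A
        (e.g. DES atoms) and every atom in group B (e.g. thiophene atoms) for one
        frame. Positions in nm, charges in e; a single sigma/epsilon is a simplification."""
        r = np.linalg.norm(pos_a[:, None, :] - pos_b[None, :, :], axis=-1)  # pairwise distances
        lj = 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
        coul = COULOMB_K * (q_a[:, None] * q_b[None, :]) / r
        return lj.sum() + coul.sum()

    # Toy frame: 3 "DES" atoms and 2 "thiophene" atoms (illustrative positions and charges)
    pos_des = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.0, 0.3, 0.0]])
    q_des = np.array([-0.4, 0.2, 0.2])
    pos_thio = np.array([[0.8, 0.0, 0.0], [1.0, 0.2, 0.0]])
    q_thio = np.array([-0.1, 0.1])

    e_int = pair_energy(pos_des, q_des, pos_thio, q_thio, sigma=0.35, epsilon=0.5)
    print(f"DES-thiophene interaction energy: {e_int:.2f} kJ/mol")

Summing such group-group energies over the trajectory and comparing the DES/thiophene total with the octane/thiophene total is what supports the selectivity argument made above.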
Procedia PDF Downloads 150
3103 Coupling Random Demand and Route Selection in the Transportation Network Design Problem
Authors: Shabnam Najafi, Metin Turkay
Abstract:
The network design problem (NDP) is used to determine the set of optimal values for certain pre-specified decision variables, such as capacity expansion of nodes and links, by optimizing various system performance measures including safety, congestion, and accessibility. The designed transportation network should improve the objective functions defined for the system while also accounting for the route choice behavior of network users. NDP studies have mostly investigated random demand and route selection constraints separately because of the computational challenges involved. In this work, we consider random demand and route selection constraints simultaneously. We present a nonlinear stochastic model for the land use and road network design problem that addresses the development of different functional zones in urban areas by considering both a cost function and air pollution. The model minimizes cost and air pollution simultaneously, with random demand and a stochastic route selection constraint, and aims to optimize network performance via road capacity expansion. The Bureau of Public Roads (BPR) link impedance function is used to determine the travel time on each link. We consider a city with origin and destination nodes that can be residential, employment, or both, and a set of existing paths between origin-destination (O-D) pairs. The case of an increasing employed population is analyzed to determine the amount of road capacity and the size of origin zones simultaneously. Minimizing the travel and expansion cost of routes and origin zones on one side and minimizing CO emissions on the other side are considered at the same time. Demand between O-D pairs is random, and the network flow pattern is subject to stochastic user equilibrium, specifically a logit route choice model; treating both demand and route choice as random makes the model more applicable to the design of urban networks. The epsilon-constraint method, which can solve both linear and nonlinear multi-objective problems, is used to solve the problem: the first objective (the cost function) is kept as the objective function, and the second objective is turned into a constraint required to be less than an epsilon, where epsilon is an upper bound on the emission function. The value of epsilon is varied from the worst to the best value of the emission function to generate the family of solutions representing the Pareto set. A numerical example with 2 origin zones, 2 destination zones, and 7 links is solved in GAMS, and the set of Pareto points is obtained. There are 15 efficient solutions; as the cost function value increases, the emission function value decreases, and vice versa.
Keywords: epsilon-constraint, multi-objective, network design, stochastic
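Two concrete ingredients are named in the abstract: the BPR link impedance function and the epsilon-constraint method. The following Python sketch shows both on a deliberately tiny stand-in problem rather than the authors' GAMS model; the 0.15 and 4.0 BPR coefficients are the customary defaults, and the two toy objectives are purely illustrative:

    import numpy as np
    from scipy.optimize import minimize

    def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4.0):
        """Standard BPR link impedance: congested travel time on one link."""
        return t0 * (1.0 + alpha * (volume / capacity) ** beta)

    print("BPR time:", bpr_travel_time(t0=10.0, volume=800, capacity=1000))

    # Toy two-objective problem in one decision variable x (e.g. a capacity expansion):
    cost = lambda x: (x[0] - 3.0) ** 2          # stand-in for the cost objective
    emission = lambda x: (x[0] - 1.0) ** 2      # stand-in for the CO-emission objective

    pareto = []
    for eps in np.linspace(4.0, 0.0, 15):       # sweep epsilon from worst to best emission
        res = minimize(cost, x0=[2.0],
                       constraints=[{"type": "ineq",
                                     "fun": lambda x, e=eps: e - emission(x)}])
        if res.success:
            pareto.append((cost(res.x), emission(res.x)))

    for c, e in pareto:
        print(f"cost = {c:.3f}, emission = {e:.3f}")

Sweeping epsilon from the worst to the best emission value, as in the loop above, is what traces out the family of Pareto points reported in the abstract.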
Procedia PDF Downloads 648
3102 The Role of Community Beliefs and Practices on the Spread of Ebola in Uganda, September 2022
Authors: Helen Nelly Naiga, Jane Frances Zalwango, Saudah N. Kizito, Brian Agaba, Brenda N Simbwa, Maria Goretti Zalwango, Richard Migisha, Benon Kwesiga, Daniel Kadobera, Alex Ario Riolexus, Sarah Paige, Julie R. Harris
Abstract:
Background: Traditional community beliefs and practices can facilitate the spread of Ebola virus during outbreaks. On September 20, 2022, Uganda declared a Sudan Virus Disease (SVD) outbreak after a case was confirmed in Mubende District. During September–November 2022, the outbreak spread to eight additional districts. We investigated the role of community beliefs and practices in the spread of SUDV in Uganda in 2022. Methods: A qualitative study was conducted in Mubende, Kassanda, and Kyegegwa districts in February 2023. We conducted nine focus group discussions (FGDs) and six key informant interviews (KIIs). FGDs included SVD survivors, household members of SVD patients, traditional healers, religious leaders, and community leaders. Key informants included community, political, and religious leaders, traditional healers, and health workers. We asked about community beliefs and practices to understand if and how they contributed to the spread of SUDV. Interviews were recorded, translated, transcribed, and analyzed thematically. Results: Frequently-reported themes included beliefs that the community deaths, later found to be due to SVD, were the result of witchcraft or poisoning. Key informants reported that SVD patients frequently first consulted traditional healers or spiritual leaders before seeking formal healthcare, and noted that traditional healers treated patients with signs and symptoms of SVD without protective measures. Additional themes included religious leaders conducting laying-on-of-hands prayers for SVD patients and symptomatic contacts, SVD patients and their symptomatic contacts hiding in friends’ homes, and exhumation of SVD patients originally buried in safe and dignified burials, to enable traditional burials. Conclusion: Multiple community beliefs and practices likely promoted SVD outbreak spread during the 2022 outbreak in Uganda. Engaging traditional and spiritual healers early during similar outbreaks through risk communication and community engagement efforts could facilitate outbreak control. Targeted community messaging, including clear biological explanations for clusters of deaths and information on the dangers of exhuming bodies of SVD patients, could similarly facilitate improved control in future outbreaks in Uganda.Keywords: Ebola, Sudan virus, outbreak, beliefs, traditional
Procedia PDF Downloads 58
3101 Influence of the Adsorption of Anionic–Nonionic Surfactants/Silica Nanoparticles Mixture on Clay Rock Minerals in Chemical Enhanced Oil Recovery
Authors: C. Mendoza Ramírez, M. Gambús Ordaz, R. Mercado Ojeda.
Abstract:
Chemical flooding with surfactant solutions, based on their ability to reduce the interfacial tension between crude oil and water, is a potential application of chemical enhanced oil recovery (CEOR). However, the high retention of surfactants associated with adsorption in the porous medium, together with the complexity of the mineralogical composition of the reservoir rock, limits the efficiency of crude oil displacement. This study evaluates the effect of the concentration of a mixture of anionic-nonionic surfactants with silica nanoparticles in a rock sample composed of 25.14% clay minerals of the kaolinite, chlorite, halloysite, and montmorillonite type, according to the results of X-ray diffraction and scanning electron microscopy analyses (XRD and SEM, respectively). The amount of the surfactant mixture adsorbed on the clay rock minerals was determined from its calibration curve and the 4-Region Isotherm Model using UV-Visible spectroscopy. The adsorption of the surfactant on the clay rock averages 32% across all concentrations, influenced by the surface area of the substrate (1.6 m²/g) and by the mineralogical composition of the clay, which increases the cation exchange capacity (CEC). In addition, in Regions I and II a final concentration could not be measured by UV-Vis, owing to the surfactant's ionic nature, its high affinity for the clay rock, and its low concentration. Finally, for potential CEOR applications, the adsorption of these mixed surfactant systems is of industrial relevance, and it is concluded that concentrations in Regions III and IV can be used; the adsorption initially has an increasing slope and then levels off at equilibrium, where interfacial tension values on the order of 10⁻¹ mN/m are reached.
Keywords: anionic–nonionic surfactants, clay rock, adsorption, 4-region isotherm model, cation exchange capacity, critical micelle concentration, enhanced oil recovery
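The adsorbed amount is obtained from a UV-Vis calibration curve and a depletion balance. Assuming the usual procedure (a linear Beer-Lambert calibration and q = (C0 - Ce)V/m), a short Python sketch with hypothetical numbers, not the study's data, would be:

    import numpy as np

    # Hypothetical calibration data: absorbance vs. known surfactant concentration (g/L)
    cal_conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
    cal_abs = np.array([0.06, 0.12, 0.25, 0.49, 0.98])
    slope, intercept = np.polyfit(cal_conc, cal_abs, 1)   # Beer-Lambert-type linear fit

    def concentration_from_absorbance(a):
        return (a - intercept) / slope

    # Depletion method: amount adsorbed per gram of clay rock (all values assumed)
    c0 = 0.50                                  # initial surfactant concentration, g/L
    ce = concentration_from_absorbance(0.42)   # equilibrium concentration from measured absorbance
    volume_l = 0.05                            # solution volume in contact with the rock, L
    mass_g = 1.0                               # mass of crushed rock, g

    q = (c0 - ce) * volume_l / mass_g          # adsorbed amount, g surfactant per g rock
    retention_pct = 100.0 * (c0 - ce) / c0
    print(f"adsorbed amount q = {q*1000:.2f} mg/g, retention = {retention_pct:.1f}%")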
Procedia PDF Downloads 73
3100 The Evaluation of Current Pile Driving Prediction Methods for Driven Monopile Foundations in London Clay
Authors: John Davidson, Matteo Castelletti, Ismael Torres, Victor Terente, Jamie Irvine, Sylvie Raymackers
Abstract:
The current industry approach to pile driving predictions consists of developing a model of the hammer-pile-soil system which simulates the relationship between soil resistance to driving (SRD) and blow counts (or pile penetration per blow). The SRD methods traditionally used are broadly based on static pile capacity calculations. The SRD is used in combination with the one-dimensional wave equation model to indicate the anticipated blowcounts with depth for specific hammer energy settings. This approach has predominantly been calibrated on relatively long slender piles used in the oil and gas industry but is now being extended to allow calculations to be undertaken for relatively short rigid large diameter monopile foundations. This paper evaluates the accuracy of current industry practice when applied to a site where large diameter monopiles were installed in predominantly stiff fissured clay. Actual geotechnical and pile installation data, including pile driving records and signal matching analysis (based upon pile driving monitoring techniques), were used for the assessment on the case study site.Keywords: driven piles, fissured clay, London clay, monopiles, offshore foundations
Procedia PDF Downloads 226
3099 Combination Method Cold Plasma and Liquid Threads
Authors: Nino Tsamalaidze
Abstract:
Cold plasma is an ionized, quasi-neutral gas with a temperature of 30-40 degrees, but its impact includes not only the gas itself but also active molecules, charged particles, heat, and low-power UV radiation. The main goal of the technology we describe is to trigger the skin's natural regeneration and improve its internal metabolism, which leads to a pronounced rejuvenation effect. In particular, it can eliminate fine mimic wrinkles; remove wrinkles around the mouth (purse-string wrinkles); reduce the overhang of the upper eyelid; eliminate bags under the eyes; provide a lifting effect on the oval of the face; reduce stretch marks; shrink pores; even out the skin and reduce the appearance of acne and scars; and remove pigmentation. The major findings of the study are based on current patient practice. The method combines cold plasma with liquid threads. The advantage of cold plasma is undoubtedly its efficiency: the result of its application can be compared with that of a surgical facelift, even though the procedure is non-invasive and the risks are minimized. Another advantage is that the technique can be applied to the most sensitive skin of the face, namely the eyelids and the area around the eyes. Cold plasma is one of the few techniques that eliminates bags under the eyes and overhanging eyelids without violating the integrity of the tissues. In addition to rejuvenation and a lifting effect, the benefits of cold plasma include removal of scars, couperose, stretch marks, and other skin defects; plasma also helps eliminate acne, seborrhea, and skin fungus and even heals ulcers. The cold plasma method makes it possible to achieve a result similar to blepharoplasty: carried out on the skin of the eyelids, the procedure allows non-surgical correction of the eyelid line in 3-4 sessions. One of the undoubted advantages of this method is a short rehabilitation period and rapid healing of the skin.
Keywords: wrinkles, telangiectasia, pigmentation, pore closing
Procedia PDF Downloads 86
3098 Fabrication of Coatable Polarizer by Guest-Host System for Flexible Display Applications
Authors: Rui He, Seung-Eun Baik, Min-Jae Lee, Myong-Hoon Lee
Abstract:
The polarizer is one of the most essential optical elements in LCDs. Currently, the most widely used polarizers for LCD is the derivatives of the H-sheet polarizer. There is a need for coatable polarizers which are much thinner and more stable than H-sheet polarizers. One possible approach to obtain thin, stable, and coatable polarizers is based on the use of highly ordered guest-host system. In our research, we aimed to fabricate coatable polarizer based on highly ordered liquid crystalline monomer and dichroic dye ‘guest-host’ system, in which the anisotropic absorption of light could be achieved by aligning a dichroic dye (guest) in the cooperative motion of the ordered liquid crystal (host) molecules. Firstly, we designed and synthesized a new reactive liquid crystalline monomer containing polymerizable acrylate groups as the ‘host’ material. The structure was confirmed by 1H-NMR and IR spectroscopy. The liquid crystalline behavior was studied by differential scanning calorimetry (DSC) and polarized optical microscopy (POM). It was confirmed that the monomers possess highly ordered smectic phase at relatively low temperature. Then, the photocurable ‘guest-host’ system was prepared by mixing the liquid crystalline monomer, dichroic dye and photoinitiator. Coatable polarizers were fabricated by spin-coating above mixture on a substrate with alignment layer. The in-situ photopolymerization was carried out at room temperature by irradiating UV light, resulting in the formation of crosslinked structure that stabilized the aligned dichroic dye molecules. Finally, the dichroic ratio (DR), order parameter (S) and polarization efficiency (PE) were determined by polarized UV/Vis spectroscopy. We prepared the coatable polarizers by using different type of dichroic dyes to meet the requirement of display application. The results reveal that the coatable polarizers at a thickness of 8μm exhibited DR=12~17 and relatively high PE (>96%) with the highest PE=99.3%, which possess potential for the LCD or flexible display applications.Keywords: coatable polarizer, display, guest-host, liquid crystal
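The reported figures (DR, S, PE) come from polarized UV/Vis spectra. Assuming the usual guest-host definitions, DR = A_parallel/A_perpendicular, order parameter S = (DR - 1)/(DR + 2), and a polarization efficiency computed from the parallel and crossed transmittances (conventions vary between papers), a small Python sketch of the calculation is:

    import numpy as np

    def guest_host_metrics(A_par, A_perp, T_par, T_perp):
        """Dichroic ratio and order parameter from polarized absorbances, plus
        polarization efficiency from polarized transmittances (one common
        definition; other papers use slightly different PE formulas)."""
        DR = A_par / A_perp
        S = (DR - 1.0) / (DR + 2.0)
        PE = 100.0 * np.sqrt((T_par - T_perp) / (T_par + T_perp))
        return DR, S, PE

    # Illustrative values at the dye's absorption maximum (not the study's data)
    DR, S, PE = guest_host_metrics(A_par=1.40, A_perp=0.10, T_par=0.40, T_perp=0.004)
    print(f"DR = {DR:.1f}, S = {S:.2f}, PE = {PE:.1f}%")

With the illustrative values shown, DR is about 14, S about 0.81, and PE about 99%, in the same range as the results quoted above.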
Procedia PDF Downloads 254
3097 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language
Authors: Wenjun Hou, Marek Perkowski
Abstract:
The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, no quantum circuit implementation of these algorithms has, to the best of our knowledge, been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measure based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover's algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating Grover's algorithm with an oracle that finds a successively lower cost each time makes it possible to transform the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobable superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, a node index calculator, a uniqueness checker, and a comparator, all of which were created using only quantum Toffoli gates, including their special forms, the Feynman (CNOT) and Pauli-X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle takes the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the Grover iteration an optimal number of times, a correct answer can be generated with very high probability. The oracle is then modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be reduced further. This algorithm and circuit design have been verified on several datasets to generate correct outputs.
Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover's algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language
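The key idea, repeating Grover's search with an oracle whose cost threshold is lowered after every successful run, can be illustrated without any quantum hardware. The Python sketch below is a purely classical mock-up of that control loop (the mock search simply enumerates cycles, whereas the paper's Q# oracle marks them in superposition); it is meant only to show the threshold-lowering logic, not the circuit itself:

    import itertools
    import random

    # Toy weighted graph as an adjacency matrix (4 nodes)
    W = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    N = len(W)

    def cycle_cost(perm):
        return sum(W[perm[i]][perm[(i + 1) % N]] for i in range(N))

    def mock_grover_search(threshold):
        """Stand-in for one Grover run: return some Hamiltonian cycle whose cost
        is below the current threshold, or None if the oracle marks no state."""
        candidates = [p for p in itertools.permutations(range(N))
                      if cycle_cost(p) < threshold]
        return random.choice(candidates) if candidates else None

    # Iteratively lower the cost threshold, as the modified oracle does
    threshold = float("inf")
    best = None
    while True:
        found = mock_grover_search(threshold)
        if found is None:
            break
        best, threshold = found, cycle_cost(found)
        print(f"found cycle {found} with cost {threshold}")

    print("minimum-cost Hamiltonian cycle:", best, "cost:", cycle_cost(best))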
Procedia PDF Downloads 193
3096 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises. Master Data Management (MDM) plays a crucial role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (Quantitative and Qualitative Analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.Keywords: artificial intelligence, master data management, data governance, data quality
Procedia PDF Downloads 22
3095 Multi-Level Clustering Based Congestion Control Protocol for Cyber Physical Systems
Authors: Manpreet Kaur, Amita Rani, Sanjay Kumar
Abstract:
The Internet of Things (IoT), a cyber-physical paradigm, allows a large number of devices to connect and send sensory data over the network simultaneously. The tremendous amount of data generated leads to a very high network load, consequently resulting in network congestion. This in turn leads to frequent loss of useful information and depletion of a significant amount of the nodes' energy. Therefore, there is a need to control congestion in the IoT so as to prolong network lifetime and improve the quality of service (QoS). Hence, we propose a two-level clustering-based routing algorithm that considers congestion score and packet priority metrics and focuses on minimizing network congestion. In the proposed Priority-based Congestion Control (PBCC) protocol, the sensor nodes in the IoT network form clusters, which reduces the amount of traffic, and the nodes are prioritized to emphasize important data. Simultaneously, a congestion score determines the occurrence of congestion at a particular node. The proposed protocol outperforms the existing Packet Discard Network Clustering (PDNC) protocol across variations in buffer size, packet transmission range, network region, and number of nodes, under various simulation scenarios.
Keywords: internet of things, cyber-physical systems, congestion control, priority, transmission rate
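The abstract does not give the exact formula for the congestion score or the priority rule, so the sketch below is only one plausible illustration: a node whose score is the fraction of occupied buffer space, which drops low-priority packets once that score crosses a threshold and always serves the highest-priority packet first. All names and thresholds here are assumptions, not the PBCC specification:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Packet:
        payload: str
        priority: int  # higher value means more important sensory data

    @dataclass
    class Node:
        buffer_size: int
        buffer: List[Packet] = field(default_factory=list)

        def congestion_score(self) -> float:
            # Illustrative score: fraction of the buffer currently occupied.
            return len(self.buffer) / self.buffer_size

        def enqueue(self, pkt: Packet, threshold: float = 0.8) -> bool:
            # When the node is congested, admit only high-priority packets.
            if self.congestion_score() >= threshold and pkt.priority < 2:
                return False  # drop low-priority packet
            if len(self.buffer) >= self.buffer_size:
                return False  # buffer full
            self.buffer.append(pkt)
            return True

        def dequeue(self) -> Packet:
            # Serve the highest-priority packet first.
            self.buffer.sort(key=lambda p: p.priority, reverse=True)
            return self.buffer.pop(0)

    node = Node(buffer_size=5)
    for i in range(7):
        accepted = node.enqueue(Packet(payload=f"reading-{i}", priority=i % 3))
        print(f"packet {i} (priority {i % 3}) accepted: {accepted}")
    print("served first:", node.dequeue().payload)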
Procedia PDF Downloads 310
3094 Building and Development of the Stock Market Institutional Infrastructure in Russia
Authors: Irina Bondarenko, Olga Vandina
Abstract:
The theory of evolutionary economics is the basis for preparation and application of methods forming the stock market infrastructure development concept. The authors believe that the basis for the process of formation and development of the stock market model infrastructure in Russia is the theory of large systems. This theory considers the financial market infrastructure as a whole on the basis of macroeconomic approach with the further definition of its aims and objectives. Evaluation of the prospects for interaction of securities market institutions will enable identifying the problems associated with the development of this system. The interaction of elements of the stock market infrastructure allows to reduce the costs and time of transactions, thereby freeing up resources of market participants for more efficient operation. Thus, methodology of the transaction analysis allows to determine the financial infrastructure as a set of specialized institutions that form a modern quasi-stable system. The financial infrastructure, based on international standards, should include trading systems, regulatory and supervisory bodies, rating agencies, settlement, clearing and depository organizations. Distribution of financial assets, reducing the magnitude of transaction costs, increased transparency of the market are promising tasks in the solution for questions of services level and quality increase provided by institutions of the securities market financial infrastructure. In order to improve the efficiency of the regulatory system, it is necessary to provide "standards" for all market participants. The development of a clear regulation for the barrier to the stock market entry and exit, provision of conditions for the development and implementation of new laws regulating the activities of participants in the securities market, as well as formulation of proposals aimed at minimizing risks and costs, will enable the achievement of positive results. The latter will be manifested in increasing the level of market participant security and, accordingly, the attractiveness of this market for investors and issuers.Keywords: institutional infrastructure, financial assets, regulatory system, stock market, transparency of the market
Procedia PDF Downloads 136
3093 An Adaptive Oversampling Technique for Imbalanced Datasets
Authors: Shaukat Ali Shahee, Usha Ananthakumar
Abstract:
A data set exhibits the class imbalance problem when one class has very few examples compared to the other class; this is also referred to as between-class imbalance. Traditional classifiers fail to classify the minority class examples correctly because of their bias towards the majority class. Apart from between-class imbalance, within-class imbalance, where classes are composed of different numbers of sub-clusters and these sub-clusters contain different numbers of examples, also deteriorates the performance of the classifier. Many methods have previously been proposed for handling the imbalanced data set problem. These methods can be classified into four categories: data preprocessing, algorithmic methods, cost-based methods, and ensembles of classifiers. Data preprocessing techniques have shown great potential, as they attempt to improve the data distribution rather than the classifier. A data preprocessing technique handles class imbalance either by increasing the minority class examples or by decreasing the majority class examples. Decreasing the majority class examples leads to loss of information, and when the minority class is absolutely rare, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address between-class and within-class imbalance simultaneously. In this paper, we propose a method that handles between-class and within-class imbalance simultaneously for binary classification problems. Removing between-class and within-class imbalance simultaneously eliminates the classifier's bias towards bigger sub-clusters by minimizing the domination of bigger sub-clusters in the total error. The proposed method uses model-based clustering to find sub-clusters, or sub-concepts, in the data set. The number of examples oversampled in each sub-cluster is determined based on the complexity of the sub-cluster. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Lowner-John ellipsoid to increase the accuracy of the classifier. In this study, a neural network is used because it is a classifier in which the total error is minimized, so removing between-class and within-class imbalance simultaneously helps the classifier give equal weight to all sub-clusters irrespective of class. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods in terms of various accuracy measures. Thus the proposed method can serve as a good alternative for handling problem domains such as credit scoring, customer churn prediction, and financial distress prediction that typically involve imbalanced data sets.
Keywords: classification, imbalanced dataset, Lowner-John ellipsoid, model based clustering, oversampling
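A minimal sketch of the oversampling idea, model-based clustering of the minority class followed by sampling extra points from each fitted sub-cluster, is given below. It simplifies the paper's method considerably (the complexity-based allocation, the scatter handling, and the Lowner-John ellipsoid step are omitted, and synthetic points are drawn directly from the fitted Gaussians), so it should be read as an illustration of the general approach rather than the proposed algorithm:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def oversample_minority(X_min, X_maj, max_components=5, random_state=0):
        """Find minority-class sub-clusters with model-based clustering
        (Gaussian mixtures, BIC-selected) and sample synthetic points from
        each component so the two classes end up roughly balanced."""
        rng = np.random.default_rng(random_state)

        # Model-based clustering: pick the number of sub-clusters by BIC.
        models = [GaussianMixture(k, random_state=random_state).fit(X_min)
                  for k in range(1, max_components + 1)]
        gmm = min(models, key=lambda m: m.bic(X_min))

        labels = gmm.predict(X_min)
        counts = np.bincount(labels, minlength=gmm.n_components)
        n_needed = len(X_maj) - len(X_min)
        target = (len(X_min) + n_needed) // gmm.n_components  # per-sub-cluster target size

        synth = []
        for c in range(gmm.n_components):
            n_extra = max(target - counts[c], 0)
            if n_extra == 0:
                continue
            # Sample from this component's fitted Gaussian.
            pts = rng.multivariate_normal(gmm.means_[c], gmm.covariances_[c], size=n_extra)
            synth.append(pts)
        return np.vstack([X_min] + synth) if synth else X_min

    # Tiny synthetic demo: a minority class made of two sub-clusters of unequal size
    rng = np.random.default_rng(1)
    X_maj = rng.normal(0.0, 1.0, size=(200, 2))
    X_min = np.vstack([rng.normal(4.0, 0.3, size=(30, 2)),
                       rng.normal(-4.0, 0.3, size=(10, 2))])
    X_min_balanced = oversample_minority(X_min, X_maj)
    print("minority size before/after:", len(X_min), len(X_min_balanced))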
Procedia PDF Downloads 418
3092 On-Ice Force-Velocity Modeling Technical Considerations
Authors: Dan Geneau, Mary Claire Geneau, Seth Lenetsky, Ming -Chang Tsai, Marc Klimstra
Abstract:
Introduction: Horizontal force-velocity profiling (HFVP) involves modeling an athlete's linear sprint kinematics to estimate valuable maximum force and velocity metrics. This approach to performance modeling has been used in field-based team sports and has recently been introduced to ice hockey as a forward-skating performance assessment. While preliminary data have been collected on ice, the distance constraints of the on-ice test restrict the ability of athletes to reach their maximal velocity, which limits the model's ability to estimate athlete performance effectively. This is especially true of more elite athletes. This report explores whether athletes on ice are able to reach a velocity plateau similar to what has been seen in overground trials. Fourteen male Major Junior ice-hockey players (body mass = 83.87 ± 7.30 kg, height = 188 ± 3.4 cm, age = 18 ± 1.2 years, n = 14) were recruited. For on-ice sprints, participants completed a standardized warm-up consisting of skating and dynamic stretching and a progression of three skating efforts from 50% to 95%. Following the warm-up, participants completed three 45 m on-ice sprints, with three minutes of rest between trials. For overground sprints, participants completed a similar dynamic warm-up, followed by three 40 m overground sprint trials. For each trial (on-ice and overground), a radar gun (Stalker ATS II, Texas, USA) aimed at the participant's waist was used to collect instantaneous velocity. Sprint velocities were modeled using a custom Python (version 3.2) script with a mono-exponential function, similar to previous work. To determine whether on-ice trials achieved a maximum velocity (plateau), the minimum acceleration values of the modeled data at the end of the sprint were compared between on-ice and overground trials using a paired t-test. Significant differences (P < 0.001) between overground and on-ice minimum accelerations were observed. On-ice trials consistently showed higher final acceleration values, indicating that a maximum maintained velocity (plateau) had not been reached. Based on these preliminary findings, it is suggested that reliable HFVP metrics cannot yet be collected from all ice-hockey populations using current methods. Elite male populations were not able to achieve a velocity plateau similar to that seen in overground trials, indicating the absence of a maximum velocity measure. With current velocity and acceleration modeling techniques, which depend on a velocity plateau, these results indicate the potential for error in on-ice HFVP measures. These findings therefore suggest that a greater on-ice sprint distance may be required, or that other velocity modeling techniques are needed in which maximal velocity is not required for a complete profile.
Keywords: ice-hockey, sprint, skating, power
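A mono-exponential fit of a radar velocity trace, and the end-of-sprint acceleration used to judge whether a plateau was reached, can be sketched as follows; the trace below is synthetic, standing in for one 45 m trial, and the model form v(t) = v_max(1 - exp(-t/tau)) is the one commonly used in HFVP work:

    import numpy as np
    from scipy.optimize import curve_fit

    def mono_exp_velocity(t, v_max, tau):
        """Mono-exponential sprint model: v(t) = v_max * (1 - exp(-t / tau))."""
        return v_max * (1.0 - np.exp(-t / tau))

    # Synthetic radar trace standing in for one trial (values are illustrative)
    t = np.linspace(0.1, 6.0, 60)
    true_v = mono_exp_velocity(t, v_max=9.0, tau=1.3)
    v_measured = true_v + np.random.default_rng(0).normal(0, 0.15, t.size)

    (v_max, tau), _ = curve_fit(mono_exp_velocity, t, v_measured, p0=[8.0, 1.0])

    # Modeled acceleration: a(t) = (v_max / tau) * exp(-t / tau)
    a_end = (v_max / tau) * np.exp(-t[-1] / tau)
    print(f"v_max = {v_max:.2f} m/s, tau = {tau:.2f} s, "
          f"acceleration at end of sprint = {a_end:.3f} m/s^2")
    # A clearly non-zero a_end suggests the athlete never reached a velocity plateau.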
Procedia PDF Downloads 104
3091 PLGA Nanoparticles Entrapping dual anti-TB drugs of Amikacin and Moxifloxacin as a Potential Host-Directed Therapy for Multidrug Resistant Tuberculosis
Authors: Sharif Abdelghany
Abstract:
Polymeric nanoparticles have been widely investigated as a controlled release drug delivery platform for the treatment of tuberculosis (TB). These nanoparticles were also readily internalised into macrophages, leading to high intracellular drug concentration. In this study two anti-TB drugs, amikacin and moxifloxacin were encapsulated into PLGA nanoparticles. The novelty of this work appears in: (1) the efficient encapsulation of two hydrophilic second-line anti-TB drugs, and (2) intramacrophage delivery of this synergistic combination potentially for rapid treatment of multi-drug resistant TB (MDR-TB). Two water-oil-water (w/o/w) emulsion strategies were employed in this study: (1) alginate coated PLGA nanoparticles, and (2) alginate entrapped PLGA nanoparticles. The average particle size and polydispersity index (PDI) of the alginate coated PLGA nanoparticles were found to be unfavourably high with values of 640 ± 32 nm and 0.63 ± 0.09, respectively. In contrast, the alginate entrapped PLGA nanoparticles were within the desirable particle size range of 282 - 315 nm and the PDI was 0.08 - 0.16, and therefore were chosen for subsequent studies. Alginate entrapped PLGA nanoparticles yielded a drug loading of over 10 µg/mg powder for amikacin, and more than 5 µg/mg for moxifloxacin and entrapment efficiencies range of approximately 25-31% for moxifloxacin and 51-59% for amikacin. To study macrophage uptake efficiency, the nanoparticles of alginate entrapped nanoparticle formulation were loaded with acridine orange as a marker, seeded to THP-1 derived macrophages and viewed under confocal microscopy. The particles were readily internalised into the macrophages and highly concentrated in the nucleus region. Furthermore, the anti-mycobacterial activity of the drug-loaded particles was evaluated using M. tuberculosis-infected macrophages, which revealed a significant reduction (4 log reduction) of viable bacterial count compared to the untreated group. In conclusion, the amikacin-moxifloxacin alginate entrapped PLGA nanoparticles are promising for further in vivo studies.Keywords: moxifloxacin and amikacin, nanoparticles, multidrug resistant TB, PLGA
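The drug loading and entrapment efficiency figures quoted above follow the standard definitions; since the raw masses are not given in the abstract, the numbers in this short sketch are placeholders only:

    def entrapment_efficiency(drug_entrapped_mg, drug_added_mg):
        """EE (%) = entrapped drug / total drug added * 100 (common definition)."""
        return 100.0 * drug_entrapped_mg / drug_added_mg

    def drug_loading(drug_entrapped_ug, nanoparticle_mass_mg):
        """Drug loading expressed as ug of drug per mg of nanoparticle powder."""
        return drug_entrapped_ug / nanoparticle_mass_mg

    # Illustrative numbers only (not the study's raw data)
    print("EE amikacin     :", entrapment_efficiency(5.5, 10.0), "%")
    print("Loading amikacin:", drug_loading(1050, 100), "ug/mg powder")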
Procedia PDF Downloads 370
3090 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud
Authors: Sharda Kumari, Saiman Shetty
Abstract:
Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation which indicates that a human is present in the vehicle and can disable automation, which is usually done while the trucks are not engaged in highway driving. However, completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people to do their duties in other sectors with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruptions, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Utilizing analytics on CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, thus having significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation
Procedia PDF Downloads 112
3089 Social Problems and Gender Wage Gap Faced by Working Women in Readymade Garment Sector of Pakistan
Authors: Narjis Kahtoon
Abstract:
The issue of wage discrimination on the basis of gender, together with related social problems, has been a significant research topic for several decades. Although many studies have explored the reasons for the persistence of inequality between male and female wages, none has successfully explained away the entire differential. Gender-based wage discrimination and the social problems faced by working women are global issues: despite differences in the political, economic, and social make-up of countries all over the world, gender wage discrimination and social constraints are present everywhere. The aim of this research is to examine gender wage discrimination and social constraints from an international perspective and to determine whether any pattern exists between the cultural dimensions of a country and the male-female remuneration gap in the readymade garment sector of Pakistan. Population growth rate is a significant indicator used to explain population change and plays a crucial role in the economic development of a country. In Pakistan, the readymade garment sector consists of small, medium, and large firms, with an estimated 30 percent of the textile-garment workforce being female. The readymade garment industry is labor-intensive, relies on the skills of individual workers, and provides the highest value addition in the textile sector. In the garment sector, female workers are concentrated in poorly paid, labor-intensive downstream production (readymade garments, linen, towels, etc.), while male workers dominate capital-intensive processes (ginning, spinning, and weaving). Gender wage discrimination and social constraints are a reality in the Pakistani labor market. This research allows us not only to properly measure the size of the gender wage gap and social constraints but also to fully understand their consequences in the readymade garment sector of Pakistan. Furthermore, the research evaluates these measures for the three main clusters of Lahore, Karachi, and Faisalabad. The data contain complete details of male and female workers and supervisors in the readymade garment sector of Pakistan, and these sources of information provide a unique opportunity to reanalyze previous findings in the literature. The regression analysis focuses on the standard 'Mincerian' earnings equation and estimates it separately by gender. The research also applies the cultural dimensions developed by Hofstede (2001) to profile a country's cultural status and compares those cultural dimensions to the wage inequalities. The readymade garment sector is one of the most important sectors of Pakistan, since its products are in huge demand at home and abroad. This research will have a major influence on the measures undertaken to design public policy regarding wage discrimination and social constraints in the readymade garment sector of Pakistan.
Keywords: gender wage differentials, decomposition, garment, cultural
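The estimation strategy named in the abstract, a standard Mincerian earnings equation fitted separately by gender, can be sketched in a few lines. The data frame below is synthetic and stands in for the survey records; variable names and coefficients are illustrative, not the study's results:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the survey data (log wage, schooling and experience in years)
    rng = np.random.default_rng(42)
    n = 500
    df = pd.DataFrame({
        "female": rng.integers(0, 2, n),
        "schooling": rng.integers(0, 14, n),
        "experience": rng.integers(0, 30, n),
    })
    df["log_wage"] = (9.0 + 0.07 * df["schooling"] + 0.04 * df["experience"]
                      - 0.0006 * df["experience"] ** 2
                      - 0.25 * df["female"] + rng.normal(0, 0.3, n))

    # Standard Mincerian earnings equation, estimated separately by gender
    formula = "log_wage ~ schooling + experience + I(experience ** 2)"
    fit_male = smf.ols(formula, data=df[df.female == 0]).fit()
    fit_female = smf.ols(formula, data=df[df.female == 1]).fit()

    print("Return to schooling (male):  ", round(fit_male.params["schooling"], 3))
    print("Return to schooling (female):", round(fit_female.params["schooling"], 3))
    # The remaining gap at mean characteristics could then be decomposed (e.g. Oaxaca-Blinder).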
Procedia PDF Downloads 211
3088 Validating the Cerebral Palsy Quality of Life for Children (CPQOL-Child) Questionnaire for Use in Sri Lanka
Authors: Shyamani Hettiarachchi, Gopi Kitnasamy
Abstract:
Background: The potentially high level of physical need and dependency experienced by children with cerebral palsy could affect the quality of life (QOL) of the child, the caregiver and his/her family. Poor QOL in children with cerebral palsy is associated with the parent-child relationship, limited opportunities for social participation, limited access to healthcare services, psychological well-being and the child's physical functioning. Given that children experiencing disabilities have little access to remedial support with an inequitable service across districts in Sri Lanka, and given the impact of culture and societal stigma, there may be differing viewpoints across respondents. Objectives: The aim of this study was to evaluate the psychometric properties of the Tamil version of the Cerebral Palsy Quality of Life for Children (CPQOL-Child) Questionnaire. Design: An instrument development and validation study. Methods: Forward and backward translations of the CPQOL-Child were undertaken by a team comprised of a physiotherapist, speech and language therapist and two linguists for the primary caregiver form and the child self-report form. As part of a pilot phase, the Tamil version of the CPQOL was completed by 45 primary caregivers with children with cerebral palsy and 15 children with cerebral palsy (GMFCS level 3-4). In addition, the primary caregivers commented on the process of filling in the questionnaire. The psychometric properties of test-retest reliability, internal consistency and construct validity were undertaken. Results: The test-retest reliability and internal consistency were high. A significant association (p < 0.001) was found between limited motor skills and poor QOL. The Cronbach's alpha for the whole questionnaire was at 0.95.Similarities and divergences were found between the two groups of respondents. The child respondents identified limited motor skills as associated with physical well-being and autonomy. Akin to this, the primary caregivers associated the severity of motor function with limitations of physical well-being and autonomy. The trend observed was that QOL was not related to the level of impairment but connected to environmental factors by the child respondents. In addition to this, the main concern among primary caregivers about the child's future and on the child's lack of independence was not fully captured by the QOL questionnaire employed. Conclusions: Although the initial results of the CPQOL questionnaire show high test-retest reliability and internal consistency of the instrument, it does not fully reflect the socio-cultural realities and primary concerns of the caregivers. The current findings highlight the need to take child and caregiver perceptions of QOL into account in clinical practice and research. It strongly indicates the need for culture-specific measures of QOL.Keywords: cerebral palsy, CPQOL, culture, quality of life
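The internal-consistency figure quoted above (Cronbach's alpha of 0.95 for the whole questionnaire) follows the standard formula, which a short sketch with toy scores can make concrete:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x questionnaire-items matrix of scores.
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Toy data: 10 caregivers answering 6 QOL items on a 1-9 scale (illustrative only)
    rng = np.random.default_rng(3)
    base = rng.integers(3, 8, size=(10, 1))
    scores = np.clip(base + rng.integers(-1, 2, size=(10, 6)), 1, 9)
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")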
Procedia PDF Downloads 345
3087 The Rapid Industrialization Model
Authors: Fredrick Etyang
Abstract:
This paper presents a Rapid Industrialization Model (RIM) designed to support existing industrialization policies, strategies and industrial development plans at National, Regional and Constituent level in Africa. The model will reinforce efforts to attainment of inclusive and sustainable industrialization of Africa by state and non-state actors. The overall objective of this model is to serve as a framework for rapid industrialization in developing economies and the specific objectives range from supporting rapid industrialization development to promoting a structural change in the economy, a balanced regional industrial growth, achievement of local, regional and international competitiveness in areas of clear comparative advantage in industrial exports and ultimately, the RIM will serve as a step-by-step guideline for the industrialization of African Economies. This model is a product of a scientific research process underpinned by desk research through the review of African countries development plans, strategies, datasets, industrialization efforts and consultation with key informants. The rigorous research process unearthed multi-directional and renewed efforts towards industrialization of Africa premised on collective commitment of individual states, regional economic communities and the African union commission among other strategic stakeholders. It was further, established that the inputs into industrialization of Africa outshine the levels of industrial development on the continent. The RIM comes in handy to serve as step-by-step framework for African countries to follow in their industrial development efforts of transforming inputs into tangible outputs and outcomes in the short, intermediate and long-run. This model postulates three stages of industrialization and three phases toward rapid industrialization of African economies, the model is simple to understand, easily implementable and contextualizable with high return on investment for each unit invested into industrialization supported by the model. Therefore, effective implementation of the model will result into inclusive and sustainable rapid industrialization of Africa.Keywords: economic development, industrialization, economic efficiency, exports and imports
Procedia PDF Downloads 88
3086 Producing Outdoor Design Conditions based on the Dependency between Meteorological Elements: Copula Approach
Authors: Zhichao Jiao, Craig Farnham, Jihui Yuan, Kazuo Emura
Abstract:
It is common to use outdoor design weather data to select the air-conditioning capacity at the building design stage. Outdoor design weather data usually comprise multiple meteorological elements given separately for a 24-hour period, but the dependency between the elements is not well considered, which may lead to overestimation when selecting air-conditioning capacity. Considering the dependency between air temperature and global solar radiation, we used the copula approach to model the joint distribution of those two weather elements and suggest a new method of selecting more credible outdoor design conditions based on the simultaneous occurrence probability of air temperature and global solar radiation. In this paper, hourly weather data for the 10-year period from 2001 to 2010 in Osaka, Japan, were used to analyze the dependency structure and joint distribution; the results show that the Joe-Frank copula fits almost all of the hourly data. By calculating the simultaneous occurrence probability and the common exceedance probability of air temperature and global solar radiation, the results show that the maximum difference in the design air temperature and global solar radiation of the day is about 2 degrees Celsius and 30 W/m², respectively.
Keywords: energy conservation, design weather database, HVAC, copula approach
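The practical payoff of modeling the dependence is the joint (simultaneous) exceedance probability of the two elements, which a fitted copula provides analytically. As a simple empirical illustration of the same quantity, computed directly from hourly records rather than from a fitted Joe-Frank copula, consider the following sketch with synthetic data standing in for the Osaka observations:

    import numpy as np

    # Synthetic hourly records: correlated air temperature (deg C) and global
    # solar radiation (W/m2); purely illustrative, not the 2001-2010 Osaka data
    rng = np.random.default_rng(7)
    n = 10 * 365 * 12                    # roughly 10 years of daytime hours
    z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
    temperature = 20 + 8 * z[:, 0]
    radiation = np.clip(400 + 250 * z[:, 1], 0, None)

    def joint_exceedance(temp_level, rad_level):
        """Empirical probability that temperature AND radiation exceed the given
        levels in the same hour, i.e. the joint survival probability that a
        copula model would reproduce analytically."""
        both = (temperature > temp_level) & (radiation > rad_level)
        return both.mean()

    # Compare the naive independence assumption with the observed joint behavior
    t99 = np.quantile(temperature, 0.99)
    r99 = np.quantile(radiation, 0.99)
    print("independent estimate:", 0.01 * 0.01)
    print("empirical joint exceedance:", joint_exceedance(t99, r99))

Because the two elements are dependent, the empirical joint exceedance differs markedly from the independence product, which is why design conditions chosen element by element can be misleading.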
Procedia PDF Downloads 272
3085 Dietary Nutrient Consumption Patterns by the Pregnant Mother in Dhaka City, Bangladesh
Authors: Kazi Muhammad Rezaul Karim, Tasmia Tasnim
Abstract:
Introduction: Pregnancy is a condition of higher nutrient requirement but in developing countries like Bangladesh most of the pregnant women can not meet their nutrient requirement and sometimes they are neglected in the family. The purpose of the study was to assess the nutritional status and dietary nutrient intake by the pregnant women, in Dhaka city, Bangladesh. Methods: The study population comprised of pregnant women from urban or semi-urban, aged between 18 to 35 and free of pregnancy related complication and other diseases. Under a cross-sectional design, 30 healthy non-pregnant as well as 130 pregnant women, at 3 different trimesters of pregnancy were assessed. A questionnaire was developed to obtain demographic, socio-economic, anthropometric, drug and medical history. Three day consecutive 24-hour food recalls were used to assess food intake and then converted to nutrient intake. Results: The average BMI of the nonpregnant women was 22.89 ± 3.4 kg/m2 and that of pregnant women was 23.52 ± 3.71 kg/m2. The mean dietary nutrient intake of dietary fiber, calorie, protein, fat, carbohydrate, calcium, iron, thiamine, riboflavin, vitamin C, Vitamin A, folate, vitamin B6 and Vitamin B12 of the pregnant mothers were 4.38 g, 1619 kcal, 60.05 g, 30.38 g, 268.79 g, 537.21 mg, 21.53 mg, 1.15 mg, 0.94 mg, 97.36 mg, 647.6 µg, 153.93 µg, 1.41 mg and 4.09 µg respectively. Most of pregnant women (more than 90%) can not meet their energy, calcium and folate requirements. Conclusion: Most of the pregnant mother in Bangladesh can not meet their dietary requirements during pregnancy.Keywords: pregnancy, dietary nutrient, nutritional status, BMI
Procedia PDF Downloads 442
3084 Water-Repellent Finishing on Cotton Fabric by SF₆ Plasma
Authors: We'aam Alali, Ziad Saffour, Saker Saloum
Abstract:
Low-pressure, sulfur hexafluoride (SF₆) remote radio-frequency (RF) plasma, ignited in a hollow cathode discharge (HCD-L300) plasma system, has been shown to be a powerful method in cotton fabric finishing to achieve water-repellent property. This plasma was ignited at an SF6 flow rate of (200 cm), low pressure (0.5 mbar), and radio frequency (13.56 MHz) with a power of (300 W). The contact angle has been measured as a function of the plasma exposure period using the water contact angle measuring device (WCA), and the changes in the morphology, chemical structure, and mechanical properties as tensile strength and elongation at the break of the fabric have also been investigated using the scanning electron microscope (SEM), energy-dispersive X-ray spectroscopy (EDX), attenuated total reflectance Fourier transform Infrared spectroscopy (ATR-FTIR), and tensile test device, respectively. In addition, weight loss of the fabric and the fastness of washing have been studied. It was found that the exposure period of the fabric to the plasma is an important parameter. Moreover, a good water-repellent cotton fabric can be obtained by treating it with SF₆ plasma for a short time (1 min) without degrading its mechanical properties. Regarding the modified morphology of the cotton fabric, it was found that grooves were formed on the surface of the fibers after treatment. Chemically, the fluorine atoms were attached to the surface of the fibers.Keywords: cotton fabric, SEM, SF₆ plasma, water-repellency
Procedia PDF Downloads 83
3083 Visualization of Wave Propagation in Monocoupled System with Effective Negative Stiffness, Effective Negative Mass, and Inertial Amplifier
Authors: Abhigna Bhatt, Arnab Banerjee
Abstract:
A periodic system with only a single coupling degree of freedom is called a monocoupled system. In monocoupled systems, a mass-in-mass mechanism generates effective negative mass, masses connected with rigid links generate inertial amplification, and a spring-mass connected with a rigid link generates effective negative stiffness. In this paper, a representative unit cell combining all three mechanisms is introduced. The dynamic stiffness matrix of the unit cell is constructed, and the dispersion relation is obtained by applying the Bloch theorem. The frequency response function is also calculated for a finite chain of periodic unit cells. Moreover, an input displacement signal is applied to the finite periodic structure, and the inverse Fourier transform is used to visualize the wave propagation in the time domain. This visualization explains the sudden attenuation in the metamaterial due to energy dissipation by the embedded resonator at its resonance frequency. The visualization of wave propagation is found to be necessary for understanding the physics behind the attenuation characteristics of the system.
Keywords: mono coupled system, negative effective mass, negative effective stiffness, inertial amplifier, Fourier transform
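The unit cell in the paper combines three mechanisms, so as a much smaller illustration of how a Bloch dispersion relation is obtained for a monocoupled chain, the sketch below treats only the classic mass-in-mass (negative effective mass) case, where the Bloch condition reduces to cos(qL) = 1 - m_eff(omega) * omega^2 / (2 k1) with m_eff(omega) = m1 + m2 * omega2^2 / (omega2^2 - omega^2). Parameter values are illustrative:

    import numpy as np

    # Mass-in-mass chain parameters (illustrative values)
    m1, m2 = 1.0, 0.5        # outer and inner (resonator) masses, kg
    k1, k2 = 1000.0, 200.0   # lattice spring and resonator spring, N/m
    w2 = np.sqrt(k2 / m2)    # local resonance frequency, rad/s

    omega = np.linspace(1.0, 80.0, 2000)
    m_eff = m1 + m2 * w2**2 / (w2**2 - omega**2)      # effective (possibly negative) mass
    cos_qL = 1.0 - m_eff * omega**2 / (2.0 * k1)      # Bloch condition for the chain

    # |cos(qL)| <= 1 -> propagating Bloch wave; otherwise the wave is attenuated (band gap)
    propagating = np.abs(cos_qL) <= 1.0
    band_gap_edges = omega[np.where(np.diff(propagating.astype(int)) != 0)]
    print("approximate band-gap edge frequencies (rad/s):", np.round(band_gap_edges, 1))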
Procedia PDF Downloads 129
3082 Guidelines for Enhancing the Learning Environment by the Integration of Design Flexibility and Immersive Technology: The Case of the British University in Egypt’s Classrooms
Authors: Eman Ayman, Gehan Nagy
Abstract:
The learning environment has four main parameters that affect its efficiency: pedagogy, user, technology, and space. According to Morrone, enhancing these parameters so that they remain adaptable to future developments is essential, and educational organizations will need to develop their learning spaces. Flexibility of design and immersive technology could be used as tools for this development. When flexible design concepts are used, learning spaces are created that can accommodate a variety of teaching and learning activities. To accommodate the various needs and interests of students, these learning spaces are easily reconfigurable and customizable. The immersive learning opportunities offered by technologies such as virtual reality, augmented reality, and interactive displays, on the other hand, extend beyond the confines of the traditional classroom. These technological advancements could improve learning. This thesis highlights the lack of innovative, flexible learning spaces in educational institutions. It aims to develop guidelines for enhancing the learning environment through the integration of flexible design and immersive technology. The research uses a mixed-methods approach, both qualitative and quantitative: the qualitative part draws on the literature review and case study analysis, while the quantitative part is based on the results of applied studies of the effectiveness of redesigning a learning space from its traditional current state into a flexible, technological, contemporary space that is adaptable to many changes and educational needs. The research findings establish the importance of flexibility in the interior design of learning spaces, as it enhances space optimization and the capability to accommodate change, and they record the significant contribution of immersive technology in assisting the design process. The findings are summarized by the questionnaire results and a comparative analysis, which form the last step in finalizing the guidelines.
Keywords: flexibility, learning space, immersive technology, learning environment, interior design
Procedia PDF Downloads 98
3081 A Comparative Human Rights Analysis of the Securitization of Migration in the Fight against Terrorism in Europe: An Evaluation of Belgium
Authors: Louise Reyntjens
Abstract:
The last quarter of the twentieth century was characterized by the emergence of a new kind of terrorism: religiously-inspired terrorism. Islam finds itself at the heart of this new wave, considering the number of international attacks committed by Islamic-inspired perpetrators. With religiously inspired terrorism as an operating framework, governments increasingly rely on immigration law to counter such terrorism. Immigration law seems particularly useful because its core task consists of keeping ‘unwanted’ people out. Islamic terrorists more often than not have an immigrant background and will be subject to immigration law. As a result, immigration law becomes more and more ‘securitized’. The European migration crisis has reinforced this trend. The research explores the human rights consequences of immigration law’s securitization in Europe. For this, the author selected four European countries for a comparative study: Belgium, France, the United Kingdom and Sweden. All these countries face similar social and security issues but respond very differently to them. The United Kingdom positions itself on the repressive side of the spectrum. Sweden on the other hand also introduced restrictions to its immigration policy but remains on the tolerant side of the spectrum. Belgium and France are situated in between. This contribution evaluates the situation in Belgium. Through a series of legislative changes, the Belgian parliament (i) greatly expanded the possibilities of expelling foreign nationals for (vaguely defined) reasons of ‘national security’; (ii) abolished almost all procedural protection associated with this decision (iii) broadened, as an extra security measure, the possibility of depriving individuals condemned of terrorism of their Belgian nationality. Measures such as these are obviously problematic from a human rights perspective; they jeopardize the principle of legality, the presumption of innocence, the right to protection of private and family life and the prohibition on torture. Moreover, this contribution also raises questions about the efficacy of immigration law’s suitability as a counterterrorism instrument. Is it a legitimate step, considering the type of terrorism we face today? Or, is it merely a strategic move, considering the broader maneuvering space immigration law offers and the lack of political resistance governments receive when infringing the rights of foreigners? Even more so, figures demonstrate that today’s terrorist threat does not necessarily stem from outside our borders. Does immigration law then still absorb - if it has ever done so (completely) - the threat? The study’s goal is to critically assess, from a human rights perspective, the counterterrorism strategies European governments have adopted. As most governments adopt a variation of the same core concepts, the study’s findings will hold true even beyond the four countries addressed.Keywords: Belgium, counterterrorism strategies, human rights, immigration law
Procedia PDF Downloads 107
3080 Conventional and Computational Investigation of the Synthesized Organotin(IV) Complexes Derived from o-Vanillin and 3-Nitro-o-Phenylenediamine
Authors: Harminder Kaur, Manpreet Kaur, Akanksha Kapila, Reenu
Abstract:
A Schiff base with the general formula H₂L was derived from the condensation of o-vanillin and 3-nitro-o-phenylenediamine. This Schiff base was used for the synthesis of organotin(IV) complexes with the general formula R₂SnL [R = phenyl or n-octyl] using equimolar quantities. Elemental analysis, UV-Vis, FTIR, and multinuclear NMR spectroscopy (¹H, ¹³C, and ¹¹⁹Sn) were carried out for the characterization of the synthesized complexes. These complexes were coloured and soluble in polar solvents. Computational studies were performed to obtain details of the geometry and electronic structure of the ligand as well as the complexes. The geometries of the ligand and the complexes were optimized at the Density Functional Theory level with B3LYP/6-311G(d,p) and B3LYP/MPW1PW91, respectively, followed by vibrational frequency analysis using Gaussian 09. The observed ¹¹⁹Sn NMR chemical shifts of one of the synthesized complexes indicated a tetrahedral geometry around the tin atom, which is also confirmed by DFT. The HOMO-LUMO energy distribution was calculated. FTIR, ¹H NMR and ¹³C NMR spectra were also obtained theoretically using DFT. Further, IRC calculations were employed to determine the transition state for the reaction and to obtain theoretical information about the reaction pathway. Moreover, molecular docking studies can be explored to assess the anticancer activity of the newly synthesized organotin(IV) complexes.Keywords: DFT, molecular docking, organotin(IV) complexes, o-vanillin, 3-nitro-o-phenylenediamine
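As a minimal sketch of the kind of calculation setup described in this abstract (not taken from the paper), the following script writes a Gaussian 09 input file requesting a B3LYP/6-311G(d,p) geometry optimization followed by a frequency calculation. The molecule and coordinates are placeholders for illustration only, not the actual Schiff base ligand or tin complex.

```python
# Minimal sketch: generate a Gaussian 09 input file for a B3LYP/6-311G(d,p)
# geometry optimization followed by a vibrational frequency calculation.
# The demo geometry below is a placeholder, not the paper's ligand.

def write_gjf(filename, atoms, charge=0, multiplicity=1,
              route="# opt freq b3lyp/6-311g(d,p)", title="Ligand optimization"):
    """Write a simple Gaussian input file from Cartesian coordinates."""
    lines = [route, "", title, "", f"{charge} {multiplicity}"]
    for element, x, y, z in atoms:
        lines.append(f"{element:2s} {x:12.6f} {y:12.6f} {z:12.6f}")
    lines.append("")  # Gaussian inputs end with a blank line
    with open(filename, "w") as f:
        f.write("\n".join(lines))

# Placeholder water geometry, used only to illustrate the file format.
demo_atoms = [("O", 0.000, 0.000, 0.117),
              ("H", 0.000, 0.757, -0.470),
              ("H", 0.000, -0.757, -0.470)]
write_gjf("ligand_opt.gjf", demo_atoms)
```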
Procedia PDF Downloads 1633079 Nuclear Terrorism Decision Making: A Comparative Study of South Asian Nuclear Weapons States
Authors: Muhammad Jawad Hashmi
Abstract:
The idea of nuclear terrorism is as old as nuclear weapons, but the global concerns about the likelihood of nuclear terrorism are uncertain. Post-9/11 trends manifest that terrorists are believers in mass casualties. Innovation in terrorists’ tactics, sophisticated weaponry, vulnerability, theft and smuggling of nuclear/radiological material, and connections between terrorists, the black market and rogue regimes signal the seriousness of upcoming challenges as well as the global trend of “terror-transnationalism.” Furthermore, the International Atomic Energy Agency’s database has recorded 2734 incidents of misuse, unauthorized possession, and trafficking of nuclear material, among others. Since this data also includes incidents from South Asia, there is every reason to expect that such illicit activities may increase in the future, mainly due to the expansion of the nuclear industry in South Asia. Moreover, due to such mishaps, the region is vulnerable to threats of nuclear terrorism. This is also a reason why the region is in the limelight, along with issues such as rapidly growing nuclear arsenals, nuclear safety and security, terrorism and political instability. Against this backdrop, this study aims to investigate the prevailing threats and challenges in South Asia vis-à-vis nuclear safety and security. A comparative analysis of the overall capabilities will be conducted to identify areas of cooperation to eliminate the probability of nuclear/radiological terrorism in the region.Keywords: nuclear terrorism, safety, security, South Asia, India, Pakistan
Procedia PDF Downloads 3593078 Applying Participatory Design for the Reuse of Deserted Community Spaces
Authors: Wei-Chieh Yeh, Yung-Tang Shen
Abstract:
The concept of community building started in 1994 in Taiwan. After years of development, it fostered the notion of active local resident participation in community issues as co-operators, instead of minions. Participatory design gives participants more control in the decision-making process, helps to reduce the friction caused by arguments and assists in bringing different parties to consensus. This results in an increase in the efficiency of projects run in the community. Therefore, the participation of local residents is key to the success of community building. This study applied participatory design to develop plans for the reuse of deserted spaces in the community, from the first stage of brainstorming for design ideas, through making creative models to be employed later, to the final stage of construction. After conducting a series of participatory design activities, it aimed to integrate the different opinions of residents, develop a sense of belonging and reach a consensus. Besides this, it also aimed at building the residents’ awareness of their responsibilities for the environment and related issues of sustainable development. By reviewing relevant literature and understanding the history of related studies, the study formulated a theory. It took the “2012-2014 Changhua County Community Planner Counseling Program” as a case study to investigate the implementation process of participatory design. Research data were collected through document analysis, participant observation and in-depth interviews. After examining the three elements of “Design Participation”, “Construction Participation”, and “Follow-up Maintenance Participation” in the case, the study arrived at a promising conclusion: maintenance works were carried out better compared to common public works, and maintenance costs were lower. Moreover, the works that residents were involved in were more creative. Most importantly, the community characteristics could be easily recognized.Keywords: participatory design, deserted space, community building, reuse
Procedia PDF Downloads 3753077 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis
Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha
Abstract:
Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines employed by the Brazilian Armed Forces. This work examined the possibility of extending their useful life (currently 10 years) by performing kinetic-chemical analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. Thermal analysis via DSC was performed in triplicate at three heating rates (5 ºC, 10 ºC, and 15 ºC) on a rocket motor with 11 years of shelf life, and the Arrhenius equation was used to obtain its activation energy via the Ozawa and Kissinger kinetic methods, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of the combustion chamber internal pressure were also acquired for 08 rocket engines with 11 years of shelf life, for comparison with the engine start-up data.Keywords: shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust
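To make the kinetic step concrete, the sketch below shows how an activation energy could be estimated from DSC exotherm peak temperatures at several heating rates using the Kissinger and Ozawa (Flynn-Wall-Ozawa) methods. The peak temperatures are hypothetical illustrative values, not the measured data from these motors.

```python
# Hedged sketch: estimate activation energy Ea from DSC peak temperatures at
# several heating rates via the Kissinger and Ozawa methods (made-up data).
import numpy as np

R = 8.314  # J/(mol*K), universal gas constant

# Heating rates (K/min) and hypothetical decomposition peak temperatures (K)
beta = np.array([5.0, 10.0, 15.0])
Tp = np.array([588.0, 599.0, 606.0])

# Kissinger: ln(beta/Tp^2) = ln(A*R/Ea) - Ea/(R*Tp)  ->  slope = -Ea/R
slope_k, _ = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
Ea_kissinger = -slope_k * R

# Ozawa (Flynn-Wall-Ozawa approximation): log10(beta) ~ const - 0.4567*Ea/(R*Tp)
slope_o, _ = np.polyfit(1.0 / Tp, np.log10(beta), 1)
Ea_ozawa = -slope_o * R / 0.4567

print(f"Kissinger Ea: {Ea_kissinger / 1000:.1f} kJ/mol")
print(f"Ozawa Ea:     {Ea_ozawa / 1000:.1f} kJ/mol")
```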
Procedia PDF Downloads 1293076 Preliminary Study of Hand Gesture Classification in Upper-Limb Prosthetics Using Machine Learning with EMG Signals
Authors: Linghui Meng, James Atlas, Deborah Munro
Abstract:
There is an increasing demand for prosthetics capable of mimicking natural limb movements and hand gestures, but precise movement control of prosthetics using only electrode signals continues to be challenging. This study considers the implementation of machine learning as a means of improving accuracy and presents an initial investigation into hand gesture recognition using models based on electromyographic (EMG) signals. EMG signals, which capture muscle activity, are used as inputs to machine learning algorithms to improve prosthetic control accuracy, functionality and adaptivity. Using logistic regression, a machine learning classifier, this study evaluates the accuracy of classifying two hand gestures from the publicly available Ninapro dataset using two time-series feature extraction algorithms: Time Series Feature Extraction (TSFE) and Convolutional Neural Networks (CNNs). Trials were conducted using varying numbers of EMG channels, from one to eight, to determine the impact of channel quantity on classification accuracy. The results suggest that although both algorithms can successfully distinguish between hand gesture EMG signals, CNNs outperform TSFE in extracting useful information in terms of both accuracy and computational efficiency. In addition, although more channels of EMG signals provide more useful information, they also require more complex and computationally intensive feature extractors and consequently do not perform as well as lower numbers of channels. The findings also underscore the potential of machine learning techniques in developing more effective and adaptive prosthetic control systems.Keywords: EMG, machine learning, prosthetic control, electromyographic prosthetics, hand gesture classification, CNN, computational neural networks, TSFE, time series feature extraction, channel count, logistic regression, Ninapro, classifiers
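For illustration only, the following sketch classifies two gestures from windowed EMG with simple time-domain features and logistic regression. Synthetic signals stand in for the Ninapro recordings, and the features (mean absolute value, RMS, waveform length) are generic choices, not the paper's exact TSFE or CNN pipelines.

```python
# Hedged sketch: two-gesture EMG classification with hand-crafted time-domain
# features and logistic regression, on synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_windows, window_len, n_channels = 200, 400, 4  # e.g. 200 ms windows at 2 kHz

def make_windows(scale):
    # Synthetic EMG-like noise whose amplitude differs between the two gestures
    return rng.normal(0.0, scale, size=(n_windows, n_channels, window_len))

def features(windows):
    mav = np.mean(np.abs(windows), axis=2)                 # mean absolute value
    rms = np.sqrt(np.mean(windows**2, axis=2))             # root mean square
    wl = np.sum(np.abs(np.diff(windows, axis=2)), axis=2)  # waveform length
    return np.concatenate([mav, rms, wl], axis=1)

X = np.vstack([features(make_windows(0.5)), features(make_windows(1.0))])
y = np.concatenate([np.zeros(n_windows), np.ones(n_windows)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```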
Procedia PDF Downloads 393075 Household Climate-Resilience Index Development for the Health Sector in Tanzania: Use of Demographic and Health Surveys Data Linked with Remote Sensing
Authors: Heribert R. Kaijage, Samuel N. A. Codjoe, Simon H. D. Mamuya, Mangi J. Ezekiel
Abstract:
There is strong evidence that the climate has changed significantly, affecting various sectors including public health. The recommended feasible solution is adopting development trajectories which combine both mitigation and adaptation measures to improve resilience pathways. This approach demands consideration of the complex interactions between climate and social-ecological systems. While other sectors such as agriculture and water have developed climate resilience indices, the public health sector in Tanzania is still lagging behind. The aim of this study was to find out how Demographic and Health Surveys (DHS) linked with remote sensing (RS) technology and meteorological information can be used as tools to inform climate-resilient development and evaluation for the health sector. A methodological review was conducted in which a number of studies were content-analyzed to identify appropriate indicators and indices of household climate resilience and approaches for their integration. These indicators were critically reviewed, listed, filtered and their sources determined. Preliminary identification and ranking of indicators were conducted using a participatory approach of pairwise weighting by selected national stakeholders from meetings/conferences on human health and climate change sciences in Tanzania. DHS datasets were retrieved from the MEASURE Evaluation project, processed and critically analyzed for possible climate change indicators. Other sources for indicators of climate change exposure were also identified. For the purpose of preliminary reporting, the operationalization of selected indicators was discussed to produce a methodological approach to be used in a comparative resilience analysis. It was found that the household climate resilience index depends on the combination of three indices, namely Household Adaptive and Mitigation Capacity (HC), Household Health Sensitivity (HHS) and Household Exposure Status (HES). It was also found that DHS data alone cannot support resilience evaluation unless integrated with other data sources, notably flooding data as a measure of vulnerability, remotely sensed Normalized Difference Vegetation Index (NDVI) imagery and meteorological data (deviation from rainfall patterns). It can be concluded that if these indices retrieved from DHS datasets are computed and scientifically integrated, a single climate resilience index can be produced and resilience maps can be generated at different spatial and temporal scales to enhance targeted interventions for climate-resilient development and evaluation. However, further studies are needed to test the sensitivity of the index in comparative resilience analyses among selected regions.Keywords: climate change, resilience, remote sensing, demographic and health surveys
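Because the abstract does not state how the three sub-indices are aggregated, the sketch below only illustrates one plausible way to combine HC, HHS and HES into a single household index after min-max normalisation; the additive form, the sign convention, the weights and the data are all assumptions made for illustration.

```python
# Hedged sketch: one possible aggregation of HC, HHS and HES into a single
# household climate resilience index (formula and data assumed, not the paper's).
import numpy as np

def minmax(x):
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def resilience_index(hc, hhs, hes, w=(1.0, 1.0, 1.0)):
    """Assumed convention: higher adaptive/mitigation capacity raises
    resilience; higher health sensitivity and higher exposure lower it."""
    hc_n, hhs_n, hes_n = minmax(hc), minmax(hhs), minmax(hes)
    score = w[0] * hc_n - w[1] * hhs_n - w[2] * hes_n
    return minmax(score)  # rescale to 0..1 for mapping

# Hypothetical values for five households
hc  = [0.2, 0.6, 0.9, 0.4, 0.7]   # adaptive and mitigation capacity
hhs = [0.8, 0.3, 0.2, 0.5, 0.4]   # health sensitivity
hes = [0.9, 0.5, 0.1, 0.6, 0.3]   # exposure status (e.g. flooding, NDVI deficit)
print(resilience_index(hc, hhs, hes))
```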
Procedia PDF Downloads 167