Search results for: propulsive efficiency
1341 Effects of Neem (Azadirachta indica A. Juss) Kernel Inclusion in Broiler Diet on Growth Performance, Organ Weight and Gut Morphometry
Authors: Olatundun Bukola Ezekiel, Adejumo Olusoji
Abstract:
A feeding trial was conducted with 100 two-week-old broiler chickens to evaluate the influence of including neem kernel in broiler diets at 0, 2.5, 5, 7.5 and 10% (used to replace an equal quantity of maize) on their performance, organ weight and gut morphometry. The birds were randomly allotted to five dietary treatments, each treatment having four replicates consisting of five broilers, in a completely randomized design. The diets were formulated to be iso-nitrogenous (23% CP). Weekly feed intake and changes in body weight were calculated and feed efficiency determined. At the end of the 28-day feeding trial, four broilers per treatment were selected and sacrificed for carcass evaluation. Results were subjected to statistical analysis using the analysis of variance procedures of Statistical Analysis Software. The treatment means were presented with group standard errors of means and, where significant, were compared using the Duncan multiple range test of the same software. The results showed that broilers fed 2.5% neem kernel inclusion diets had growth performance statistically comparable to those fed the control diet. Birds on 5, 7.5 and 10% neem kernel diets showed a significant (P<0.05) increase in relative weight of the liver. The absolute weight of the spleen also increased significantly (P<0.05) in birds on the 10% neem kernel diet. Diets with more than 5% neem kernel gave a significant (P<0.05) increase in the relative weight of the kidney. The length of the small intestine increased significantly in birds fed 7.5 and 10% neem kernel diets. Significant differences (P<0.05) did not occur in the length of the large intestine or the right and left caeca. It is recommended that neem kernel can be included at up to 2.5% in broiler chicken diets without any deleterious effects on the performance and physiological status of the birds.
Keywords: broiler chicken, growth performance, gut morphometry, neem kernel, organ weight
1340 Synthesis, Characterization, Optical and Photophysical Properties of Pyrene-Labeled Ruthenium(II) Trisbipyridine Complex Cored Dendrimers
Authors: Mireille Vonlanthen, Pasquale Porcu, Ernesto Rivera
Abstract:
Dendritic macromolecules present unique physical and chemical properties. One of them is the ability to transfer energy from a donor moiety introduced at the periphery to an acceptor moiety at the core, mimicking the antenna effect of the process of photosynthesis. The mechanism of energy transfer is based on Förster resonance energy transfer and requires some overlap between the emission spectrum of the donor and the absorption spectrum of the acceptor. Since it requires a coupling of transition dipoles but no overlap of the physical wavefunctions, energy transfer by the Förster mechanism can occur over quite long distances, from 1 nm to a maximum of about 10 nm. However, the efficiency of the transfer depends strongly on distance. The Förster radius is the distance at which 50% of the donor's emission is deactivated by FRET. In this work, we synthesized and characterized a novel series of dendrimers bearing pyrene moieties at the periphery and a Ru(II) complex at the core. The optical and photophysical properties of these compounds were studied by absorption and fluorescence spectroscopy. Pyrene is a well-studied chromophore that has the particularity of presenting monomer as well as excimer fluorescence emission. The coordination compounds of Ru(II) are red emitters with low quantum yield and long excited-state lifetime. We observed an efficient singlet-to-singlet energy transfer in such constructs. Moreover, it is known that the energy of the MLCT emitting state of Ru(II) can be tuned to become almost isoenergetic with respect to the triplet state of pyrene, leading to an extended phosphorescence lifetime. Using dendrimers bearing pyrene moieties as ligands for Ru(II), we could combine the antenna effect of dendrimers, as well as their protection against quenching by dioxygen, with a lifetime increase due to triplet-triplet equilibrium.
Keywords: dendritic molecules, energy transfer, pyrene, ru-trisbipyridine complex
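For reference, the distance dependence described in the abstract above is commonly expressed through the standard Förster relation (not given explicitly by the authors), where E is the transfer efficiency, r the donor-acceptor distance and R_0 the Förster radius at which half of the donor emission is quenched:

```latex
E = \frac{1}{1 + (r/R_0)^{6}} = \frac{R_0^{6}}{R_0^{6} + r^{6}}, \qquad E(r = R_0) = 0.5
```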
1339 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors which influence crash injury risk (CIR); however, uncertainties inherent to the selected variables have been neglected. A review of existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications when comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of broad variations in variable selection in CIR analysis. The objective of this study is to demonstrate the variance in variable selection and classification when modeling injury risk involving occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications reported over the past 21 years, the analytical review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, the vehicle age/model year was the most selected explanatory variable, used by 41% of the literature studies. For those studies that included a speed risk factor in their analyses, the majority (64%) used the legal speed limit data as a 'proxy' for vehicle speed at the moment of a crash, imposing limitations for CIR analysis and modeling. Despite the proven efficiency of airbags in minimizing injury impact following a crash, only 22% of studies included airbag deployment data. A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and identify opportunities for improvements when performing future studies in the field of road injuries.
Keywords: crash, exploratory, injury, risk, variables, vehicle
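As an illustration of the kind of injury-risk modeling surveyed above, the sketch below fits a binary logistic regression on synthetic crash records; the variable names (vehicle_age, speed_limit, airbag_deployed, occupant_age) are placeholders for the predictor categories discussed, not the variables of any particular reviewed study.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic crash records: one row per light-vehicle occupant (illustration only).
rng = np.random.default_rng(0)
n = 500
crashes = pd.DataFrame({
    "vehicle_age": rng.integers(0, 20, n),        # years since model year
    "speed_limit": rng.choice([50, 80, 100], n),  # proxy for impact speed (km/h)
    "airbag_deployed": rng.integers(0, 2, n),
    "occupant_age": rng.integers(18, 90, n),
})
injury = (rng.random(n) < 0.2).astype(int)        # severe-injury outcome (synthetic)

X_train, X_test, y_train, y_test = train_test_split(crashes, injury, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Odds ratios show how each explanatory variable shifts injury risk in the fitted model.
print(dict(zip(crashes.columns, np.exp(model.coef_[0]).round(2))))
print("hold-out accuracy:", model.score(X_test, y_test))
```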
1338 Metal-Based Deep Eutectic Solvents for Extractive Desulfurization of Fuels: Analysis from Molecular Dynamics Simulations
Authors: Aibek Kukpayev, Dhawal Shah
Abstract:
Combustion of sour fuels containing a high amount of sulfur leads to the formation of sulfur oxides, which adversely harm the environment and have a negative impact on human health. Considering this, several legislations have been imposed to bring down the sulfur content in fuel to less than 10 ppm. In recent years, novel deep eutectic solvents (DESs) have been developed to achieve deep desulfurization, particularly to extract thiophenic compounds from liquid fuels. These novel DESs, considered analogous to ionic liquids, are green, eco-friendly, inexpensive, and sustainable. We herein, using molecular dynamics simulations, analyze the interactions of metal-based DESs with a model oil consisting of thiophenic compounds. The DESs used consist of polyethylene glycol (PEG-200) as a hydrogen bond donor, choline chloride (ChCl) or tetrabutyl ammonium chloride (TBAC) as a hydrogen bond acceptor, and cobalt chloride (CoCl₂) as the metal salt. In particular, the combination of ChCl:PEG-200:CoCl₂ at a ratio of 1:2:1 and the combination of TBAC:PEG-200:CoCl₂ at a ratio of 1:2:0.25 were simulated, separately, with a model oil consisting of octane and thiophenes at 25 °C and 1 bar. The results of the molecular dynamics simulations were analyzed in terms of interaction energies between the different components. The simulations revealed a stronger interaction between DESs and thiophenes as compared with octane and thiophenes, suggestive of an efficient desulfurization process. In addition, our analysis suggests that the choice of hydrogen bond acceptor strongly influences the efficiency of the desulfurization process. Taken together, the results also show the importance of the metal ion in the process, although present in a small amount, and the role of the polymer in desulfurization of the model fuel.
Keywords: deep eutectic solvents, desulfurization, molecular dynamics simulations, thiophenes
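In classical force-field molecular dynamics, the group-group interaction energies referred to above are typically decomposed into short-range Lennard-Jones and electrostatic contributions summed over atom pairs of the two groups; the exact decomposition used by the authors is not stated in the abstract, so the expression below is the usual convention rather than their reported formula:

```latex
E_{\mathrm{int}}^{\mathrm{DES-thiophene}} \approx E_{\mathrm{LJ}} + E_{\mathrm{Coul}}
= \sum_{i \in \mathrm{DES}} \; \sum_{j \in \mathrm{thiophene}}
\left[\, 4\varepsilon_{ij}\left(\frac{\sigma_{ij}^{12}}{r_{ij}^{12}} - \frac{\sigma_{ij}^{6}}{r_{ij}^{6}}\right)
+ \frac{q_i q_j}{4\pi\varepsilon_0 r_{ij}} \right]
```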
1337 Influence of the Adsorption of Anionic–Nonionic Surfactants/Silica Nanoparticles Mixture on Clay Rock Minerals in Chemical Enhanced Oil Recovery
Authors: C. Mendoza Ramírez, M. Gambús Ordaz, R. Mercado Ojeda.
Abstract:
Chemical flooding with surfactant solutions, based on their property of reducing the interfacial tension between crude oil and water, is a potential application of chemical enhanced oil recovery (CEOR). However, the high-rate retention of surfactants associated with adsorption in the porous medium and the complexity of the mineralogical composition of the reservoir rock limit the efficiency of crude oil displacement. This study evaluates the effect of the concentration of a mixture of anionic-nonionic surfactants with silica nanoparticles in a rock sample composed of 25.14% clay minerals of the kaolinite, chlorite, halloysite and montmorillonite type, according to the results of X-Ray Diffraction analysis and Scanning Electron Microscopy (XRD and SEM, respectively). The amount of the surfactant mixture adsorbed on the clay rock minerals was analyzed from the construction of its calibration curve and the 4-Region Isotherm Model by UV-Visible spectroscopy. The adsorption rate of the surfactant on the clay rock averages 32% across all concentrations, influenced by the surface area of the substrate, with a value of 1.6 m²/g, and by the mineralogical composition of the clay, which increases the cation exchange capacity (CEC). In addition, in Regions I and II a final concentration measurement is not evident in the UV-Vis, due to the ionic nature of the surfactant, its high affinity with the clay rock and its low concentration. Finally, for potential CEOR applications, the adsorption of these mixed surfactant systems is considered due to their industrial relevance, and it is concluded that it is possible to use concentrations in Regions III and IV; initially the adsorption has an increasing slope and then levels off at equilibrium, where interfacial tension values on the order of 10⁻¹ mN/m are reached.
Keywords: anionic–nonionic surfactants, clay rock, adsorption, 4-region isotherm model, cation exchange capacity, critical micelle concentration, enhanced oil recovery
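The adsorbed amount behind the 4-region isotherm is normally obtained by the depletion method from the UV-Vis calibration curve; assuming that convention (it is not spelled out in the abstract), the surface excess per gram of clay rock is computed as

```latex
q_e = \frac{(C_0 - C_e)\,V}{m}
```

where C_0 and C_e are the initial and equilibrium surfactant concentrations, V the solution volume, and m the mass of clay rock in contact with the solution.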
1336 Combination Method Cold Plasma and Liquid Threads
Authors: Nino Tsamalaidze
Abstract:
Cold plasma is an ionized neutral gas with a temperature of 30-40 degrees, but its impact includes not only gas but also active molecules, charged particles, heat and low-power UV radiation. The main goal of the technology we describe is to launch the natural function of skin regeneration and improve the metabolism inside, which leads to a pronounced rejuvenation effect. In particular: eliminate fine mimic wrinkles; get rid of wrinkles around the mouth (purse-string wrinkles); reduce the overhang of the upper eyelid; eliminate bags under the eyes; provide a lifting effect on the oval of the face; reduce stretch marks; shrink pores; even out the skin; reduce the appearance of acne and scars; remove pigmentation. A clear indication of the major findings of the study is based on current patient practice. The method is to use a combination of cold plasma and liquid threads. The advantage of cold plasma is undoubtedly its efficiency: the result of its implementation can be compared with the result of a surgical facelift, despite the fact that the procedure is non-invasive and the risks are minimized. Another advantage is that the technique can be applied on the most sensitive skin of the face - the eyelids and the area around the eyes. Cold plasma is one of the few techniques that eliminates bags under the eyes and overhanging eyelids while not violating the integrity of the tissues. In addition to rejuvenation and the lifting effect, among the benefits of cold plasma are also the elimination of scars, couperose, stretch marks and other skin defects; plasma allows one to get rid of acne, seborrhea and skin fungus, and even heals ulcers. The cold plasma method makes it possible to achieve a result similar to blepharoplasty. Carried out on the skin of the eyelids, the procedure allows non-surgical correction of the eyelid line in 3-4 sessions. One of the undoubted advantages of this method is a short rehabilitation and rapid healing of the skin.
Keywords: wrinkles, telangiectasia, pigmentation, pore closing
1335 Fabrication of Coatable Polarizer by Guest-Host System for Flexible Display Applications
Authors: Rui He, Seung-Eun Baik, Min-Jae Lee, Myong-Hoon Lee
Abstract:
The polarizer is one of the most essential optical elements in LCDs. Currently, the most widely used polarizers for LCDs are derivatives of the H-sheet polarizer. There is a need for coatable polarizers, which are much thinner and more stable than H-sheet polarizers. One possible approach to obtain thin, stable, and coatable polarizers is based on the use of a highly ordered guest-host system. In our research, we aimed to fabricate a coatable polarizer based on a highly ordered liquid crystalline monomer and dichroic dye 'guest-host' system, in which the anisotropic absorption of light could be achieved by aligning a dichroic dye (guest) in the cooperative motion of the ordered liquid crystal (host) molecules. Firstly, we designed and synthesized a new reactive liquid crystalline monomer containing polymerizable acrylate groups as the 'host' material. The structure was confirmed by ¹H-NMR and IR spectroscopy. The liquid crystalline behavior was studied by differential scanning calorimetry (DSC) and polarized optical microscopy (POM). It was confirmed that the monomers possess a highly ordered smectic phase at relatively low temperature. Then, the photocurable 'guest-host' system was prepared by mixing the liquid crystalline monomer, dichroic dye and photoinitiator. Coatable polarizers were fabricated by spin-coating the above mixture on a substrate with an alignment layer. The in-situ photopolymerization was carried out at room temperature by irradiating UV light, resulting in the formation of a crosslinked structure that stabilized the aligned dichroic dye molecules. Finally, the dichroic ratio (DR), order parameter (S) and polarization efficiency (PE) were determined by polarized UV/Vis spectroscopy. We prepared the coatable polarizers using different types of dichroic dyes to meet the requirements of display applications. The results reveal that the coatable polarizers, at a thickness of 8 μm, exhibited DR = 12~17 and relatively high PE (>96%), with the highest PE = 99.3%, showing their potential for LCD or flexible display applications.
Keywords: coatable polarizer, display, guest-host, liquid crystal
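The figures of merit quoted above are presumably based on the usual guest-host conventions; since the abstract does not restate them, one common set of definitions is given here, with A∥ and A⊥ the absorbances for light polarized parallel and perpendicular to the alignment direction and T∥, T⊥ the corresponding transmittances:

```latex
DR = \frac{A_{\parallel}}{A_{\perp}}, \qquad
S = \frac{DR - 1}{DR + 2}, \qquad
PE = \frac{T_{\perp} - T_{\parallel}}{T_{\perp} + T_{\parallel}} \times 100\%
```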
1334 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (quantitative and qualitative analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI's predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
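As a minimal sketch of the duplicate-record detection that underlies the reduction reported above (not the authors' actual pipeline), the snippet below flags likely duplicate master-data records using simple fuzzy string matching from the Python standard library; the record fields and threshold are illustrative only.

```python
from difflib import SequenceMatcher

# Toy master-data records; field names are illustrative only.
records = [
    {"id": 1, "name": "Acme Corporation", "city": "New York"},
    {"id": 2, "name": "ACME Corp.",       "city": "New York"},
    {"id": 3, "name": "Globex Ltd",       "city": "Chicago"},
]

def similarity(a: dict, b: dict) -> float:
    """Average fuzzy similarity over the comparable fields."""
    fields = ("name", "city")
    return sum(
        SequenceMatcher(None, a[f].lower(), b[f].lower()).ratio() for f in fields
    ) / len(fields)

# Pairs above the threshold are candidate duplicates for stewardship review.
THRESHOLD = 0.8
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        score = similarity(records[i], records[j])
        if score >= THRESHOLD:
            print(f"possible duplicate: {records[i]['id']} ~ {records[j]['id']} ({score:.2f})")
```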
1333 Building and Development of the Stock Market Institutional Infrastructure in Russia
Authors: Irina Bondarenko, Olga Vandina
Abstract:
The theory of evolutionary economics is the basis for the preparation and application of methods forming the concept of stock market infrastructure development. The authors believe that the basis for the process of formation and development of the stock market infrastructure model in Russia is the theory of large systems. This theory considers the financial market infrastructure as a whole on the basis of a macroeconomic approach, with the further definition of its aims and objectives. Evaluation of the prospects for interaction of securities market institutions will enable identifying the problems associated with the development of this system. The interaction of elements of the stock market infrastructure allows the costs and time of transactions to be reduced, thereby freeing up the resources of market participants for more efficient operation. Thus, the methodology of transaction analysis allows the financial infrastructure to be defined as a set of specialized institutions that form a modern quasi-stable system. The financial infrastructure, based on international standards, should include trading systems, regulatory and supervisory bodies, rating agencies, and settlement, clearing and depository organizations. Distribution of financial assets, reduction of transaction costs, and increased transparency of the market are promising tasks for raising the level and quality of services provided by institutions of the securities market's financial infrastructure. In order to improve the efficiency of the regulatory system, it is necessary to provide 'standards' for all market participants. The development of clear regulation of the barriers to stock market entry and exit, the provision of conditions for the development and implementation of new laws regulating the activities of participants in the securities market, as well as the formulation of proposals aimed at minimizing risks and costs, will enable the achievement of positive results. The latter will be manifested in an increased level of market participant security and, accordingly, in the attractiveness of this market for investors and issuers.
Keywords: institutional infrastructure, financial assets, regulatory system, stock market, transparency of the market
1332 PLGA Nanoparticles Entrapping Dual Anti-TB Drugs of Amikacin and Moxifloxacin as a Potential Host-Directed Therapy for Multidrug Resistant Tuberculosis
Authors: Sharif Abdelghany
Abstract:
Polymeric nanoparticles have been widely investigated as a controlled-release drug delivery platform for the treatment of tuberculosis (TB). Such nanoparticles are also readily internalised into macrophages, leading to high intracellular drug concentrations. In this study, two anti-TB drugs, amikacin and moxifloxacin, were encapsulated into PLGA nanoparticles. The novelty of this work lies in: (1) the efficient encapsulation of two hydrophilic second-line anti-TB drugs, and (2) intramacrophage delivery of this synergistic combination, potentially for rapid treatment of multidrug-resistant TB (MDR-TB). Two water-oil-water (w/o/w) emulsion strategies were employed in this study: (1) alginate-coated PLGA nanoparticles, and (2) alginate-entrapped PLGA nanoparticles. The average particle size and polydispersity index (PDI) of the alginate-coated PLGA nanoparticles were found to be unfavourably high, with values of 640 ± 32 nm and 0.63 ± 0.09, respectively. In contrast, the alginate-entrapped PLGA nanoparticles were within the desirable particle size range of 282-315 nm with a PDI of 0.08-0.16, and were therefore chosen for subsequent studies. Alginate-entrapped PLGA nanoparticles yielded a drug loading of over 10 µg/mg powder for amikacin and more than 5 µg/mg for moxifloxacin, with entrapment efficiencies of approximately 25-31% for moxifloxacin and 51-59% for amikacin. To study macrophage uptake efficiency, nanoparticles of the alginate-entrapped formulation were loaded with acridine orange as a marker, seeded to THP-1-derived macrophages and viewed under confocal microscopy. The particles were readily internalised into the macrophages and highly concentrated in the nucleus region. Furthermore, the anti-mycobacterial activity of the drug-loaded particles was evaluated using M. tuberculosis-infected macrophages, which revealed a significant reduction (4 log reduction) in viable bacterial count compared to the untreated group. In conclusion, the amikacin-moxifloxacin alginate-entrapped PLGA nanoparticles are promising for further in vivo studies.
Keywords: moxifloxacin and amikacin, nanoparticles, multidrug resistant TB, PLGA
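The loading and entrapment figures quoted above are presumably computed with the usual definitions (the abstract does not state them explicitly), where the entrapped drug mass is referred either to the drug initially added or to the mass of recovered nanoparticle powder:

```latex
\mathrm{EE}\,(\%) = \frac{m_{\mathrm{drug,\ entrapped}}}{m_{\mathrm{drug,\ added}}} \times 100, \qquad
\mathrm{DL} = \frac{m_{\mathrm{drug,\ entrapped}}}{m_{\mathrm{nanoparticle\ powder}}}
```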
1331 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud
Authors: Sharda Kumari, Saiman Shetty
Abstract:
Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation, which indicates that a human is present in the vehicle and can disable automation; this is usually done while the trucks are not engaged in highway driving. However, completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people to do their duties in other sectors with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruptions, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Applying analytics to CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, and thus has significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.
Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation
1330 The Rapid Industrialization Model
Authors: Fredrick Etyang
Abstract:
This paper presents a Rapid Industrialization Model (RIM) designed to support existing industrialization policies, strategies and industrial development plans at the national, regional and constituent levels in Africa. The model will reinforce efforts by state and non-state actors toward the attainment of inclusive and sustainable industrialization of Africa. The overall objective of this model is to serve as a framework for rapid industrialization in developing economies, and the specific objectives range from supporting rapid industrial development to promoting structural change in the economy, balanced regional industrial growth, and the achievement of local, regional and international competitiveness in areas of clear comparative advantage in industrial exports; ultimately, the RIM will serve as a step-by-step guideline for the industrialization of African economies. This model is the product of a scientific research process underpinned by desk research through the review of African countries' development plans, strategies, datasets and industrialization efforts, and consultation with key informants. The rigorous research process unearthed multi-directional and renewed efforts towards the industrialization of Africa, premised on the collective commitment of individual states, regional economic communities and the African Union Commission, among other strategic stakeholders. It was further established that the inputs into the industrialization of Africa outshine the levels of industrial development on the continent. The RIM serves as a step-by-step framework for African countries to follow in their industrial development efforts of transforming inputs into tangible outputs and outcomes in the short, intermediate and long run. The model postulates three stages of industrialization and three phases toward the rapid industrialization of African economies; it is simple to understand, easily implementable and contextualizable, with a high return on each unit invested into industrialization supported by the model. Therefore, effective implementation of the model will result in inclusive and sustainable rapid industrialization of Africa.
Keywords: economic development, industrialization, economic efficiency, exports and imports
1329 Guidelines for Enhancing the Learning Environment by the Integration of Design Flexibility and Immersive Technology: The Case of the British University in Egypt’s Classrooms
Authors: Eman Ayman, Gehan Nagy
Abstract:
The learning environment has four main parameters that affect its efficiency: pedagogy, user, technology, and space. According to Morrone, enhancing these parameters to be adaptable to future developments is essential, and educational organizations will need to develop their learning spaces. Flexibility of design and immersive technology could be used as tools for this development. When flexible design concepts are used, learning spaces that can accommodate a variety of teaching and learning activities are created. To accommodate the various needs and interests of students, these learning spaces are easily reconfigurable and customizable. The immersive learning opportunities offered by technologies like virtual reality, augmented reality, and interactive displays, on the other hand, extend beyond the confines of the traditional classroom. These technological advancements could improve learning. This thesis highlights the problem of the lack of innovative, flexible learning spaces in educational institutions. It aims to develop guidelines for enhancing the learning environment by the integration of flexible design and immersive technology. This research uses a mixed-method approach, both qualitative and quantitative: the qualitative section is related to the literature review theories and case study analysis, while the quantitative section is informed by the results of applied studies on the effectiveness of redesigning a learning space from its traditional current state to a flexible, technological, contemporary space that will be adaptable to many changes and educational needs. The research findings establish the importance of flexibility in the internal design of learning spaces, as it enhances space optimization and the capability to accommodate change, and record the significant contribution of immersive technology in assisting the design process. The findings are summarized by the questionnaire results and comparative analysis, which form the last step in finalizing the guidelines.
Keywords: flexibility, learning space, immersive technology, learning environment, interior design
1328 Flexible Feedstock Concept in Gasification Process for Carbon-Negative Energy Technology: A Case Study in Malaysia
Authors: Zahrul Faizi M. S., Ali A., Norhuda A. M.
Abstract:
Emissions of greenhouse gases (GHG) from solid waste treatment and dependency on fossil fuels to produce electricity are major concerns in Malaysia as well as globally. Innovation in downdraft gasification with combined heat and power (CHP) systems has the potential to minimize solid waste and reduce the emission of anthropogenic GHG from conventional fossil fuel power plants. However, the efficiency and capability of downdraft gasification to generate electricity from various alternative fuels, for instance, agricultural residues (i.e., woodchip, coconut shell) and municipal solid waste (MSW), are still controversial, on top of the toxicity level of the produced bottom ash. Thus, this study evaluates the adaptability and reliability of a 20 kW downdraft gasification system to generate electricity (while considering the environmental sustainability of the bottom ash) using flexible local feedstock at 20, 40, and 60% mixed ratios of MSW to agricultural residues. Feedstock properties such as feed particle size, moisture, and ash contents are also analyzed to identify optimal characteristics for the combination of feedstock (feedstock flexibility) to obtain maximum energy generation. Results show that the gasification system is capable of flexibly accommodating different feedstock compositions subject to a specific particle size (less than 2 inches) and a moisture content between 15 and 20%. These values enhance gasifier performance and have a significant effect on the syngas composition utilized by the internal combustion engine, which is reflected in energy production. The results obtained in this study provide a new perspective on the transition of the conventional gasification system to a future, reliable carbon-negative energy technology, subsequently promoting commercial scale-up of the downdraft gasification system.
Keywords: carbon-negative energy, feedstock flexibility, gasification, renewable energy
1327 Applying Participatory Design for the Reuse of Deserted Community Spaces
Authors: Wei-Chieh Yeh, Yung-Tang Shen
Abstract:
The concept of community building started in 1994 in Taiwan. After years of development, it fostered the notion of active local resident participation in community issues as co-operators, instead of minions. Participatory design gives participants more control in the decision-making process, helps to reduce the friction caused by arguments and assists in bringing different parties to consensus. This results in an increase in the efficiency of projects run in the community. Therefore, the participation of local residents is key to the success of community building. This study applied participatory design to develop plans for the reuse of deserted spaces in the community, from the first stage of brainstorming for design ideas and making creative models to be employed later, through to the final stage of construction. After conducting a series of participatory design activities, it aimed to integrate the different opinions of residents, develop a sense of belonging and reach a consensus. Besides this, it also aimed at building the residents' awareness of their responsibilities for the environment and related issues of sustainable development. By reviewing relevant literature and understanding the history of related studies, the study formulated a theory. It took the '2012-2014 Changhua County Community Planner Counseling Program' as a case study to investigate the implementation process of participatory design. Research data were collected by document analysis, participant observation and in-depth interviews. After examining the three elements of 'Design Participation', 'Construction Participation', and 'Follow-up Maintenance Participation' in the case, the study arrived at a promising conclusion: maintenance works were carried out better compared to common public works. Besides this, maintenance costs were lower. Moreover, the works that residents were involved in were more creative. Most importantly, the community characteristics could be easily recognized.
Keywords: participatory design, deserted space, community building, reuse
1326 Preliminary Study of Hand Gesture Classification in Upper-Limb Prosthetics Using Machine Learning with EMG Signals
Authors: Linghui Meng, James Atlas, Deborah Munro
Abstract:
There is an increasing demand for prosthetics capable of mimicking natural limb movements and hand gestures, but precise movement control of prosthetics using only electrode signals continues to be challenging. This study considers the implementation of machine learning as a means of improving accuracy and presents an initial investigation into hand gesture recognition using models based on electromyographic (EMG) signals. EMG signals, which capture muscle activity, are used as inputs to machine learning algorithms to improve prosthetic control accuracy, functionality and adaptivity. Using logistic regression, a machine learning classifier, this study evaluates the accuracy of classifying two hand gestures from the publicly available Ninapro dataset using two-time series feature extraction algorithms: Time Series Feature Extraction (TSFE) and Convolutional Neural Networks (CNNs). Trials were conducted using varying numbers of EMG channels from one to eight to determine the impact of channel quantity on classification accuracy. The results suggest that although both algorithms can successfully distinguish between hand gesture EMG signals, CNNs outperform TSFE in extracting useful information for both accuracy and computational efficiency. In addition, although more channels of EMG signals provide more useful information, they also require more complex and computationally intensive feature extractors and consequently do not perform as well as lower numbers of channels. The findings also underscore the potential of machine learning techniques in developing more effective and adaptive prosthetic control systems.Keywords: EMG, machine learning, prosthetic control, electromyographic prosthetics, hand gesture classification, CNN, computational neural networks, TSFE, time series feature extraction, channel count, logistic regression, ninapro, classifiers
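A minimal sketch of the logistic-regression baseline described above, assuming the EMG recordings have already been segmented into windows (the Ninapro loading step and the TSFE/CNN extractors are omitted); per-channel RMS is used here as a stand-in feature, and the synthetic array shapes are assumptions rather than the study's actual data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for windowed EMG: (n_windows, n_channels, n_samples).
rng = np.random.default_rng(42)
n_windows, n_channels, n_samples = 400, 8, 200
emg = rng.normal(size=(n_windows, n_channels, n_samples))
labels = rng.integers(0, 2, n_windows)          # two hand gestures

def rms_features(windows: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel, one row per window."""
    return np.sqrt(np.mean(windows ** 2, axis=-1))

def evaluate(n_use: int) -> float:
    """Classification accuracy using only the first n_use EMG channels."""
    X = rms_features(emg[:, :n_use, :])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return clf.score(X_te, y_te)

# Vary the channel count from one to eight, mirroring the study design.
for k in range(1, 9):
    print(f"{k} channel(s): accuracy = {evaluate(k):.2f}")
```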
1325 Biostimulant and Abiotic Plant Stress Interactions in Malting Barley: A Glasshouse Study
Authors: Conor Blunt, Mariluz del Pino-de Elias, Grace Cott, Saoirse Tracy, Rainer Melzer
Abstract:
The European Green Deal announced in 2021 calls for agricultural chemical pesticide use and synthetic fertilizer application to be reduced by 50% and 20%, respectively, by 2030. Increasing and maintaining expected yields under these ambitious goals has strained the agricultural sector. This intergovernmental plan has identified plant biostimulants as one potential input to facilitate this new phase of sustainable agriculture; these products are defined as microorganisms or substances that can stimulate soil and plant functioning to enhance crop nutrient use efficiency, quality and tolerance to abiotic stresses. Spring barley is Ireland's most widely sown tillage crop, and grain destined for malting commands the highest market price. Heavy, erratic rainfall is forecast in Ireland's climate future, and barley is particularly susceptible to waterlogging. Recent findings suggest that plant receptivity to biostimulants may depend on the level of stress inflicted on crops to elicit an assisted plant response. In this study, three biostimulants of different genesis (seaweed, protein hydrolysate and bacteria) were applied to 'RGT Planet' malting barley fertilized at three different rates (0 kg/ha, 40 kg/ha, 75 kg/ha) of calcium ammonium nitrate (27% N) under non-stressed and waterlogged conditions. This 4x3x2 factorial trial was arranged in a completely randomized block design with one plant per experimental unit. Leaf gas exchange data and key agronomic and grain quality parameters were analyzed via ANOVA. No penalty on productivity was evident in plants receiving 40 kg/ha of N and biostimulant compared to the 75 kg/ha N treatments. The main effects of nitrogen application and waterlogging provided the most significant variation in the dataset.
Keywords: biostimulant, barley, malting, NUE, waterlogging
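A sketch of the factorial ANOVA referred to above, assuming the trial data are arranged in a long-format table with one row per plant; the column names and the synthetic yield values are placeholders, not the authors' dataset.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic long-format data for the 4 x 3 x 2 design (one row per plant).
rng = np.random.default_rng(1)
levels = itertools.product(
    ["control", "seaweed", "hydrolysate", "bacterial"],  # biostimulant
    [0, 40, 75],                                         # N rate, kg/ha
    ["none", "waterlogged"],                             # stress treatment
)
rows = []
for biostim, n_rate, stress in levels:
    for _ in range(5):  # replicate plants per treatment combination
        rows.append({"biostimulant": biostim, "nitrogen": n_rate, "waterlog": stress,
                     "grain_yield": rng.normal(5 + 0.02 * n_rate, 0.5)})
df = pd.DataFrame(rows)

# Full-factorial model: main effects plus all interactions.
model = smf.ols("grain_yield ~ C(biostimulant) * C(nitrogen) * C(waterlog)", data=df).fit()
print(anova_lm(model, typ=2))
```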
1324 Production and Characterization of Regenerated Cellulose Fiber from Pineapple Leaf Waste Using Dry-Jet-Wet Spinning
Authors: Roungpaisan, N., Witthayolankowit, K., Srisawat, Srichola, P., Rungruangkitkrai, Chartvivatpornchai, Suphamitmongkol W, Lobyam, Changniam C, Boonyarit, J., , Chollakup, R.
Abstract:
Thailand, a world leader in pineapple production and export, generates substantial amounts of pineapple leaf waste, a valuable source of cellulose fiber. This study investigates the production of high-quality dissolving pulp and regenerated cellulose fiber from pineapple leaf fiber using the eco-friendly lyocell process, which utilizes non-toxic, recyclable chemicals. The findings indicate that KOH can effectively replace NaOH in the pulping process, producing pulp with properties suitable for fiber spinning. Optimized bleaching sequences employing chlorine dioxide and hydrogen peroxide stages yielded bright, high-purity pulp with alpha-cellulose content comparable to commercial softwood pulp, along with higher viscosity and degree of polymerization. Lyocell fibers were successfully produced via dry-jet-wet spinning and compared to commercial lyocell fibers. These fibers exhibited similar density, color, and chemical structure but had larger dimensions, greater shrinkage, improved thermal stability, enhanced tensile strength, and superior methylene blue adsorption capacity. A market survey highlighted consumer interest in T-shirts made from sustainable lyocell fibers derived from agricultural waste, underscoring their environmental advantages. This study demonstrates a sustainable and innovative solution for repurposing agricultural waste into high-value textile products. Future work will focus on addressing the scalability and cost-efficiency of the process to facilitate its industrial application and expand its impact on sustainable textile manufacturing.
Keywords: pineapple leaf fiber, dissolving pulp, regenerated cellulose, dry-jet wet spinning
1323 Chikungunya Virus Detection Utilizing an Origami Based Electrochemical Paper Analytical Device
Authors: Pradakshina Sharma, Jagriti Narang
Abstract:
Due to their critical significance in the early identification of infectious diseases, electrochemical sensors have garnered considerable interest. Here, we develop a detection platform for the chikungunya virus (CHIKV) by rationally implementing the extremely high charge-transfer efficiency of a ternary nanocomposite of graphene oxide, silver, and gold (G/Ag/Au). Because paper is an inexpensive substrate and can be produced in large quantities, the use of electrochemical paper analytical device (EPAD) origami further enhances the sensor's appealing qualities. Paper-based testing provides a cost-effective platform for point-of-care diagnostics. These types of sensors are referred to as eco-designed analytical tools due to their efficient production, use of an eco-friendly substrate, and potential to reduce waste management after measurement by incinerating the sensor. In this research, the paper's foldability has been used to develop and create 3D multifaceted biosensors that can specifically detect CHIKV. X-ray diffraction, scanning electron microscopy, UV-Vis spectroscopy, and transmission electron microscopy (TEM) were used to characterize the produced nanoparticles. In this work, aptamers are used since they are thought to be a unique and sensitive tool for use in rapid diagnostic methods. Cyclic voltammetry (CV) and linear sweep voltammetry (LSV), both validated with a potentiostat, were used to measure the analytical response of the biosensor. The target CHIKV antigen was hybridized using the aptamer-modified electrode as a signal modulation platform, and its presence was determined by a decline in the current produced by its interaction with an anionic mediator, Methylene Blue (MB). Additionally, a detection limit of 1 ng/ml and a broad linear range of 1 ng/ml-10 µg/ml for the CHIKV antigen were reported.
Keywords: biosensors, ePAD, arboviral infections, point of care
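The 1 ng/ml detection limit quoted above is typically estimated from the calibration line; one common convention, assumed here since the abstract does not say how the limit was derived, is

```latex
\mathrm{LOD} = \frac{3.3\,\sigma_{\mathrm{blank}}}{S}
```

where σ_blank is the standard deviation of the blank (or intercept) response and S is the slope of the calibration curve.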
1322 Exploring Management of the Fuzzy Front End of Innovation in a Product Driven Startup Company
Authors: Dmitry K. Shaytan, Georgy D. Laptev
Abstract:
In our research, we aimed to test a managerial approach for the fuzzy front end (FFE) of innovation by creating a controlled experiment/business case in breakthrough innovation development. The experiment was in the sport industry and covered all aspects of the customer discovery stage, from ideation to prototyping, followed by a patent application. In the paper, we describe and analyze the milestones, tasks, management challenges, and decisions made to create the breakthrough innovation, and evaluate the overall managerial efficiency at the FFE stage considered. We set the managerial outcome of the FFE stage as a valid product concept in hand. In our paper, we introduce the hypothetical construct 'Q-factor' that helps us in the experiment to distinguish the quality of FFE outcomes. The experiment simulated the FFE of innovation for an entrepreneur and placed on his shoulders the responsibility for the outcome of a valid product concept. While developing the managerial approach to reach this outcome, a decision was made to look at the product concept from the cognitive psychology and cognitive science point of view. This view helped us to develop the profile of a person whose projection (mental representation) of a new product could optimize FFE activities for a manager or entrepreneur. In the experiment, this profile was tested to develop a breakthrough innovation for swimmers. Following the managerial approach, the product concept was created to help swimmers feel/sense the water. A working prototype was developed to estimate the product concept's validity and value-added effect for customers. Based on feedback from coaches and swimmers, there was a strong positive effect that gave high value to customers and, for the experiment, a valid product concept developed by the proposed managerial approach for the FFE. In conclusion, we suggest a managerial approach derived from the experiment.
Keywords: concept development, concept testing, customer discovery, entrepreneurship, entrepreneurial management, idea generation, idea screening, startup management
1321 Design and Optimization of Spoke Rotor Type Brushless Direct Current Motor for Electric Vehicles Using Different Flux Barriers
Authors: Ismail Kurt, Necibe Fusun Oyman Serteller
Abstract:
Today, with the reduction in semiconductor system costs, Brushless Direct Current (BLDC) motors have become widely preferred. Based on rotor architecture, BLDC structures are divided into interior permanent magnet (IPM) and surface permanent magnet (SPM) types. Permanent magnet (PM) motors in electric vehicles (EVs) are still predominantly based on interior permanent magnet (IPM) motors, as the rotors do not require sleeves, the PMs are better protected by the rotor cores, and the air-gap lengths can be much smaller. This study discusses the IPM rotor structure in detail, highlighting its higher torque levels, reluctance torque, wide speed range operation, and production advantages. IPM rotor structures are particularly preferred in EVs due to their high-speed capabilities, torque density and field weakening (FW) features. In FW applications, the motor becomes more suitable for operation at torques lower than the rated torque but at speeds above the rated speed. Although V-type and triangular IPM rotor structures are generally preferred in EV applications, the spoke-type rotor structure offers distinct advantages, making it a competitive option for these systems. The flux barriers in the rotor significantly affect motor performance, providing notable benefits in both motor efficiency and cost. This study utilizes ANSYS/Maxwell simulation software to analyze the spoke-type IPM motor and examine its key design parameters. Through analytical and 2D analysis, preliminary motor design and parameter optimization have been carried out. During the parameter optimization phase, torque ripple, a common issue especially for IPM motors, has been investigated, along with the associated changes in motor parameters.
Keywords: electric vehicle, field weakening, flux barrier, spoke rotor
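The reluctance-torque contribution highlighted above appears in the standard dq-frame torque expression for an interior-PM machine, quoted here as background (the abstract does not reproduce it); the (L_d - L_q) term vanishes in an SPM rotor but adds torque in spoke and other IPM rotors:

```latex
T_e = \frac{3}{2}\,\frac{p}{2}\left[\,\psi_{\mathrm{PM}}\, i_q + \left(L_d - L_q\right) i_d\, i_q \,\right]
```

where p is the pole count, ψ_PM the permanent-magnet flux linkage, L_d and L_q the d- and q-axis inductances, and i_d, i_q the dq-axis currents.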
1320 An Overview of Domain Models of Urban Quantitative Analysis
Authors: Mohan Li
Abstract:
Nowadays, intelligent research technology is more important than traditional research methods in urban research work, and this proportion will greatly increase in the next few decades. Frequently, such analytical work cannot be carried out without some software engineering knowledge, and domain models of urban research become necessary when applying software engineering knowledge to urban work. In many urban planning practice projects, making rational models, feeding in reliable data, and providing enough computation all provide indispensable assistance in producing good urban planning. During the whole work process, domain models can optimize workflow design. At present, human beings have entered the era of big data. The amount of digital data generated by cities every day will increase at an exponential rate, and new data forms are constantly emerging. How to select a suitable data set from the massive amount of data, and how to manage and process it, have become abilities that more and more planners and urban researchers need to possess. This paper summarizes and makes predictions about the emergence of technologies and technological iterations that may affect urban research in the future, help discover urban problems, and implement targeted sustainable urban strategies. These are summarized into seven major domain models: the urban and rural regional domain model, urban ecological domain model, urban industry domain model, development dynamic domain model, urban social and cultural domain model, urban traffic domain model, and urban space domain model. These seven domain models can be used to guide the construction of systematic urban research topics and help researchers organize a series of intelligent analytical tools, such as Python, R, GIS, etc. These seven models make full use of quantitative spatial analysis, machine learning, and other technologies to achieve higher efficiency and accuracy in urban research, assisting people in making reasonable decisions.
Keywords: big data, domain model, urban planning, urban quantitative analysis, machine learning, workflow design
1319 Monitoring and Management of Aquatic Macroinvertebrates for Determining the Level of Water Pollution Catchment Basin of Debed River, Armenia
Authors: Inga Badasyan
Abstract:
Every year we monitor water pollution in the catchment basin of the Debed River. The Ministry of Nature Protection then runs a modelling programme, and finally we manage the impact of water pollution in the Debed River. The efficiency of ecosystem technologies was estimated based on physical, chemical, and macrobiological analyses of water carried out on a regular basis between 2012 and 2015. Algae community composition was determined to assess the ecological status of the Debed River, while vegetation was determined to assess biodiversity. Recently, experts have been speaking about global warming, which has a negative impact on surface water, freshwater, etc. As we know, global warming is caused by the current high levels of carbon dioxide in the water. Geochemical modelling is increasingly playing an important role in various areas of hydro sciences and earth sciences. Geochemical modelling of highly concentrated aqueous solutions represents an important topic in the study of many environments, such as evaporation ponds, groundwater and soils in arid and semi-arid zones, coastal aquifers, etc. The sampling time is important for benthic macroinvertebrates; for that reason, we have chosen spring (abundant river flow, the beginning of the vegetation season) and autumn (when the flow of the river is scarce). Macroinvertebrates are good indicators of chronic pollution in aquatic ecosystems. Results of our earlier investigations in the Debed river reservoirs clearly show that the management of reservoir ecosystems is a topical problem. The research results can be applied to studies monitoring water quality in rivers, allowing rates of change to be assessed and possible future changes in the nature of the lake to be predicted.
Keywords: ecohydrological monitoring, flood risk management, global warming, aquatic macroinvertebrates
1318 Growth Performance of Freshwater Microalgae Chlorella sp. Exposed to Carbon Dioxide
Authors: Titin Handayani, Adi Mulyanto, Fajar Eko Priyanto
Abstract:
It is generally recognized that algae could be an interesting option for reducing CO₂ emissions. Based on light and CO₂, algae can be used for the production of various economically interesting products. Current algae cultivation techniques, however, still present a number of limitations. Efficient feeding of CO₂, especially on a large scale, is one of them. Current methods for CO₂ feeding to algae cultures rely on sparging pure CO₂ or flue gas directly. The limiting factor in this system is the solubility of CO₂ in water, which demands a considerable amount of energy for an effective gas-to-liquid transfer and leads to losses to the atmosphere. Due to the current ineffective methods for CO₂ introduction into algae ponds, very large surface areas would be required for enough ponds to capture a considerable amount of the CO₂. The purpose of this study is to assess a technology to capture carbon dioxide (CO₂) emissions generated by industry by utilizing the microalga Chlorella sp. The microalgae were cultivated in a raceway-type culture pond bioreactor. The result is expected to be useful in mitigating the effects of greenhouse gases by reducing CO₂ emissions. The research activities include: (1) Characterization of boiler flue gas, (2) Operation of the culture pond, (3) Sampling and sample analysis. The results of this study showed that the initial assessment of flue gas absorption by the microalgae, using a 1000 L raceway pond completed by a heat exchanger, was quite promising. The transfer of CO₂ into the pond culture system ran well. This was identified from the success in cooling the boiler flue gas from a temperature of about 200 °C to below ambient temperature. Apart from the temperature, the gas bubbles in the culture media were quite fine; therefore, the contact between the gas and the media was well performed. The efficiency of CO₂ absorption by Chlorella sp. reached 6.68%, with an average CO₂ loading of 0.29 g/L/day.
Keywords: Chlorella sp., CO2 emission, heat exchange, microalgae, milk industry, raceway pond
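For context, the CO₂ fixation performance of such cultures is often back-calculated from biomass productivity using a typical microalgal biomass formula (CO₀.₄₈H₁.₈₃N₀.₁₁P₀.₀₁); this is a common convention rather than the calculation reported in the abstract, which instead gives measured absorption efficiency and loading:

```latex
R_{\mathrm{CO_2}} = C_{\mathrm{C}}\, P_{\mathrm{biomass}} \,\frac{M_{\mathrm{CO_2}}}{M_{\mathrm{C}}} \approx 1.88\, P_{\mathrm{biomass}}
```

where C_C ≈ 0.51 is the carbon mass fraction of the biomass and P_biomass the volumetric biomass productivity (g/L/day).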
1317 Two-Dimensional Van der Waals Heterostructure for Highly Energy-Efficient Field-Free Deterministic Spin-Orbit Torque Switching at Room Temperature
Authors: Pradeep Raj Sharma, Bogeun Jang, Jongill Hong
Abstract:
Spin-orbit torque (SOT) is an efficient approach for manipulating the magnetization of ferromagnetic materials (FMs), providing improved device performance, better compatibility, and ultra-fast switching with lower power consumption compared to spin-transfer torque (STT). Among the various materials and structural designs, two-dimensional (2D) van der Waals (vdW) layered materials and their heterostructures have been demonstrated as a highly scalable and promising device architecture for SOT. In particular, a bilayer heterostructure consisting of a fully 2D-vdW FM and a non-magnetic material (NM) offers a potential platform for controlling the magnetization using SOT because of the advantages of being easy to scale and requiring less energy to switch. Here, we report field-free deterministic switching driven by SOT at room temperature, integrating the perpendicularly magnetized 2D-vdW material Fe₃GaTe₂ (FGaT) and the NM WTe₂. Pulse-current-induced magnetization switching with an ultra-low current density of about 6.5×10⁵ A/cm², yielding a SOT efficiency close to double digits at 300 K, is reported. These values are two orders of magnitude higher than those observed in conventional heavy-metal (HM) based SOT and exceed those reported with 2D-vdW layered materials. WTe₂, a topological semimetal possessing strong spin-orbit coupling (SOC) and a high spin Hall angle, can induce significant spin accumulation with negligible spin loss across the transparent 2D bilayer heterointerface. This promising device architecture enables highly compatible, energy-efficient, non-volatile memory and lays the foundation for designing efficient, flexible, and miniaturized spintronic devices.
Keywords: spintronics, spin-orbit torque, spin Hall effect, spin Hall angle, topological semimetal, perpendicular magnetic anisotropy
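The SOT efficiency (damping-like effective spin Hall angle) quoted above is conventionally defined as the ratio of the absorbed spin current density to the applied charge current density; this definition is assumed here, as the abstract does not restate it:

```latex
\xi_{\mathrm{DL}} = \frac{2e}{\hbar}\,\frac{J_s}{J_c}
```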
1316 Bias Minimization in Construction Project Dispute Resolution
Authors: Keyao Li, Sai On Cheung
Abstract:
The incorporation of alternative dispute resolution (ADR) mechanisms has been the main feature of the current trend of construction project dispute resolution (CPDR). ADR approaches have been identified as efficient mechanisms and are suitable alternatives to litigation and arbitration. Moreover, the use of ADR in this multi-tiered dispute resolution process often leads to repeated evaluations of the same dispute. Multi-tiered CPDR may thus become a breeding ground for cognitive biases. When complete knowledge is not available at the early tiers of construction dispute resolution, disputing parties may form preconceptions of the dispute matter or the counterpart. These preconceptions would influence their information processing in the subsequent tier. Disputing parties tend to search for and interpret further information in a self-defensive way to confirm their early positions. Their imbalanced information collection would boost their confidence in the held assessments. Their attitudes would harden and become difficult to compromise. The occurrence of cognitive bias, therefore, impedes efficient dispute settlement. This study aims to explore ways to minimize bias in CPDR. Based on a comprehensive literature review, three types of bias-minimizing approaches were collected: strategy-based, attitude-based and process-based. These approaches were further operationalized into bias-minimizing measures. To verify the usefulness and practicability of these bias-minimizing measures, semi-structured interviews were conducted with ten CPDR third-party neutral professionals. All of the interviewees have at least twenty years of experience in facilitating the settlement of construction disputes. The usefulness, as well as the implications, of the bias-minimizing measures were validated and suggested by these experts. There are few studies on cognitive bias in construction management in general and in CPDR in particular. This study would be the first of its type to enhance the efficiency of construction dispute resolution by highlighting strategies to minimize the biases therein.
Keywords: bias, construction project dispute resolution, minimization, multi-tiered, semi-structured interview
Procedia PDF Downloads 187
1315 Micropropagation and in vitro Conservation via Slow Growth Techniques of Prunus webbii (Spach) Vierh: An Endangered Plant Species in Albania
Authors: Valbona Sota, Efigjeni Kongjika
Abstract:
Wild almond is a woody species that is difficult to propagate either generatively by seed or by vegetative methods (grafting or cuttings), and it is classified as Endangered (EN) in Albania based on IUCN criteria. As a wild relative of cultivated fruit trees, this species represents a source of genetic variability and can be very important in breeding programs and cultivation. For this reason, it would be of interest to use an effective method of in vitro mid-term conservation, which involves strategies to slow plant growth through physicochemical alterations of the in vitro growth conditions. Multiplication of wild almond was carried out using zygotic embryos as primary explants, with the aim of developing a successful propagation protocol. Results showed that zygotic embryos can proliferate through direct or indirect organogenesis. During the subculture stage, a great number of new plantlets identical to the mother plants were obtained from the zygotic embryos. All in vitro plantlets obtained from the subcultures underwent in vitro conservation by minimal growth at low temperature (4ºC) in darkness. The efficiency of this technique was evaluated for conservation periods of 3, 6, and 10 months. Maintenance under these conditions reduced microcutting growth. Survival and regeneration rates were evaluated for each period; the maximal conservation time without subculture at 4ºC was 10 months, but survival and regeneration rates were then significantly reduced, to 15.6% and 7.6%, respectively. An optimal conservation period under these conditions is 5-6 months of storage, which can maintain survival and regeneration rates of 50-60%. This protocol may be beneficial for mass propagation, mid-term conservation, and genetic manipulation of wild almond.
Keywords: micropropagation, minimal growth, storage, wild almond
Procedia PDF Downloads 129
1314 Design of Solar Charge Controller and Power Converter with the Multisim
Authors: Sohal Latif
Abstract:
Solar power in the form of photovoltaics (PV) is a renewable energy technology that uses solar panels to produce electricity from sunlight. It plays a vital role in meeting the present need for clean, renewable energy and in moving away from conventional, non-renewable sources that emit high levels of greenhouse gases. Solar energy is widely adopted because of its availability, easy accessibility, and effectiveness in providing power, chiefly in rural areas. In a solar charging system, light is converted into electricity by PV panels, which supply direct current (DC) power. The solar charge controller plays a crucial role here, regulating the voltage and current coming from the solar panels to meet the changing needs of a battery without overcharging it. Devices such as inverters are required to transform the DC power produced by the solar panels into alternating current (AC) for ordinary electrical appliances and the existing power network. This work presents the design of a solar charge controller and power converter in MULTISIM. The project begins with a literature survey to obtain basic knowledge of power converters, charge controllers, and photovoltaic systems. The fundamentals of solar panel operation, including the process by which light is converted into electricity, are reviewed, and PWM and MPPT charge controllers are compared. Rectifier principles are covered to support AC-to-DC and DC-to-AC conversion. Resistors, capacitors, MOSFETs, and op-amps are selected according to the needs of the system. The circuit diagrams of the converters and charge controllers are designed in Multisim; pulse width modulation, a Bubba oscillator circuit, and inverter circuits are modeled and simulated. Analysis of the simulation outcomes indicates the efficiency of the intended converter systems, and the outputs of the different configurations, with and without the transformer, are monitored for effective power conversion and regulation.
Keywords: solar charge controller, MULTISIM, converter, inverter
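As a companion to the PWM-versus-MPPT comparison, the following minimal Python sketch illustrates the classic perturb-and-observe MPPT logic. The toy panel power curve, starting voltage, and step size are assumptions chosen only for illustration; they do not come from the Multisim project.

```python
# Minimal perturb-and-observe (P&O) MPPT sketch. The quadratic panel model
# and the step size are illustrative assumptions, not values from the project.

def panel_power(v):
    """Toy PV power curve with a maximum near 17 V (assumed, for illustration)."""
    return max(0.0, -0.5 * (v - 17.0) ** 2 + 100.0)

def perturb_and_observe(v_start=12.0, step=0.2, iterations=60):
    v, p_prev = v_start, panel_power(v_start)
    direction = +1
    for _ in range(iterations):
        v += direction * step    # perturb the operating voltage
        p = panel_power(v)
        if p < p_prev:           # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"Tracked operating point: {v_mpp:.2f} V, {p_mpp:.1f} W")
```

The operating point oscillates in a small band around the maximum power point, which is the characteristic behavior that distinguishes an MPPT controller from a fixed-duty-cycle PWM charger.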
Procedia PDF Downloads 25
1313 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems
Authors: Mergim Gasia, Bojan Milovanovica, Sanjin Gumbarevic
Abstract:
Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved. Dynamic heat transfer simulations are used for the calculation of building energy consumption because they give more realistic energy demands than stationary calculations, which do not take the building's thermal mass into account. Software used for these dynamic simulations relies on methods based on analytical models, since numerical models become impractical for longer simulation periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). The two methods for calculating the CTFs covered by this research are the Laplace method and the State-Space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and for the shorter sampling times used in the calculation. The algorithms for both the Laplace and State-Space methods are implemented in Mathematica, and the results are compared with those from EnergyPlus and TRNSYS, since these software packages use similar algorithms to calculate a building's energy demand. This research aims to assess the performance of the Laplace and State-Space methods in calculating the energy demand of heavyweight building elements at shorter sampling times, and it also provides the means for improving the algorithms used by these methods. As the reference for the boundary heat flux density, the finite difference method (FDM) is used. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not properly used.
Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method
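For readers unfamiliar with the reference method, the short Python sketch below shows a one-dimensional explicit finite-difference (FTCS) calculation of the boundary heat flux through a heavyweight wall layer. The concrete-like material properties, wall thickness, and boundary temperatures are assumed for illustration and are not taken from this research.

```python
# 1D explicit finite-difference (FTCS) sketch of transient conduction through a
# heavyweight wall, illustrating the kind of reference boundary heat flux
# calculation an FDM model provides. Material properties and boundary
# temperatures are assumed (roughly concrete-like), not taken from the paper.

k = 1.4           # thermal conductivity, W/(m K)        (assumed)
rho_c = 2.0e6     # volumetric heat capacity, J/(m^3 K)  (assumed)
alpha = k / rho_c # thermal diffusivity, m^2/s

L = 0.20                    # wall thickness, m (assumed)
n = 21                      # number of nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha    # time step within the FTCS stability limit (<= 0.5)

T_out, T_in = 0.0, 20.0     # assumed exterior / interior surface temperatures, degC
T = [10.0] * n              # initial wall temperature profile

steps = int(24 * 3600.0 / dt)   # simulate one day
for _ in range(steps):
    T_new = T[:]
    for i in range(1, n - 1):
        T_new[i] = T[i] + alpha * dt / dx**2 * (T[i-1] - 2*T[i] + T[i+1])
    T_new[0], T_new[-1] = T_out, T_in   # Dirichlet boundary conditions
    T = T_new

q_in = -k * (T[-1] - T[-2]) / dx        # heat flux density at the interior face
print(f"Interior-surface heat flux after 24 h: {q_in:.1f} W/m^2")
```

Because the explicit scheme's stability limit ties the time step to the square of the grid spacing, long simulation periods for thick, heavyweight elements quickly become expensive, which is exactly why the CTF-based analytical methods are preferred in whole-building simulation tools.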
Procedia PDF Downloads 134
1312 Hepatoprotective Assessment of L-Ascorbate 1-(2-Hydroxyethyl)-4,6-Dimethyl-1,2-Dihydropyrimidine-2-one in Toxic Liver Damage Test
Authors: Vladimir Zobov, Nail Nazarov, Alexandra Vyshtakalyuk, Vyacheslav Semenov, Irina Galyametdinova, Vladimir Reznik
Abstract:
The aim of this study was to investigate the hepatoprotective properties of the Xymedon derivative L-ascorbate 1-(2-hydroxyethyl)-4,6-dimethyl-1,2-dihydropyrimidine-2-one (XD), which exhibits high efficiency as an actoprotector. The study was carried out on 68 male albino rats weighing 250-400 g using preventive exposure to the test preparation. The effectiveness of XD was compared with that of Xymedon (the original substance) after administration of the compounds in identical doses; the maximum dose was 20 mg/kg. The animals orally received Xymedon or its derivative in doses of 10 and 20 mg/kg over 4 days. At 1-1.5 h after drug administration, CCl4 in vegetable oil (1:1) was given at a dose of 2 ml/kg. Controls received CCl4 without hepatoprotectors; an intact control group consisted of rats receiving neither CCl4 nor the other compounds. The day after the last administration of CCl4 and the compounds under study, animals were dehematized under ether anesthesia, and blood and liver samples were taken for biochemical and histological analysis. Xymedon and XD, administered according to the preventive scheme, exerted hepatoprotective effects: Xymedon at a dose of 20 mg/kg and XD at doses of 10 and 20 mg/kg. The drugs under study had different effects on the condition of the liver affected by CCl4 induction. Xymedon had a more pronounced effect both on the ALT level, which can be elevated not only due to destructive changes in hepatocytes but also as a manifestation of cholestasis, and on the serum total protein level, which reflects protein synthesis in the liver. XD had a more pronounced effect on the AST level, one of the markers of hepatocyte damage. The lower effective dose of XD (10 mg/kg, compared with the 20 mg/kg at which Xymedon was effective) and its pronounced effect on AST, the hepatocyte cytolysis marker, are indicative of its higher preventive effectiveness compared to Xymedon. This work was performed with the financial support of the Russian Science Foundation (grant No. 14-50-00014).
Keywords: hepatoprotectors, pyrimidine derivatives, toxic liver damage, xymedon
Procedia PDF Downloads 306