Search results for: artificial potential function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16784

13484 Innovations in the Lithium Value Chain

Authors: Fiúza A., Góis J., Leite M., Braga H., Lima A., Jorge P., Moutela P., Martins L., Futuro A.

Abstract:

Lepidolite is an important lithium mineral that, to the authors' best knowledge, has not been used to produce lithium hydroxide, which is needed for the conversion to electric vehicles. Alkaline leaching of lithium concentrates allows the establishment of a production flow diagram that avoids most of the environmental drawbacks associated with the use of acid reagents. The tested processes involve a pretreatment by digestion at high temperature with additives, followed by hot leaching at atmospheric pressure. The solutions obtained must be compatible with those from the leaching of spodumene concentrates, allowing the development of a common treatment flow diagram, an important accomplishment for the feasible exploitation of Portuguese resources. Statistical programming and interpretation techniques are used to minimize the laboratory effort required by conventional approaches and also to allow phenomenological comprehension.

Keywords: artificial intelligence, tailings free process, ferroelectric electrolyte battery, life cycle assessment

Procedia PDF Downloads 105
13483 The New Insight about Interspecies Transmission of Iranian H9N2 Influenza Viruses from Avian to Human

Authors: Masoud Soltanialvar, Ali Bagherpour

Abstract:

Documented cases of human infection with H9N2 avian influenza viruses, first detected in 1999 in Hong Kong and China, indicate that these viruses can be directly transmitted from birds to humans. In this study, we characterized the mutations in the hemagglutinin (HA) genes and proteins that correlate with a shift in affinity of the HA protein from the "avian" type sialic acid receptors to the "human" type in 10 Iranian isolates. We delineated the genomes and receptor-binding profile of the HA gene of some field isolates and established their phylogenetic relationship to the other Asian H9N2 sublineages. A total of 1200 tissue samples were collected from 40 farms located in various provinces of Iran during 2008–2010 as part of a program to monitor avian influenza virus (AIV) infection. To determine the genetic relationship of the Iranian viruses, the HA genes from ten isolates were amplified and sequenced (by the RT-PCR method). The open reading frame nucleotide sequences of the HA genes were used for phylogenetic tree construction. Deduced amino acid sequences showed the presence of L226 (234 in H9 numbering) in all ten Iranian isolates, which indicates a preference for binding α(2–6) sialic acid receptors, so these Iranian H9N2 viruses have the potential to infect human beings. These isolates showed a high degree of homology with two human H9N2 isolates, A/HK/1073/99 and A/HK/1074/99. Phylogenetic analysis showed that all the HA genes of the Iranian H9N2 viruses fall into a single group within a G1-like sublineage, which had contributed six internal genes as a donor to the highly pathogenic H5N1 avian influenza virus. The results of this study indicate that all Iranian viruses have the potential to emerge as highly pathogenic influenza viruses and, considering the homology of these isolates with human H9N2 strains, the potential of these avian influenza isolates to infect humans should not be overlooked.

Keywords: influenza virus, hemagglutinin, neuraminidase, Iran

Procedia PDF Downloads 433
13482 Characterization of Nanoemulsion Incorporating Crude Cocoa Polyphenol

Authors: Suzannah Sharif, Aznie Aida Ahmad, Maznah Ismail

Abstract:

Cocoa bean is the raw material for products such as cocoa powder and chocolate. Cocoa beans contain polyphenols, which have been shown in several clinical studies to confer beneficial health effects. However, studies have shown that cocoa polyphenol absorption in the human intestinal tract is very low. Therefore, nanoemulsion may be one way to increase the bioavailability of cocoa polyphenol. This study aims to characterize a nanoemulsion incorporating crude cocoa polyphenol produced using a high-energy technique. Cocoa polyphenol was extracted from fresh freeze-dried cocoa beans from Malaysia. The particle size distribution, particle size, and zeta potential were determined. The emulsion was also analysed using a transmission electron microscope to visualize the particles. A solubilization study was conducted by titrating the nanoemulsion into distilled water or 1% surfactant solution. Results showed that the nanoemulsion contains particles with a narrow size distribution. The particles average 112 nm in size, with a zeta potential of -45 mV. The nanoemulsions behave differently in distilled water and in surfactant solution.

Keywords: cocoa, nanoemulsion, cocoa polyphenol, solubilisation study

Procedia PDF Downloads 453
13481 Comparative Study between Classical P-Q Method and Modern Fuzzy Controller Method to Improve the Power Quality of an Electrical Network

Authors: A. Morsli, A. Tlemçani, N. Ould Cherchali, M. S. Boucherit

Abstract:

This article presents two methods for the compensation of harmonics generated by a nonlinear load. The first is the classical P-Q method. The second is a controller based on a modern artificial intelligence method, specifically fuzzy logic. Both methods are applied to a shunt Active Power Filter (sAPF) based on a three-phase, five-level NPC voltage converter. The harmonic reference currents are calculated using the P-Q algorithm, and the pulses are generated using intersective PWM. For flexibility and dynamics, fuzzy logic is used. The results clearly show that the Total Harmonic Distortion obtained with the fuzzy logic controller is lower than that obtained with the P-Q method.
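As a point of reference, the instantaneous p-q theory that underlies the reference-current calculation can be sketched as follows. The Clarke transform, the sign convention for q, and the moving-average separation of the oscillating power are common textbook choices assumed here, not details taken from the paper.

```python
import numpy as np

# A minimal sketch of the instantaneous p-q theory used to build the harmonic
# reference currents for a shunt active power filter.

def clarke(a, b, c):
    """Power-invariant Clarke transform of three-phase signals (NumPy arrays)."""
    alpha = np.sqrt(2.0 / 3.0) * (a - 0.5 * b - 0.5 * c)
    beta = np.sqrt(2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (b - c)
    return alpha, beta

def harmonic_reference(v_abc, i_abc, window=64):
    """Return the alpha-beta components of the load current responsible for the
    oscillating real power and the imaginary power; the filter injects their
    negative (the exact sign depends on the adopted convention)."""
    v_alpha, v_beta = clarke(*v_abc)
    i_alpha, i_beta = clarke(*i_abc)

    p = v_alpha * i_alpha + v_beta * i_beta   # instantaneous real power
    q = v_beta * i_alpha - v_alpha * i_beta   # instantaneous imaginary power

    # Crude moving average to isolate the dc (fundamental) part of p;
    # a real design would use a proper low-pass filter.
    p_dc = np.convolve(p, np.ones(window) / window, mode="same")
    p_osc = p - p_dc

    denom = v_alpha**2 + v_beta**2
    i_h_alpha = (v_alpha * p_osc + v_beta * q) / denom
    i_h_beta = (v_beta * p_osc - v_alpha * q) / denom
    return i_h_alpha, i_h_beta

# Tiny synthetic demo: fundamental voltage, load current with a 5th-harmonic component.
t = np.linspace(0.0, 0.1, 5000)
v = [np.sin(2 * np.pi * 50 * t - k * 2 * np.pi / 3) for k in range(3)]
i = [np.sin(2 * np.pi * 50 * t - k * 2 * np.pi / 3)
     + 0.2 * np.sin(2 * np.pi * 250 * t - k * 2 * np.pi / 3) for k in range(3)]
i_ref_alpha, i_ref_beta = harmonic_reference(v, i)
```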

Keywords: fuzzy logic controller, P-Q method, pulse width modulation (PWM), shunt active power filter (sAPF), total harmonic distortion (THD)

Procedia PDF Downloads 533
13480 Evotrader: Bitcoin Trading Using Evolutionary Algorithms on Technical Analysis and Social Sentiment Data

Authors: Martin Pellon Consunji

Abstract:

Due to the rise in popularity of Bitcoin and other crypto assets as a store of wealth and speculative investment, there is an ever-growing demand for automated trading tools, such as bots, in order to gain an advantage over the market. Traditionally, trading in the stock market was done by professionals with years of training who understood patterns and exploited market opportunities in order to gain a profit. Nowadays, however, a larger portion of market participants are at minimum aided by market-data processing bots, which can generally generate more stable signals than the average human trader. The rise in trading bot usage can be attributed to the inherent advantages that bots have over humans in terms of processing large amounts of data, the absence of emotions such as fear or greed, and predicting market prices using past data and artificial intelligence; hence, a growing number of approaches have been brought forward to tackle this task. However, the general limitation of these approaches still comes down to the fact that limited historical data do not always determine the future and that many market participants are still human, emotion-driven traders. Moreover, developing markets such as the cryptocurrency space have even less historical data to interpret than most other well-established markets. Because of this, some human traders have gone back to the tried-and-tested traditional technical analysis tools for exploiting market patterns and simplifying the broader spectrum of data involved in making market predictions. This paper proposes a method that uses neuroevolution techniques on both sentiment data and the more traditionally human-consumed technical analysis data in order to obtain a more accurate forecast of future market behavior and to account for the way both automated bots and human traders affect the market prices of Bitcoin and other cryptocurrencies. This study's approach uses evolutionary algorithms to automatically develop increasingly improved populations of bots which, by using the latest inflows of market analysis and sentiment data, evolve to efficiently predict future market price movements. The effectiveness of the approach is validated by testing the system in a simulated historical trading scenario and in a real Bitcoin live trading scenario, and by testing its robustness in other cryptocurrency and stock market scenarios. Experimental results during a 30-day period show that this method outperformed the buy-and-hold strategy by over 260% in terms of net profits, even when taking standard trading fees into consideration.
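A compact sketch of the kind of evolutionary loop the abstract describes is given below: a population of simple trading "bots" (here reduced to linear weightings over technical-analysis and sentiment features) is scored on historical returns, and the best half is kept and mutated each generation. The fitness definition, feature layout and hyperparameters are illustrative assumptions rather than the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(weights, features, returns):
    """Net return of a long/flat strategy driven by a weighted feature signal."""
    signal = features @ weights
    positions = (signal > 0).astype(float)   # long when the combined signal is positive
    return float(np.sum(positions * returns))

def evolve(features, returns, pop_size=50, generations=100, mut_scale=0.1):
    """Evolve weight vectors by truncation selection plus Gaussian mutation."""
    n_features = features.shape[1]
    population = rng.normal(size=(pop_size, n_features))
    for _ in range(generations):
        scores = np.array([fitness(w, features, returns) for w in population])
        parents = population[np.argsort(scores)[-pop_size // 2:]]          # keep best half
        children = parents + rng.normal(scale=mut_scale, size=parents.shape)
        population = np.vstack([parents, children])                        # elitism + mutation
    best = max(population, key=lambda w: fitness(w, features, returns))
    return best

# Example with synthetic data: 500 time steps, 6 features (e.g. indicators + sentiment).
feats = rng.normal(size=(500, 6))
rets = rng.normal(scale=0.01, size=500)
print("best weights:", evolve(feats, rets))
```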

Keywords: neuro-evolution, Bitcoin, trading bots, artificial neural networks, technical analysis, evolutionary algorithms

Procedia PDF Downloads 106
13479 The Decision-Making Mechanisms of Tax Regulations

Authors: Nino Pailodze, Malkhaz Sulashvili, Vladimer Kekenadze, Tea Khutsishvili, Irma Makharashvili, Aleksandre Kekenadze

Abstract:

Among the important problems that Georgia has to solve in the near future, the most important is economic stability, which rests on fiscal policy and the proper definition of its directions. The main source of budget revenue is the national income. The state uses taxes, loans and emission in order to mobilize national income, with taxes as the principal instrument. In addition to the fiscal function of filling the budget, tax systems also serve economic and social development and the regulation of foreign economic relations. A tax is a mandatory, unconditional monetary payment to the budget made by a taxpayer in accordance with the Tax Code, based on the necessary, nonequivalent and gratuitous character of the payment. Taxes are national or local. National taxes are the taxes provided for under the Code, the payment of which is mandatory across the whole territory of Georgia. Local taxes are the taxes provided for under the Code and introduced by normative acts of local self-government representative authorities (within marginal rates), the payment of which is mandatory within the territory of the relevant self-governing unit. National taxes play the leading role in tax systems, but local taxes also have an important role: a considerable part of the budget is formed precisely by means of local taxes. The national taxes are income tax, profit tax, value added tax (VAT), excise tax and import duty, while property tax is a local tax. The property tax is one of the significant taxes in Georgia. The paper deals with the taxation mechanism that has been operated in Georgia, which has a great influence on financial accounting. By comparing foreign legislation with Georgian legislation, we discuss the opportunity of using foreign experience, and we suggest recommendations for improving the tax system in financial accounting. In addition to accounting, which is regulated according to the International Accounting Standards, there is tax accounting, which is regulated by the Tax Code and various legal orders and regulations of the Minister of Finance; the rules are controlled by the tax authority, the Revenue Service. The tax burden and tax rates have been directly related to the expenditures of the state from its first day of existence. The fiscal policy of the state comprises both state expenditure and taxation decisions. In order to achieve the best and most effective mobilization of funds, the government's primary task is to decide on the taxation rules. The function of a tax reveals its substance in action. Taxes have the following functions: the distribution or fiscal function, and the control and regulatory functions. Foreign tax systems evolved under the influence of different economic, political and social conditions. Tax systems differ greatly from each other in their taxes, structure, means of levying, rates, the competences of the different levels of fiscal authority, the tax base, the sphere of action of taxes, and tax breaks.

Keywords: international accounting standards, financial accounting, tax systems, financial obligations

Procedia PDF Downloads 224
13478 Links and Blocks: the Role of Language in Samuel Beckett’s Selected Plays

Authors: Su-Lien Liao

Abstract:

This article explores the language in four plays of Samuel Beckett: Waiting for Godot, Endgame, Krapp’s Last Tape, and Footfalls. It considers the way in which Beckett uses language, especially through fragmented utterances, repetitions, monologues, contradictions, and silence. It discusses the function of language in modern society, in the theater of the absurd, and in the plays. Paradoxically enough, his plays attempt to communicate the incommunicability of language.

Keywords: language, Samuel Beckett, theater of the absurd, foreign language teaching

Procedia PDF Downloads 425
13477 The Potential and Economic Viability Analysis of Grid-Connected Solar PV Power in Kenya

Authors: Remember Samu, Kathy Kiema, Murat Fahrioglu

Abstract:

The present study is aimed at minimizing dependence on fossil fuels, thereby reducing greenhouse gas (GHG) emissions, and at meeting the rising energy demand in Kenya. In this analysis, 35 locations were each considered for their techno-economic potential for the installation of a 10 MW grid-connected PV plant. The sites are scattered across the country, but are mostly concentrated in the eastern region, and were selected based on their accessibility to the national grid and the availability of their meteorological parameters from the NASA Solar Energy Dataset. RETScreen software version 4.0 is employed for the analysis in this paper. The capacity factor, simple payback, equity payback, net present value (NPV), annual life cycle savings, energy production cost, net annual greenhouse gas emission reduction and the equivalent barrels of crude oil not consumed are outlined. Energy accounting is performed and compared to the existing grid tariff for an effective feasibility argument for this 10 MW grid-connected PV power system.
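For reference, two of the headline indicators listed above are conventionally computed as follows (standard definitions, not formulas quoted from the paper), with $C_0$ the initial investment, $S_t$ the net annual savings in year $t$, $r$ the discount rate and $N$ the project life:

\[
\text{Simple payback} = \frac{C_0}{S_1}, \qquad
\mathrm{NPV} = -C_0 + \sum_{t=1}^{N} \frac{S_t}{(1+r)^t}.
\]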

Keywords: photovoltaics, project viability analysis, PV module, renewable energy

Procedia PDF Downloads 301
13476 Screening of Different Exotic Varieties of Potato through Adaptability Trial for Local Cultivation

Authors: Arslan Shehroz, Muhammad Amjad Ali, Amjad Abbas, Imran Ramzan, Muhammad Zunair Latif

Abstract:

Potato (Solanum tuberosum L.) is the 4th most important food crop of the world after wheat, rice and maize. It is a staple food in many European countries. Being rich in starch (one of the three main food ingredients) and having the highest productivity per unit area, it has great potential to address the challenge of food security. Processed potato is also consumed as chips, crisps and other ‘fast food’ products. There are many biotic and abiotic factors which limit the production of potato and hinder the achievement of its production potential. Twenty new varieties along with two checks were evaluated. Plant-to-plant and row-to-row distances were maintained at 20 cm and 75 cm, respectively. The trial was conducted according to a randomized complete block design with three replications. Normal agronomic and plant protection measures were carried out in the crop. The experiment revealed that the exotic variety 171 gave the highest yield of 35.5 t/ha, followed by Masai with a 31.0 t/ha tuber yield. The check variety Simply Red gave a 24.2 t/ha yield, while the lowest tuber yield (1.5 t/ha) was produced by the exotic variety KWS-06-125. The maximum emergence was shown by the variety Red Sun (89.7%), and the lowest emergence by the variety Camel (71.7%). Regarding tuber grades, the maximum proportion of ration-size tubers was produced by the exotic variety Compass (3.7%), whereas 11 varieties did not produce ration-size tubers at all. The variety Red Sun produced the lowest percentage of small-size tubers (12.7%), whereas the maximum proportion of small-size tubers (93.0%) was produced by the variety Jitka. Regarding disease infestation, the maximum scab incidence (4.0%) was recorded on the variety Masai, the maximum rhizoctonia attack (60.0%) on the variety Camel, and the maximum tuber cracking (0.7%) on the variety Vendulla.

Keywords: check variety, potato, potential and yield, trial

Procedia PDF Downloads 366
13475 Central Vascular Function and Relaxibility in Beta-thalassemia Major Patients vs. Sickle Cell Anemia Patients by Abdominal Aorta and Aortic Root Speckle Tracking Echocardiography

Authors: Gehan Hussein, Hala Agha, Rasha Abdelraof, Marina George, Antoine Fakhri

Abstract:

Background: β-thalassemia major (TM) and sickle cell disease (SCD) are inherited hemoglobin disorders resulting in chronic hemolytic anemia. Cardiovascular involvement is an important cause of morbidity and mortality in these groups of patients, and the border between overt myocardial dysfunction and clinically silent left ventricular (LV) and/or right ventricular (RV) dysfunction is narrow. 3D speckle tracking echocardiography (3D STE) is a novel method for the detection of subclinical myocardial involvement. We aimed to study myocardial involvement in SCD and TM using 3D STE, to compare it with conventional echocardiography, and to correlate it with serum ferritin level and lactate dehydrogenase (LDH). Methodology: Thirty SCD and thirty β TM patients, age range 4-18 years, were compared to a healthy age- and sex-matched control group of 30 subjects. Cases were subjected to clinical examination and laboratory measurement of hemoglobin level, serum ferritin, and LDH. Transthoracic color Doppler echocardiography, 3D STE, tissue Doppler echocardiography, and aortic speckle tracking were performed. Results: there was a significant reduction in global longitudinal strain (GLS), global circumferential strain (GCS), and global area strain (GAS) in SCD and TM compared with controls (p value < 0.001), and aortic speckle tracking was significantly lower in patients with TM and SCD than in controls (p value < 0.001). LDH was significantly higher in SCD than in both TM and controls, and it correlated significantly and positively with mitral inflow E (p values 0.022 and 0.072; r = 0.416 and -0.333, respectively), lateral E/E' (p values < 0.001 and 0.818; r = 0.618 and -0.044, respectively) and septal E/E' (p values 0.007 and 0.753; r = 0.485 and -0.060, respectively) in SCD but not in TM, with a negative correlation between LDH and aortic root speckle tracking (p value 0.681; r = -0.078). LDH showed potential diagnostic accuracy in predicting vascular dysfunction, as represented by aortic root GCS, with a sensitivity of 74%, and aortic root GCS was predictive of LV dysfunction in SCD patients with a sensitivity of 100%. Conclusion: 3D STE revealed LV and RV systolic dysfunction in spite of normal values on conventional echocardiography. SCD showed significantly lower right ventricular dysfunction and aortic root GCS than TM and controls. LDH can be used to screen patients for cardiac dysfunction in SCD, but not in TM.

Keywords: thalassemia major, sickle cell disease, 3d speckle tracking echocardiography, LDH

Procedia PDF Downloads 152
13474 Sustainable Energy Supply in Social Housing

Authors: Rolf Katzenbach, Frithjof Clauss, Jie Zheng

Abstract:

Final energy use can be divided mainly into four sectors: commercial, industrial, residential, and transportation. The trend in final energy consumption by sector provides a straightforward, broad indication of progress in reducing energy consumption and the associated environmental impacts of the different end-use sectors. According to statistics, the average worldwide share of end-use energy for the residential sector was nearly 20% up to 2011; in Germany the proportion is higher, between 25% and 30%. However, residential energy use, and its impacts on climate and the environment, remains less studied than energy use in the other three sectors. The reasons involve a wide range of fields, including the diversity of residential construction (different building designs and materials), living and energy-use behavioral patterns, climatic conditions and variation, as well as other social obstacles, market trends and financial support from government. This paper presents an extensive and in-depth analysis of projects researched and operated by the authors in the field of energy efficiency, primarily from the perspectives of both technical potential and energy-saving awareness in the residential sector, especially in social housing buildings.

Keywords: energy efficiency, renewable energy, retro-commissioning, social housing, sustainability

Procedia PDF Downloads 433
13473 Absurdity as a Catalyst for Reflection: A Study of Tawfiq Al-Hakim’s The Fate of a Cockroach

Authors: Adaoma Igwedibia, Obetta Emmanuela

Abstract:

The use of absurdity as a catalyst for reflection has gained attention in various domains, including philosophy, literature, and psychology. Absurdity, characterised by its inherent contradiction and irrationality, has been considered a potent tool for stimulating reflection and generating meaningful insights. However, despite its conceptual appeal, a comprehensive understanding of the effectiveness and potential limitations of absurdity in this context remains insufficiently explored. This paper aims to address this gap in knowledge by critically examining the role of absurdity in stimulating reflection and uncovering its precise mechanisms for generating meaningful insights. By reviewing relevant literature and theories, we seek to shed light on the factors that influence the effectiveness of absurdity as a catalyst for reflection and explore its potential limitations. Furthermore, this study intends to provide practical implications for the utilisation of absurdity in various fields, such as education, creativity, and personal development. Through a thorough investigation of existing research and the identification of areas for further exploration, this paper aims to contribute to a more comprehensive understanding of the role of absurdity in stimulating reflection and generating meaningful insights.

Keywords: absurdity, catalyst, reflection, effectiveness

Procedia PDF Downloads 60
13472 Removal of Per- and Polyfluoroalkyl Substances (PFASs) Contaminants from the Aqueous Phase Using Chitosan Beads

Authors: Rahim Shahrokhi, Junboum Park

Abstract:

Per- and polyfluoroalkyl substances (PFASs) are environmentally persistent halogenated hydrocarbons that have been widely used in many industrial and commercial applications. Recently, the contamination of soil and groundwater due to the ubiquity of PFASs in the environment has raised great concern. Adsorption is one of the most promising technologies for PFAS removal. Chitosan is a biopolymer with abundant amine and hydroxyl functional groups, which render it a good adsorbent. This study has tried to enhance the adsorption capacity of chitosan by grafting additional amine functional groups onto its surface for the removal of two long-chain (PFOA and PFOS) and two short-chain (PFBA and PFBS) PFAS substances from the aqueous phase. A series of batch adsorption tests was performed to evaluate the adsorption capacity of the sorbent, which was also characterized by SEM, FT-IR, zeta potential, and XRD analyses. The results demonstrated that both chitosan beads have good potential for adsorbing short- and long-chain PFASs from the aqueous phase.
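For context, the adsorption capacity evaluated in batch tests of this kind is conventionally computed from the mass balance below (a standard relation, not quoted from the paper), where $C_0$ and $C_e$ are the initial and equilibrium PFAS concentrations, $V$ the solution volume and $m$ the mass of chitosan beads:

\[
q_e = \frac{(C_0 - C_e)\,V}{m}.
\]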

Keywords: PFAS, chitosan beads, adsorption, grafted chitosan

Procedia PDF Downloads 45
13471 Rheological Study of Natural Sediments: Application in Filling of Estuaries

Authors: S. Serhal, Y. Melinge, D. Rangeard, F. Hage Chehadeh

Abstract:

The filling of estuaries is an international problem that can cause economic and environmental damage. This work aims to study the rheological structuring mechanisms of natural sedimentary liquid-solid mixtures in estuaries in order to better understand their filling. The estuary of the Rance river, located in Brittany, France, is particularly targeted by the study. The aim is to provide answers on the rheological behavior of natural sediments by detecting the structural factors influencing the rheological parameters, so that the filling of estuarine areas can be better understood and, especially, sustainable ‘cleansing’ solutions for these areas can be considered. The sediments were collected from the Lyvet trap in the Rance estuary. This trap was created by the association COEUR (Comité Opérationnel des Elus et Usagers de la Rance) in 1996 in order to facilitate the cleansing of the estuary: it creates a privileged area for the deposition of sediments and consequently makes the cleansing of the estuary easier. We began our work with a preliminary study to establish the trend of the rheological behavior of the suspensions and to specify the dormant phase which precedes the onset of the biochemical reactivity of the suspensions. We then highlight the visco-plastic character at early age using a Kinexus rheometer with plate-plate geometry. This rheological behavior of the suspensions is represented by the Bingham model, with a dynamic yield stress and a viscosity that can be functions of the volume fraction, granular extent, and chemical reactivity. The evolution of the viscosity as a function of the solid volume fraction is modeled by the Krieger-Dougherty model. The analysis of the dynamic yield stress, on the other hand, showed a fairly clear functional link with the solid volume fraction.
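For reference, the two constitutive relations named above take their usual textbook forms (standard notation, with symbols not defined in the abstract itself): the Bingham model relates shear stress $\tau$ to shear rate $\dot{\gamma}$ through a yield stress $\tau_0$ and a plastic viscosity $\mu_p$, and the Krieger-Dougherty model gives the suspension viscosity $\eta$ relative to the suspending-fluid viscosity $\eta_s$ as a function of the solid volume fraction $\phi$, its maximum packing value $\phi_m$ and the intrinsic viscosity $[\eta]$:

\[
\tau = \tau_0 + \mu_p\,\dot{\gamma} \quad (\tau \ge \tau_0), \qquad
\frac{\eta}{\eta_s} = \left(1 - \frac{\phi}{\phi_m}\right)^{-[\eta]\,\phi_m}.
\]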

Keywords: estuaries, rheological behavior, sediments, Kinexus rheometer, Bingham model, viscosity, yield stress

Procedia PDF Downloads 143
13470 Use of Satellite Imaging to Understand Earth’s Surface Features: A Roadmap

Authors: Sabri Serkan Gulluoglu

Abstract:

With Geographic Information Systems (GIS), information about all natural and artificial resources on the earth can be obtained by taking advantage of satellite images acquired through remote sensing techniques. However, determining unknown resources, mapping their distribution and evaluating them efficiently may not be possible with the original image. For this reason, processing steps such as transformation, pre-processing, image enhancement and classification are needed to provide the most accurate assessment, both numerically and visually. Many studies presenting the phases of obtaining and processing satellite images have been examined in the literature study. The review shows that, with a common overall framework, the required processing steps for this subject can be defined and can help the necessary and possible future studies to progress rapidly.

Keywords: remote sensing, satellite imaging, GIS, computer science, information

Procedia PDF Downloads 301
13469 Budgetary Performance Model for Managing Pavement Maintenance

Authors: Vivek Hokam, Vishrut Landge

Abstract:

An ideal maintenance program for an industrial road network is one that would maintain all sections at a sufficiently high level of functional and structural condition. However, due to various constraints such as budget, manpower and equipment, it is not possible to carry out maintenance on all the needy industrial road sections within a given planning period. A rational and systematic priority scheme needs to be employed to select and schedule industrial road sections for maintenance. Priority analysis is a multi-criteria process that determines the best ranking list of sections for maintenance based on several factors. In priority setting, difficult decisions are required for the selection of sections for maintenance. It is more important to repair a section with poor functional condition, which includes an uncomfortable ride, or poor structural condition, i.e., sections that are in danger of becoming structurally unsound. It would seem, therefore, that any rational priority-setting approach must consider the relative importance of the functional and structural condition of the section. Maintenance priority indices and pavement performance models tend to focus mainly on pavement condition, traffic criteria, etc. There is a need to develop a model suited to the limited budget provisions for pavement maintenance. Linear programming is one of the most popular and widely used quantitative techniques. A linear programming model provides an efficient method for determining an optimal decision chosen from a large number of possible decisions. The optimum decision is one that meets a specified objective of management, subject to various constraints and restrictions. The objective here is mainly the minimization of the maintenance cost of roads in an industrial area. In order to determine the objective function for the analysis of the distress model, it is necessary to fit realistic data into the formulation. Each type of repair is quantified in a number of stretches by considering 1000 m as one stretch; the section considered in the study is 3750 m long. These quantities enter an objective function that maximizes the number of repairs in a stretch related to the quantity. The distresses observed in this section are potholes, surface cracks, rutting and ravelling. The distress data are measured manually by observing each distress level on a stretch of 1000 m. The maintenance and rehabilitation measures that are currently followed are based on subjective judgments. Hence, there is a need to adopt a scientific approach in order to use the limited resources effectively. It is also necessary to determine pavement performance and deterioration prediction relationships more accurately, together with the economic benefits to road networks with respect to vehicle operating cost. The road network infrastructure should deliver the best results expected from the available funds. In this paper, the objective function for the distress model is determined by linear programming, and a deterioration model considering overloading is discussed.
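A minimal sketch of the kind of budget-constrained linear programme described above is given below, using SciPy. The unit repair costs, the budget and the per-type stretch limits are illustrative assumptions, not the paper's data, and a field formulation would typically add integrality constraints.

```python
from scipy.optimize import linprog

# Decision variables: number of 1000 m stretch-repairs of each distress type.
repair_types = ["potholes", "surface cracks", "rutting", "ravelling"]
unit_cost = [120_000, 80_000, 150_000, 60_000]  # cost per stretch (assumed values)
budget = 400_000                                 # available maintenance budget (assumed)
max_stretches = 4                                # ~3750 m section -> at most 4 stretches per type

# linprog minimises, so maximise the total number of repairs by minimising its negative.
c = [-1.0] * len(repair_types)

# One budget row plus one upper-bound row per repair type.
A_ub = [unit_cost] + [
    [1.0 if j == i else 0.0 for j in range(len(repair_types))]
    for i in range(len(repair_types))
]
b_ub = [budget] + [max_stretches] * len(repair_types)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(repair_types))
print(dict(zip(repair_types, res.x)), "repairs scheduled:", -res.fun)
```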

Keywords: budget, maintenance, deterioration, priority

Procedia PDF Downloads 187
13468 Towards the Need of Resilient Design and Its Assessment in South China

Authors: Alan Lai, Wilson Yik

Abstract:

With rapid urbanization, there has been a dramatic increase in the urban population of Asia, and over half of the population in Asia will live in urban regions in the near future. Facing increasing exposure to climate-related stresses and shocks, most Asian cities will very likely experience more frequent heat waves and flooding with rising sea levels; coastal cities in particular will grapple with intense typhoons and storm surges. These climatic changes have severe impacts on urban areas, at a cost to infrastructure and population, for example in terms of human health and wellbeing and high risks of dengue fever, malaria and diarrheal disease. With the increasing prominence of adaptation to climate change, there have been changes in the corresponding policies. Smaller cities have greater potential for integrating the concept of resilience into their infrastructure while keeping pace with their rapid growth in population. It is therefore important to explore the potential of Asian cities to adapt to climate change and the opportunities for building climate resilience into urban planning and building design. Furthermore, previous studies have mainly attempted to exploit the potential of resilience at the macro level of urban planning rather than at the micro level of the individual building. The resilience of the individual building as a research field has not yet been much explored. Nonetheless, recent studies define a resilient individual building as one which is able to respond to physical damage and recover from such damage quickly and cost-effectively while maintaining its primary functions. There is also a need to develop an assessment tool to evaluate resilience at the building scale, which is still largely uninvestigated even though it should be regarded as a basic function of a building. Due to the lack of literature reporting metrics for assessing building resilience together with sustainability, the research is designed as a case study to provide insight into the issue. The aim of this research project is to encourage and assist in developing neighborhood climate resilience design strategies for Hong Kong, so as to bridge the gap between different scales and between theory and practice.

Keywords: resilience cities, building resilience, resilient buildings and infrastructure, climate resilience, hot and humid southeast area, high-density cities

Procedia PDF Downloads 153
13467 Estimating Marine Tidal Power Potential in Kenya

Authors: Lucy Patricia Onundo, Wilfred Njoroge Mwema

Abstract:

The rapidly diminishing fossil fuel reserves, their exorbitant cost and the increasingly apparent negative effects of fossil fuels on climate change are a wake-up call to explore renewable energy. Wind, biofuel and solar power have already become staples of the Kenyan electricity mix. The potential for electric power generation from marine tidal currents is enormous, with oceans covering more than 70% of the earth. However, marine tidal energy in Kenya has yet to be studied thoroughly, despite its promising cyclic, reliable and predictable nature and the vast energy contained within it. The high load factors resulting from the fluid properties and the predictable resource characteristics make marine currents particularly attractive for power generation and advantageous when compared with other renewables. Global-level resource assessments and oceanographic literature and data have been compiled in an analysis of the technology-specific requirements of tidal energy technologies and the physical resources. Temporal variations in resource intensity as well as the differences between small-scale applications are considered.
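For context, resource assessments of this kind usually start from the kinetic power carried by a tidal stream through a device's swept area (a standard relation, not taken from the paper), where $\rho$ is the seawater density, $A$ the swept cross-sectional area, $v$ the current speed and $C_p$ the fraction of that power a turbine can extract:

\[
P = \tfrac{1}{2}\,\rho\,A\,v^{3}, \qquad P_{\text{extracted}} = C_p\,\tfrac{1}{2}\,\rho\,A\,v^{3}.
\]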

Keywords: tidal power, renewable energy, energy assessment, Kenya

Procedia PDF Downloads 548
13466 Quantifying Product Impacts on Biodiversity: The Product Biodiversity Footprint

Authors: Leveque Benjamin, Rabaud Suzanne, Anest Hugo, Catalan Caroline, Neveux Guillaume

Abstract:

Human product consumption is one of the main drivers of biodiversity loss. However, few pertinent ecological indicators of the impact of a product's life cycle on species and ecosystems have been built. Life cycle assessment (LCA) methodologies are well under way to provide standardized methods to assess this impact, already partially taking into account three of the Millennium Ecosystem Assessment pressures (land use, pollution, climate change). Coupling LCA with ecological data and methods is an emerging challenge in developing a product biodiversity footprint. This approach was tested on three case studies from the food processing, textile, and cosmetics industries. It allowed, first, an improvement in the environmental relevance of the Potentially Disappeared Fraction of species, the end-point indicator typically used in life cycle analysis methods, and, second, the introduction of new indicators on overexploitation and invasive species. This type of footprint is a major step in helping companies identify their impacts on biodiversity and propose potential improvements.

Keywords: biodiversity, companies, footprint, life cycle assessment, products

Procedia PDF Downloads 310
13465 CRISPR Technology: A Tool in the Potential Cure for COVID-19 Virus

Authors: Chijindu Okpalaoka, Charles Chinedu Onuselogu

Abstract:

COVID-19, the human coronavirus disease caused by SARS-CoV-2, was first detected in late 2019 in Wuhan, China. COVID-19 lacked an established conventional pharmaceutical therapy, and as a result the outbreak quickly grew to affect the entire world. Only a qPCR assay is reliable for diagnosing COVID-19. Clustered regularly interspaced short palindromic repeats (CRISPR) technology is being researched for speedy and specific identification of COVID-19, among other therapeutic techniques. Apart from its diagnostic capabilities, the CRISPR technique is being evaluated for developing antiviral therapies; nevertheless, no CRISPR-based medication has been approved for human use to date. A prophylactic antiviral CRISPR in human cells, a Cas13-based approach against coronavirus, has been developed. While this method can be evolved into a treatment approach, it may face substantial obstacles in human clinical trials for licensure. This study discusses the potential applications of CRISPR-based techniques for developing a speedy, accurate and feasible treatment alternative for the COVID-19 virus.

Keywords: COVID-19, CRISPR technique, Cas13, SARS-CoV-2, prophylactic antiviral

Procedia PDF Downloads 108
13464 Potential Probiotic Bacteria Isolated from Dairy Products of Saudi Arabia

Authors: Rashad Al-Hindi

Abstract:

The aims of the study were to isolate and identify potential probiotic lactic acid bacteria, given their therapeutic and food preservation importance. Sixty-three suspected lactic acid bacteria (LAB) strains were isolated from thirteen different raw milk and fermented milk product samples of various animal origins manufactured indigenously in the Kingdom of Saudi Arabia, using de Man, Rogosa and Sharpe (MRS) agar medium and various incubation conditions. The identification of forty-six selected LAB strains was performed using molecular methods (16S rDNA gene sequencing). The LAB counts in certain samples were higher under microaerobic incubation conditions than under anaerobic conditions. The identified LAB belonged to the following genera: Enterococcus (16 strains), Lactobacillus (9 strains), Weissella (10 strains), Streptococcus (8 strains) and Lactococcus (3 strains), constituting 34.78%, 19.57%, 21.74%, 17.39% and 6.52% of the suspected isolates, respectively. This study noted that the raw milk and traditional fermented milk products of Saudi Arabia, especially stirred yogurt (laban) made from camel milk, could be rich in LAB. The LAB strains obtained in this study will be tested for their probiotic potential in another, ongoing study.

Keywords: dairy, LAB, probiotic, Saudi Arabia

Procedia PDF Downloads 272
13463 Possible Modulation of FAS and PTP-1B Signaling in Ameliorative Potential of Bombax ceiba against High Fat Diet Induced Obesity

Authors: Paras Gupta, Rohit Goyal, Yamini Chauhan, Pyare Lal Sharma

Abstract:

Background: Bombax ceiba Linn., commonly called semal, is used in various gastro-intestinal disturbances. It contains lupeol, which inhibits PTP-1B, adipogenesis, TG synthesis and the accumulation of lipids in adipocytes and adipokines, whereas the flavonoids isolated from B. ceiba have FAS-inhibitory activity. The present study was aimed at investigating the ameliorative potential of Bombax ceiba against experimental obesity in Wistar rats, and its possible mechanism of action. Methods: Male Wistar albino rats weighing 180–220 g were employed in the present study. Experimental obesity was induced by feeding a high-fat diet (HFD) for 10 weeks. Methanolic extract of B. ceiba at 100, 200 and 400 mg/kg and gemfibrozil at 50 mg/kg as the standard drug were given orally from the 7th to the 10th week. Results: Induction with HFD for 10 weeks caused a significant (p < 0.05) increase in % body weight, BMI and Lee index; serum glucose, triglyceride, LDL, VLDL, cholesterol, free fatty acid, ALT and AST; tissue TBARS and nitrate/nitrite levels; different fat pads and relative liver weight; and a significant decrease in food intake (g and kcal), serum HDL and tissue glutathione levels in HFD control rats. Treatment with B. ceiba extract and gemfibrozil significantly attenuated these HFD-induced changes compared to the HFD control. The effect of B. ceiba at 200 and 400 mg/kg was more pronounced than that of gemfibrozil. Conclusion: On the basis of the results obtained, it may be concluded that the methanolic extract of the stem bark of Bombax ceiba has significant ameliorative potential against HFD-induced obesity in rats, possibly through modulation of FAS and PTP-1B signaling due to the presence of flavonoids and lupeol.

Keywords: obesity, Bombax ceiba, free fatty acid, protein tyrosine phosphatase-1B, fatty acid synthase

Procedia PDF Downloads 381
13462 Input Data Balancing in a Neural Network PM-10 Forecasting System

Authors: Suk-Hyun Yu, Heeyong Kwon

Abstract:

Recently, PM-10 has become a social and global issue. It is one of the major air pollutants affecting human health, so it needs to be forecasted rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration level depends largely on meteorological and geographical factors of the local and global region, so forecasting PM-10 concentration is very difficult. A neural network model can be used in this case, but cases of high PM-10 concentration are rare, which makes training the neural network model difficult. In this paper, we suggest a simple input balancing method for when the data distribution is uneven, based on the probability of appearance of the data. Experimental results show that the input balancing makes the neural network's learning easier and improves the forecasting rates.
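One plausible reading of this appearance-probability balancing, sketched below, is to resample training examples with weights inversely proportional to how common their concentration range is, so that rare high-PM-10 cases are seen more often during training. The bin count, the synthetic data and the exact weighting rule are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
pm10 = rng.gamma(shape=2.0, scale=20.0, size=5000)    # synthetic PM-10 targets
features = rng.normal(size=(5000, 8))                  # synthetic meteorological inputs

# Estimate how common each sample's concentration range is.
counts, edges = np.histogram(pm10, bins=20)
bin_idx = np.clip(np.digitize(pm10, edges[1:-1]), 0, len(counts) - 1)
appearance_prob = counts[bin_idx] / counts.sum()

# Resample inversely to the probability of appearance: rare ranges are drawn more often.
weights = 1.0 / appearance_prob
weights /= weights.sum()
balanced = rng.choice(len(pm10), size=len(pm10), replace=True, p=weights)
X_balanced, y_balanced = features[balanced], pm10[balanced]
```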

Keywords: artificial intelligence, air quality prediction, neural networks, pattern recognition, PM-10

Procedia PDF Downloads 217
13461 Statistical Convergence for the Approximation of Linear Positive Operators

Authors: Neha Bhardwaj

Abstract:

In this paper, we consider positive linear operators and study a Voronovskaya-type result for the operator, and then obtain an error estimate in terms of the higher-order modulus of continuity of the function being approximated and its A-statistical convergence. We also compute the corresponding rate of A-statistical convergence for the linear positive operators.
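For reference, A-statistical convergence is conventionally defined as follows (a standard definition, not quoted from the paper): given a non-negative regular summability matrix $A=(a_{jn})$, a sequence $x=(x_n)$ is A-statistically convergent to $L$ if, for every $\varepsilon>0$,

\[
\lim_{j\to\infty}\ \sum_{n\,:\,|x_n-L|\ge\varepsilon} a_{jn}=0 .
\]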

Keywords: Poisson distribution, Voronovskaya, modulus of continuity, a-statistical convergence

Procedia PDF Downloads 313
13460 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data is the fundamental tool utilized by exploration companies to identify potential hydrocarbons. However, the value of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well to be drilled from data that has positional ambiguity will jeopardize the business decision and millions of dollars of investment, something every oil and gas company would like to avoid. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data are referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation helps improve the reliability and integrity of the sub-surface geological models produced by geoscientists and provides important input to potential hazard assessments where positional accuracy is crucial. This workflow development initiative is part of a larger geospatial integrity management effort, as nearly eighty percent of oil and gas data are location-dependent.

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 203
13459 Measurement of Fatty Acid Changes in Post-Mortem Belowground Carcass (Sus-scrofa) Decomposition: A Semi-Quantitative Methodology for Determining the Post-Mortem Interval

Authors: Nada R. Abuknesha, John P. Morgan, Andrew J. Searle

Abstract:

Information regarding the post-mortem interval (PMI) in criminal investigations is vital to establish a time frame when reconstructing events. PMI is defined as the time period that has elapsed between the occurrence of death and the discovery of the corpse. Adipocere, commonly referred to as ‘grave wax’, is formed when post-mortem adipose tissue is converted into a solid material that is largely composed of fatty acids. Adipocere is of interest to forensic anthropologists, as its formation is able to slow down the decomposition process. Therefore, analysing the changes in the patterns of fatty acids during the early decomposition process may make it possible to estimate the period of burial, and hence the PMI. The current study investigated the fatty acid composition and patterns in buried pig fat tissue, in an attempt to determine whether particular patterns of fatty acid composition can be shown to be associated with the duration of burial, and hence may be used to estimate PMI. Adipose tissue from the abdominal region of domestic pigs (Sus scrofa) was used to model the human decomposition process. A 17 x 20 cm piece of pork belly was buried in a shallow artificial grave, and weekly samples (n=3) of the buried pig fat tissue were collected over an 11-week period. The marker fatty acids palmitic (C16:0), oleic (C18:1n-9) and linoleic (C18:2n-6) acid were extracted from the buried pig fat tissue and analysed as fatty acid methyl esters using a gas chromatography system. Levels of the marker fatty acids were quantified from their respective standards. The concentrations of C16:0 (69.2 mg/mL) and C18:1n-9 (44.3 mg/mL) at time zero exhibited significant fluctuations during the burial period. Levels rose (to 116 and 60.2 mg/mL, respectively) and then fell from the second week to reach 19.3 and 18.3 mg/mL, respectively, at week 6. Levels showed another increase at week 9 (66.3 and 44.1 mg/mL, respectively) followed by a gradual decrease at week 10 (20.4 and 18.5 mg/mL, respectively). A sharp increase was observed in the final week (131.2 and 61.1 mg/mL, respectively). Conversely, the levels of C18:2n-6 remained more or less constant throughout the study. In addition to fluctuations in the concentrations, several new fatty acids appeared in the later weeks, while other fatty acids which were detectable in the time-zero sample were lost in the later weeks. There are several probable opportunities to utilise fatty acid analysis as a basic technique for approximating PMI: the quantification of marker fatty acids and the detection of selected fatty acids that either disappear or appear during the burial period. This pilot study indicates that this may be a potential semi-quantitative methodology for determining the PMI. Ideally, the analysis of particular fatty acid patterns in the early stages of decomposition could be an additional tool to the techniques already available for estimating the PMI of a corpse.

Keywords: adipocere, fatty acids, gas chromatography, post-mortem interval

Procedia PDF Downloads 115
13458 Servitization in Machine and Plant Engineering: Leveraging Generative AI for Effective Product Portfolio Management Amidst Disruptive Innovations

Authors: Till Gramberg

Abstract:

In the dynamic world of machine and plant engineering, stagnation in the growth of new product sales compels companies to reconsider their business models. The increasing shift toward service orientation, known as "servitization," along with challenges posed by digitalization and sustainability, necessitates an adaptation of product portfolio management (PPM). Against this backdrop, this study investigates the current challenges and requirements of PPM in this industrial context and develops a framework for the application of generative artificial intelligence (AI) to enhance agility and efficiency in PPM processes. The research approach of this study is based on a mixed-method design. Initially, qualitative interviews with industry experts were conducted to gain a deep understanding of the specific challenges and requirements in PPM. These interviews were analyzed using the Gioia method, painting a detailed picture of the existing issues and needs within the sector. This was complemented by a quantitative online survey. The combination of qualitative and quantitative research enabled a comprehensive understanding of the current challenges in the practical application of machine and plant engineering PPM. Based on these insights, a specific framework for the application of generative AI in PPM was developed. This framework aims to assist companies in implementing faster and more agile processes, systematically integrating dynamic requirements from trends such as digitalization and sustainability into their PPM process. Utilizing generative AI technologies, companies can more quickly identify and respond to trends and market changes, allowing for a more efficient and targeted adaptation of the product portfolio. The study emphasizes the importance of an agile and reactive approach to PPM in a rapidly changing environment. It demonstrates how generative AI can serve as a powerful tool to manage the complexity of a diversified and continually evolving product portfolio. The developed framework offers practical guidelines and strategies for companies to improve their PPM processes by leveraging the latest technological advancements while maintaining ecological and social responsibility. This paper significantly contributes to deepening the understanding of the application of generative AI in PPM and provides a framework for companies to manage their product portfolios more effectively and adapt to changing market conditions. The findings underscore the relevance of continuous adaptation and innovation in PPM strategies and demonstrate the potential of generative AI for proactive and future-oriented business management.

Keywords: servitization, product portfolio management, generative AI, disruptive innovation, machine and plant engineering

Procedia PDF Downloads 58
13457 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia cases can be defined as a language disorder wherein a patient can only use content words (nouns, verbs and adjectives) for communication, and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles, etc.) around content words (nouns, verbs and adjectives) using a combination of natural language processing and deep learning algorithms, whose applications can be used to assist communication. The approach the paper investigates is an LSTM-based sequence-to-sequence (seq2seq) model, which takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example consisting of a pair such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing functional words to get just the content words; however, this approach requires a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the input are preserved, i.e., they will not be altered after the functional grammar is slotted in. This is a potential limitation for cases of severe agrammatism, where such word order might not be inherently correct. The applications of this approach can be used to assist communication in mild agrammatism in non-fluent aphasia cases. Thus, by generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Our project thereby translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
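A minimal sketch of the training-pair generation step described above is shown below: complete sentences are stripped of function words to yield (content words, full sentence) pairs for a seq2seq model. The function-word list and the tokenizer are simplified illustrative assumptions, not the paper's actual preprocessing.

```python
import re

# A very small set of function words for illustration; a real system would use a
# comprehensive list or a POS tagger.
FUNCTION_WORDS = {
    "a", "an", "the", "and", "but", "or", "so", "to", "of", "in", "on", "at",
    "for", "with", "by", "is", "are", "was", "were", "be", "been", "that", "this",
}

def to_content_words(sentence):
    """Lowercase, tokenize, and drop function words, keeping only content words."""
    tokens = re.findall(r"[a-zA-Z']+", sentence.lower())
    return [t for t in tokens if t not in FUNCTION_WORDS]

def make_pairs(corpus):
    """Return (source, target) pairs: content-word sequence -> full sentence."""
    return [(" ".join(to_content_words(s)), s) for s in corpus if to_content_words(s)]

corpus = ["The dog is sleeping on the sofa.", "She went to the market for bread."]
for src, tgt in make_pairs(corpus):
    print(src, "->", tgt)   # e.g. "dog sleeping sofa -> The dog is sleeping on the sofa."
```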

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 149
13456 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos which differ only in pregnancy outcome, i.e., embryos from a single clinic which are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see if the rich spatiotemporal information embedded in this data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology: We did a retrospective analysis of time-lapse data from our IVF clinic, which uses the Embryoscope 100% of the time for embryo culture to the blastocyst stage, with known clinical outcomes of live birth vs nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single-transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined to make a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major Findings: The performance of the model was evaluated using 100 random subsampling cross-validations (train 80%, test 20%). The prediction accuracy, averaged across the 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy. Conclusion: The high accuracy in the main analysis and the low accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern which is associated with pregnancy outcomes, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data which contribute to successful pregnancy outcomes, regardless of already known parameters. Results on a larger sample size, with complementary analyses on the prediction of other key outcomes such as euploidy and aneuploidy of embryos, will be presented at the meeting.
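The evaluation protocol described above (100 random 80/20 train-test splits, with a shuffled-label control) can be sketched as follows. The synthetic features, the labels and the choice of logistic regression as classifier are illustrative assumptions standing in for the study's tensor-decomposition features and actual model.

```python
import numpy as np
from sklearn.model_selection import ShuffleSplit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))     # stand-in for tensor-decomposition features
y = np.repeat([0, 1], 100)         # 100 nonpregnant vs 100 live-birth embryos

def subsampling_accuracy(features, labels, n_splits=100):
    """Mean test accuracy over repeated random 80/20 train-test splits."""
    splitter = ShuffleSplit(n_splits=n_splits, test_size=0.2, random_state=0)
    scores = []
    for train_idx, test_idx in splitter.split(features):
        clf = LogisticRegression(max_iter=1000)
        clf.fit(features[train_idx], labels[train_idx])
        scores.append(clf.score(features[test_idx], labels[test_idx]))
    return float(np.mean(scores))

print("observed labels:", subsampling_accuracy(X, y))
print("shuffled labels:", subsampling_accuracy(X, rng.permutation(y)))  # ~0.5 control
```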

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 80
13455 A Hybrid Block Multistep Method for Direct Numerical Integration of Fourth Order Initial Value Problems

Authors: Adamu S. Salawu, Ibrahim O. Isah

Abstract:

A direct solution to several forms of fourth-order ordinary differential equations is not easily obtained without first reducing them to a system of first-order equations. Thus, numerical methods are being developed, with the underlying techniques in the literature, which seek to approximate some classes of fourth-order initial value problems with admissible error bounds. Multistep methods present the great advantage of ease of implementation, but with the setback of several function evaluations at every stage of implementation. However, hybrid methods conventionally show a slightly higher order of truncation than any k-step linear multistep method, with the possibility of obtaining solutions at off-mesh points within the interval of solution. In the light of the foregoing, we propose the continuous form of a hybrid multistep method with a Chebyshev polynomial as basis function for the numerical integration of fourth-order initial value problems of ordinary differential equations. The basis function is interpolated and collocated at some points on the interval [0, 2] to yield a system of equations, which is solved to obtain the unknowns of the approximating polynomial. The continuous form obtained and its first and second derivatives are evaluated at carefully chosen points to obtain the proposed block method needed to directly approximate fourth-order initial value problems. The method is analyzed for convergence. Implementation of the method is done by conducting numerical experiments on some test problems. The outcome of the implementation suggests that the method performs well on problems with oscillatory or trigonometric terms, since the approximations at several points on the solution domain did not deviate too far from the theoretical solutions. The method also shows better performance than an existing hybrid method when implemented on a larger interval of solution.
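In generic terms, the approximation and collocation step described above takes the following shape (standard notation; the number of basis terms, the interpolation points and the collocation points are left unspecified here, as they are not given in the abstract). The solution of $y^{(4)} = f(x, y, y', y'', y''')$ is approximated by a Chebyshev expansion whose fourth derivative is collocated against the right-hand side:

\[
y(x) \approx \sum_{j=0}^{m} a_j\,T_j(x), \qquad
\sum_{j=0}^{m} a_j\,T_j^{(4)}(x_k) = f\bigl(x_k,\, y(x_k),\, y'(x_k),\, y''(x_k),\, y'''(x_k)\bigr),
\]

together with interpolation conditions $\sum_j a_j T_j(x_i) = y_i$ at selected mesh and off-mesh points; the resulting system determines the coefficients $a_j$, and evaluating the continuous form and its derivatives at chosen points yields the block method.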

Keywords: Chebyshev polynomial, collocation, hybrid multistep method, initial value problems, interpolation

Procedia PDF Downloads 112