Search results for: block linear multistep methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18436

14296 Coal Fly Ash Based Ceramic Membrane for Water Purification via Ultrafiltration

Authors: Obsi Terfasa, Bhanupriya Das, Shiao-Shing Chen

Abstract:

Converting coal fly ash (CFA) waste into ceramic membranes presents a promising alternative to traditional disposal methods, offering potential economic and environmental advantages that warrant further investigation. This research focuses on the creation of ceramic membranes exclusively from CFA using a uniaxial compaction technique. The membranes' properties were examined through various analytical methods: Scanning Electron Microscopy (SEM) revealed a porous and flawless membrane surface, X-Ray Diffraction (XRD) identified mullite and quartz crystalline structures, and Fourier-Transform Infrared Spectroscopy (FTIR) characterized the membrane's functional groups. Thermogravimetric analysis (TGA) determined the ideal sintering temperature to be 800°C. To evaluate its separation capabilities, the synthesized membrane was tested on wastewater from denim jeans production at 0.2 bar pressure. The results were impressive, with 97.42% removal of Chemical Oxygen Demand (COD), 95% color elimination, and a pure water flux of 4.5 Lm⁻²h⁻¹bar⁻¹. These findings suggest that CFA, a byproduct of thermal power plants, can be effectively repurposed to produce ultrafiltration membranes suitable for a range of industrial purification and separation applications.
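The two headline figures follow from standard membrane bookkeeping. A minimal sketch, with hypothetical inputs chosen only to reproduce the quoted numbers (the actual feed concentrations and test-cell geometry are not given in the abstract):

```python
def removal_percent(c_in_mg_l, c_out_mg_l):
    """Percent of a contaminant (e.g. COD) removed by the membrane."""
    return (c_in_mg_l - c_out_mg_l) / c_in_mg_l * 100.0

def permeability(volume_l, area_m2, hours, pressure_bar):
    """Pressure-normalized pure-water flux, in L m^-2 h^-1 bar^-1."""
    return volume_l / (area_m2 * hours * pressure_bar)

# Hypothetical feed/permeate CODs and a hypothetical test-cell geometry:
print(round(removal_percent(1000.0, 25.8), 2))        # 97.42 % COD removal
print(round(permeability(0.009, 0.01, 1.0, 0.2), 1))  # 4.5 L/m^2/h/bar
```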

Keywords: wastewater treatment, separator, coal fly ash, ceramic membrane, ultrafiltration

Procedia PDF Downloads 12
14295 Quantitative Wide-Field Swept-Source Optical Coherence Tomography Angiography and Visual Outcomes in Retinal Artery Occlusion

Authors: Yifan Lu, Ying Cui, Ying Zhu, Edward S. Lu, Rebecca Zeng, Rohan Bajaj, Raviv Katz, Rongrong Le, Jay C. Wang, John B. Miller

Abstract:

Purpose: Retinal artery occlusion (RAO) is an ophthalmic emergency that can lead to poor visual outcome and is associated with an increased risk of cerebral stroke and cardiovascular events. Fluorescein angiography (FA) is the traditional diagnostic tool for RAO; however, wide-field swept-source optical coherence tomography angiography (WF SS-OCTA), as a nascent imaging technology, is able to provide quick and non-invasive angiographic information with a wide field of view. In this study, we looked for associations between OCT-A vascular metrics and visual acuity in patients with prior diagnosis of RAO. Methods: Patients with diagnoses of central retinal artery occlusion (CRAO) or branch retinal artery occlusion (BRAO) were included. A 6mm x 6mm Angio and a 15mm x 15mm AngioPlex Montage OCT-A image were obtained for both eyes in each patient using the Zeiss Plex Elite 9000 WF SS-OCTA device. Each 6mm x 6mm image was divided into nine Early Treatment Diabetic Retinopathy Study (ETDRS) subfields. The average measurement of the central foveal subfield, inner ring, and outer ring was calculated for each parameter. Non-perfusion area (NPA) was manually measured using 15mm x 15mm Montage images. A linear regression model was utilized to identify correlations between the imaging metrics and visual acuity. A P-value less than 0.05 was considered statistically significant. Results: Twenty-five subjects were included in the study. For RAO eyes, there was a statistically significant negative correlation between vision and retinal thickness as well as superficial capillary plexus vessel density (SCP VD). A negative correlation was found between vision and deep capillary plexus vessel density (DCP VD) without statistical significance. There was a positive correlation between vision and choroidal thickness as well as choroidal volume without statistical significance.
No statistically significant correlation was found between vision and the above metrics in contralateral eyes. For NPA measurements, no significant correlation was found between vision and NPA. Conclusions: To the best of our knowledge, this is the first study to investigate the utility of WF SS-OCTA in RAO and to demonstrate correlations between various retinal vascular imaging metrics and visual outcomes. Further investigations should explore the associations between these imaging findings and cardiovascular risk, as RAO patients are at elevated risk for symptomatic stroke. The results of this study provide a basis for understanding the structural changes involved in visual outcomes in RAO. Furthermore, they may help guide management of RAO and prevention of cerebral stroke and cardiovascular events in patients with RAO.

Keywords: OCTA, swept-source OCT, retinal artery occlusion, Zeiss Plex Elite

Procedia PDF Downloads 132
14294 About the Interface Bonding Safety of Adhesively Bonded Concrete Joints Under Cracking: A Fracture Energetic Approach

Authors: Brandtner-Hafner Martin

Abstract:

Adhesives are increasingly being used in the construction sector. On the one hand, this concerns dowel reinforcements using chemical anchors. On the other hand, the sealing and repair of cracks in structural concrete components are still on the rise. In the field of bonding, the interface between the joined materials is the most critical area. Therefore, it is of immense importance to characterize and investigate this region sufficiently by fracture analysis. Since standardized mechanical test methods are not sufficiently capable of doing this, recourse is made to an innovative concept based on fracture energy. Therefore, a series of experimental tests were performed using the so-called GF-principle to study the interface bonding safety of adhesively bonded concrete joints. Several different structural adhesive systems based on epoxy, CA/A hybrid, PUR, MS polymer, dispersion, and acrylate were selected for bonding concrete substrates. The results show that stable crack propagation and prevention of uncontrolled failure in bonded concrete joints depend very much on the adhesive system used, and only fracture-analytical evaluation methods can provide empirical information on this.
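The fracture-energy idea behind the GF-principle can be sketched numerically: the work of fracture is the area under the measured load-displacement curve, normalized by the fracture surface. The curve and bond area below are invented for illustration; the abstract does not specify the actual test protocol.

```python
# Illustrative load-displacement record of a bonded joint under test.
loads_n = [0.0, 120.0, 200.0, 160.0, 90.0, 30.0, 0.0]  # load, N
disp_mm = [0.0, 0.05, 0.12, 0.20, 0.30, 0.45, 0.60]    # displacement, mm

# Trapezoidal integration of load over displacement -> work of fracture (N*mm).
work = sum(0.5 * (loads_n[i] + loads_n[i + 1]) * (disp_mm[i + 1] - disp_mm[i])
           for i in range(len(loads_n) - 1))

area_mm2 = 500.0          # bonded cross-section (illustrative)
g_f = work / area_mm2     # specific fracture energy, N/mm (numerically kJ/m^2)
print(round(g_f, 4))
```

A stable crack propagation shows up as a long, gradually decaying tail of the curve; brittle, uncontrolled failure as an abrupt drop, which yields a much smaller energy for the same peak load.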

Keywords: interface bonding safety, adhesively bonded concrete joints, GF-principle, fracture analysis

Procedia PDF Downloads 298
14293 A Clustering-Based Approach for Weblog Data Cleaning

Authors: Amine Ganibardi, Cherif Arab Ali

Abstract:

This paper addresses the data cleaning issue as a part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users’ clicks and underlying user-agents’ hits. As Web Usage Mining is interested in end-users’ behavior, user-agents’ hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: a server records requests interlaced in sequential order regardless of their source or type, and website resources may be set up as requestable interchangeably by end-users and user-agents. The current methods are content-centric, based on filtering heuristics of relevant/irrelevant items in terms of some cleaning attributes, i.e., website resources’ filetype extensions, website resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods need exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: resources may be set up as clickable by end-users regardless of their type, website resources are indexed by frame names without filetype extensions, and web contents are generated and cancelled differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the requested- and referring-resources attribute levels. It is insensitive to logging content and does not need extra-weblog data. The statistical property used captures the structure of the logging generated by webpage requests in terms of clicks and hits.
Since a webpage consists of a single URI and several components, this structure results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters by applying an appropriate distance to the frequency matrix of the requested and referring resources levels. As the clicks-to-hits ratio is single to multiple, the clicks cluster is the smaller one in number of requests. Hierarchical agglomerative clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hits ratios range from 1/2 to 1/15. The optimal clustering, on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. Evaluation of the smaller cluster, referred to as the clicks cluster, in terms of confusion-matrix indicators yields a 97% true-positive rate. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web design without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analysis.
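A minimal sketch of the proposed cleaning step on a toy log, assuming a two-column frequency matrix (requested- and referring-resource frequencies) and using plain Euclidean average linkage in place of the paper's Gower distance:

```python
import math

# Toy weblog: each request = (requested resource, referring resource).
# A page URI (click) is requested once per page view, while its embedded
# components (hits) recur across views, so their frequencies differ.
log = [
    ("/p1", "-"), ("/a", "/p1"), ("/b", "/p1"), ("/s", "/p1"),
    ("/p2", "-"), ("/a", "/p2"), ("/b", "/p2"), ("/s", "/p2"),
    ("/p3", "-"), ("/a", "/p3"), ("/b", "/p3"), ("/s", "/p3"),
]

# Frequency matrix at the requested- and referring-resource levels.
req_freq, ref_freq = {}, {}
for req, ref in log:
    req_freq[req] = req_freq.get(req, 0) + 1
    ref_freq[ref] = ref_freq.get(ref, 0) + 1
X = [[req_freq[req], ref_freq[ref]] for req, ref in log]

def avg_linkage(ca, cb):
    """Average-linkage distance between two clusters of request indices."""
    return sum(math.dist(X[i], X[j]) for i in ca for j in cb) / (len(ca) * len(cb))

# Naive hierarchical agglomerative clustering down to two clusters.
clusters = [[i] for i in range(len(X))]
while len(clusters) > 2:
    best = None
    for a in range(len(clusters)):
        for b in range(a + 1, len(clusters)):
            d = avg_linkage(clusters[a], clusters[b])
            if best is None or d < best[0]:
                best = (d, a, b)
    _, a, b = best
    clusters[a] = clusters[a] + clusters[b]
    del clusters[b]

# The clicks cluster is the smaller of the two (one click -> several hits).
clicks_idx = min(clusters, key=len)
clicks = sorted({log[i][0] for i in clicks_idx})
print(clicks)  # ['/p1', '/p2', '/p3']
```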

Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data

Procedia PDF Downloads 167
14292 Safety of Ports, Harbours, Marine Terminals: Application of Quantitative Risk Assessment

Authors: Dipak Sonawane, Sudarshan Daga, Somesh Gupta

Abstract:

Quantitative risk assessment (QRA) is a precise and consistent approach to defining the likelihood, consequence and severity of a major incident/accident. A variety of hazardous cargoes in bulk, such as hydrocarbons and flammable/toxic chemicals, are handled at various ports. It is well known that most of these operations are hazardous, having the potential to damage property, cause injury/loss of life and, in some cases, threaten environmental damage. In order to ensure adequate safety towards life, environment and property, the application of scientific methods such as QRA is inevitable. By means of these methods, comprehensive hazard identification, risk assessment and appropriate implementation of risk control measures can be carried out. In this paper, the authors, based on their extensive experience in risk analysis for ports and harbours, show how QRA can be used in practice to minimize and contain risk to tolerable levels. A specific case involving the unloading of hydrocarbon at a port is presented. The exercise provides confidence that the QRA method, as proposed by the authors, can be used appropriately for hazard identification and risk assessment of ports and terminals.
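The core arithmetic of a QRA can be sketched as a frequency-times-consequence sum. The scenarios, frequencies and fatality probabilities below are invented round numbers, not taken from the study:

```python
# Hypothetical scenarios for a hydrocarbon unloading operation: each has an
# annual frequency (per year) and a conditional probability of fatality at
# the location of interest. All numbers are illustrative.
scenarios = [
    {"name": "hose leak, ignited",       "freq": 1e-4, "p_fatal": 0.05},
    {"name": "hose rupture, jet fire",   "freq": 5e-6, "p_fatal": 0.50},
    {"name": "tank overfill, pool fire", "freq": 2e-5, "p_fatal": 0.20},
]

# Location-specific individual risk: sum of frequency x fatality probability.
individual_risk = sum(s["freq"] * s["p_fatal"] for s in scenarios)

# Compare against a commonly used tolerability criterion (e.g. 1e-4/yr).
TOLERABLE = 1e-4
print(f"IR = {individual_risk:.2e}/yr, tolerable: {individual_risk < TOLERABLE}")
```

Societal risk follows the same pattern, except that each scenario contributes a frequency and an expected number of fatalities, plotted as an F-N curve rather than summed to a single number.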

Keywords: quantitative risk assessment, hazard assessment, consequence analysis, individual risk, societal risk

Procedia PDF Downloads 76
14291 Investigation of Drought Resistance in Iranian Sesamum Germplasm

Authors: Fatemeh Najafi

Abstract:

The major stress factor limiting growth and development of sesame (Sesamum indicum L.) is drought in the arid and semiarid regions of the world. In this study, the effects of water stress on some qualitative and quantitative traits in sesame germplasm were evaluated at the Research Farm of the Seed and Plant Improvement Institute, Karaj, during the crop year. Genotypes were studied in a randomized complete block design with three replications in two environments (moisture stress and normal) with regard to seed weight, capsule weight, grain yield, biomass, plant height, number of capsules per plant, etc., and the traits were evaluated by combined analysis. Irrigation was scheduled on the basis of a Class A evaporation pan, and drought stress was applied after the flowering stage. The water deficit shortened the growth period: days to full ripening decreased significantly at the five percent level. Drought stress also reduced yield and plant biomass, and genotype effects on these two traits were significant at the one percent level in the combined analysis. Under stress, genotypes differed in plot density, grain yield, days to first flowering and days to 50% capsule formation at the five percent confidence level, and in days to emergence of the first capsule and days to full ripening at the one percent level; other traits were not significant. Correlation analysis under stress showed that the number of seeds per capsule had the greatest impact on yield. Stress sensitivity and tolerance indices were calculated; based on these indicators, the Fars and Karaj varieties were identified as the most drought-tolerant of the studied genotypes, while the highest stress-sensitivity index belonged to the Fars genotype.

Keywords: sesamum, drought, stress, germplasm, resistance

Procedia PDF Downloads 66
14290 Inductive Grammar, Student-Centered Reading, and Interactive Poetry: The Effects of Teaching English with Fun in Schools of Two Villages in Lebanon

Authors: Talar Agopian

Abstract:

Teaching English as a Second Language (ESL) is a common practice in many Lebanese schools. However, ESL teaching is done in traditional ways. Methods such as constructivism are seldom used, especially in villages. Here lies the significance of this research which joins constructivism and Piaget’s theory of cognitive development in ESL classes in Lebanese villages. The purpose of the present study is to explore the effects of applying constructivist student-centered strategies in teaching grammar, reading comprehension, and poetry on students in elementary ESL classes in two villages in Lebanon, Zefta in South Lebanon and Boqaata in Mount Lebanon. 20 English teachers participated in a training titled “Teaching English with Fun”, which focused on strategies that create a student-centered class where active learning takes place and there is increased learner engagement and autonomy. The training covered three main areas in teaching English: grammar, reading comprehension, and poetry. After participating in the training, the teachers applied the new strategies and methods in their ESL classes. The methodology comprised two phases: in phase one, practice-based research was conducted as the teachers attended the training and applied the constructivist strategies in their respective ESL classes. Phase two included the reflections of the teachers on the effects of the application of constructivist strategies. The results revealed the educational benefits of constructivist student-centered strategies; the students of teachers who applied these strategies showed improved engagement, positive attitudes towards poetry, increased motivation, and a better sense of autonomy. Future research is required in applying constructivist methods in the areas of writing, spelling, and vocabulary in ESL classrooms of Lebanese villages.

Keywords: active learning, constructivism, learner engagement, student-centered strategies

Procedia PDF Downloads 133
14289 Evaluation of Dynamic Log Files for Different Dose Rates in IMRT Plans

Authors: Saad Bin Saeed, Fayzan Ahmed, Shahbaz Ahmed, Amjad Hussain

Abstract:

The aim of this study is to evaluate dynamic log files (dynalogs) at different dose rates by means of dose-volume histograms (DVH) and to assess their use as a quality assurance (QA) procedure for IMRT. Seven phase-one head and neck cancer patients with similar organs at risk (OARs) were selected randomly. Reference plans at dose rates of 300 and 600 MU/min, with a prescribed dose of 50 Gy in 25 fractions, were made for each patient. Dynalogs produced during delivery of the reference plans were processed by an in-house MATLAB program, which produces new field files containing the actual positions of the multi-leaf collimators (MLCs) instead of the planned positions in the reference plans. Copies of the reference plans were used to import the new field files generated by the MATLAB program and renamed Dyn.plans. After dose calculation of the Dyn.plans for the different dose rates, DVH and multiple linear regression tools were used to evaluate the reference and Dyn.plans. The results indicate good correlation between the different dose-rate plans. The maximum dose differences for the PTV and OARs were found to be less than 5% and 9%, respectively. The study indicates the potential of dynalogs to be used for patient-specific QA of IMRT at different dose rates.
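The dynalog comparison can be sketched as follows: line up planned against delivered leaf positions and summarize the deviation per leaf. The in-house program is in MATLAB and rewrites field files; this Python fragment with invented positions only mirrors the idea.

```python
# Planned vs. delivered MLC leaf positions (mm), snapshots x leaves.
# All numbers are invented for illustration.
planned = [[10.0, 12.0, 15.0], [10.5, 12.4, 15.2]]
actual  = [[10.1, 11.9, 15.1], [10.4, 12.5, 15.0]]

n_snap, n_leaf = len(planned), len(planned[0])
rms = []
for leaf in range(n_leaf):
    sq = sum((actual[s][leaf] - planned[s][leaf]) ** 2 for s in range(n_snap))
    rms.append((sq / n_snap) ** 0.5)  # per-leaf RMS position error

# In the study, the actual positions replace the planned ones in a new field
# file, the dose is recalculated, and the resulting DVHs are compared.
print([round(r, 3) for r in rms])  # [0.1, 0.1, 0.158]
```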

Keywords: IMRT, dynalogs, dose rate, DVH

Procedia PDF Downloads 527
14288 Urban Corridor Management Strategy Based on Intelligent Transportation System

Authors: Sourabh Jain, Sukhvir Singh Jain, Gaurav V. Jain

Abstract:

Intelligent Transportation System (ITS) is the application of technology for developing a user-friendly transportation system for urban areas in developing countries. The goal of urban corridor management using ITS in road transport is to achieve improvements in mobility, safety, and the productivity of the transportation system within the available facilities through the integrated application of advanced monitoring, communications, computer, display, and control process technologies, both in the vehicle and on the road. This paper reviews past studies of ITS deployments in urban corridors in India and abroad, the current scenario, and the methodology considered for planning, design, and operation of traffic management systems. It also presents an effort to interpret and evaluate the performance of the 27.4 km long study corridor, which has eight intersections and four flyovers and consists of 6-lane and 8-lane divided road sections. Two categories of data were collected in February 2016: traffic data (traffic volume, spot speed, delay) and road characteristics data (number of lanes, lane width, bus stops, mid-block sections, intersections, flyovers). The instruments used for collecting the data were a video camera, radar gun, mobile GPS and stopwatch. The analysis covered identification of peak and off-peak hours, congestion and level of service (LOS) at mid-block sections, and delay, followed by plotting speed contours and recommending urban corridor management strategies. From the analysis, it is found that ITS-based urban corridor management strategies will be useful to reduce congestion, fuel consumption and pollution so as to provide comfort and efficiency to the users. The paper presents urban corridor management strategies based on sensors incorporated in both vehicles and on the roads.
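One common way to summarize mid-block performance of the kind described above is to classify level of service from a volume-to-capacity ratio. The thresholds below are hypothetical round numbers, not values taken from the study:

```python
# Illustrative LOS classification from a volume-to-capacity (v/c) ratio.
def level_of_service(volume_pcu_h, capacity_pcu_h):
    vc = volume_pcu_h / capacity_pcu_h
    for grade, limit in [("A", 0.3), ("B", 0.5), ("C", 0.7),
                         ("D", 0.85), ("E", 1.0)]:
        if vc <= limit:
            return grade, vc
    return "F", vc  # oversaturated flow

# E.g. a counted peak-hour volume of 3400 PCU/h on a 4000 PCU/h section:
grade, vc = level_of_service(3400.0, 4000.0)
print(grade, round(vc, 2))  # D at v/c = 0.85
```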

Keywords: congestion, ITS strategies, mobility, safety

Procedia PDF Downloads 438
14287 Investigation of Geothermal Gradient of the Niger Delta from Recent Studies

Authors: Adedapo Jepson Olumide, Kurowska Ewa, K. Schoeneich, Ikpokonte A. Enoch

Abstract:

In this paper, subsurface temperatures measured from continuous temperature logs were used to determine the geothermal gradient of the Niger Delta sedimentary basin. The measured temperatures were corrected to true subsurface temperatures by applying the American Association of Petroleum Geologists (AAPG) correction factor, a borehole temperature correction factor with La Max’s correction factor, and the Zeta Utilities borehole correction factor. The geothermal gradient in this basin ranges from 1.2°C/100m to 7.56°C/100m. Six geothermal anomaly centres were observed at depth in the southern parts of the Abakaliki anticlinorium around the Onitsha, Ihiala and Umuahia areas and named A1 to A6, while two more centres appeared at depths of 3500 m and 4000 m, named A7 and A8 respectively. Anomaly A1 describes the southern end of the Abakaliki anticlinorium and extends southwards; anomalies A2 to A5 were found to be associated with a NW-SE structural alignment of the Calabar hinge line, with structures describing the edge of the Niger Delta basin against the basement block of the Oban massif. Anomaly A6 is located in the south-eastern part of the basin offshore, while A7 and A8 are located in the south-western part of the basin offshore. At the average exploratory depth of 3500 m, the geothermal gradient values for anomalies A1, A2, A3, A4, A5, A6, A7, and A8 are 6.5°C/100m, 1.75°C/100m, 7.5°C/100m, 1.25°C/100m, 6.5°C/100m, 5.5°C/100m, 6°C/100m, and 2.25°C/100m respectively. The A8 area may yield higher thermal values at depths greater than 3500 m. These results show that the areas of anomalies A1, A3, A5, A6 and A7 are potentially prospective and explorable for geothermal energy using abandoned oil wells in the study area. Anomalies A1, A3, A5 and A6 occur in areas where drilled boreholes were not exploitable for oil and gas; for the remaining areas, where wells are exploitable, no geothermal anomaly appears. Geothermal energy is environmentally friendly, clean and renewable.
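The underlying gradient computation is simple: corrected log temperature minus surface temperature, divided by depth, expressed per 100 m. A sketch with invented values (the flat correction offset below is a placeholder, not the actual AAPG correction):

```python
# Minimal geothermal-gradient computation from a corrected log temperature.
def geothermal_gradient(t_bottom_c, t_surface_c, depth_m, correction_c=0.0):
    """Return the gradient in degC per 100 m."""
    t_true = t_bottom_c + correction_c  # correct the measured log temperature
    return (t_true - t_surface_c) / depth_m * 100.0

# E.g. 120 degC measured at 3500 m, 27 degC mean surface temperature,
# +5 degC correction (all values illustrative):
g = geothermal_gradient(120.0, 27.0, 3500.0, correction_c=5.0)
print(round(g, 2))  # 2.8 degC / 100 m
```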

Keywords: temperature logs, geothermal gradient anomalies, alternative energy, Niger delta basin

Procedia PDF Downloads 271
14286 Preparation of Composite Alginate/Perlite Beads for Pb (II) Removal in Aqueous Solution

Authors: Hasan Türe, Kader Terzioglu, Evren Tunca

Abstract:

Contamination of the aqueous environment by heavy metal ions is a serious and complex problem, owing to their hazards to human beings and ecological systems. Treatment methods used for removing metal ions from aqueous solution include membrane separation, ion exchange and chemical precipitation; however, these methods are limited by high operational cost. Recently, bio-based beads have been considered promising biosorbents for removing heavy metal ions from water. The aim of the present study was to characterize alginate/perlite composite beads and to investigate their adsorption performance for removing Pb (II) from aqueous solution. Alginate beads were synthesized by ionic gelation, and different amounts of perlite (alginate:perlite = 1, 2, 3, 4, 5 wt./wt.) were incorporated into the beads. Samples were characterized by means of X-ray diffraction (XRD), thermogravimetric analysis (TGA) and scanning electron microscopy (SEM). The effects of perlite level, initial Pb (II) concentration, initial pH of the Pb (II) solution and contact time on the adsorption capacity of the beads were investigated using the batch method. XRD analysis indicated that perlite includes a silicon, or silicon- and aluminum-bearing, crystalline phase. The diffraction pattern of the perlite-containing beads is similar to that of the perlite powder, with reduced intensity. SEM analysis revealed that perlite was embedded in the alginate polymer, and SEM-EDX (energy-dispersive X-ray) showed that the composite beads (alginate:perlite = 1) were composed of C (41.93 wt.%), O (43.64 wt.%), Na (10.20 wt.%), Al (0.74 wt.%), Si (2.72 wt.%) and K (0.77 wt.%). According to the TGA analysis, incorporation of perlite significantly improved the thermal stability of the samples. Batch experiments indicated that the optimum pH for Pb (II) adsorption was 7, with a 1-hour contact time.
It was also found that the adsorption capacity of the beads decreased as the perlite concentration increased. The results imply that alginate/perlite composite beads could be used as promising adsorbents for the removal of Pb (II) from wastewater. Acknowledgement: This study was supported by TUBITAK (Project No: 214Z146).
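The adsorption capacity reported in such batch experiments is conventionally computed as q\_e = (C0 - Ce)·V/m. A minimal sketch with illustrative concentrations (the abstract does not report the raw values):

```python
# Standard batch-adsorption bookkeeping: equilibrium uptake and % removal.
def batch_adsorption(c0_mg_l, ce_mg_l, volume_l, mass_g):
    qe = (c0_mg_l - ce_mg_l) * volume_l / mass_g    # mg Pb(II) per g of beads
    removal = (c0_mg_l - ce_mg_l) / c0_mg_l * 100.0  # percent removed
    return qe, removal

# Illustrative run: 100 mg/L initial, 12 mg/L at equilibrium,
# 50 mL of solution, 0.1 g of beads.
qe, removal = batch_adsorption(100.0, 12.0, 0.05, 0.1)
print(round(qe, 1), round(removal, 1))  # 44.0 mg/g, 88.0 %
```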

Keywords: alginate, adsorption, beads, perlite

Procedia PDF Downloads 281
14285 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables. Past occurrences are exploited to predict and to derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when the amount of data is large. In fact, because of their volume, their nature (semi- or unstructured) and their variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
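The kind of change involved can be illustrated on the core of CART, the best-split search, which is independent per feature and therefore parallelizable. A toy sketch under that assumption (a thread pool stands in for a distributed cluster, and the data and splits are invented; the paper's actual extension is not reproduced here):

```python
from concurrent.futures import ThreadPoolExecutor

# Tiny two-feature dataset with binary labels.
X = [[2.0, 7.0], [3.0, 6.0], [8.0, 1.0], [9.0, 2.0]]
y = [0, 0, 1, 1]

def gini(labels):
    """Gini impurity of a label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 * p1 - (1.0 - p1) ** 2

def best_split_for_feature(j):
    """Best (impurity, threshold, feature) for feature j.

    Each feature's search is independent, so the calls can run in
    parallel across threads, processes, or cluster nodes.
    """
    best = (float("inf"), 0.0)
    for t in sorted({row[j] for row in X}):
        left = [y[i] for i, row in enumerate(X) if row[j] <= t]
        right = [y[i] for i, row in enumerate(X) if row[j] > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        best = min(best, (score, t))
    return best + (j,)

# Fan the per-feature searches out, then take the global best split.
with ThreadPoolExecutor() as pool:
    score, threshold, feature = min(pool.map(best_split_for_feature, [0, 1]))
print(feature, threshold, score)
```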

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 135
14284 Critical Review of Oceanic and Geological Storage of Carbon Sequestration

Authors: Milad Nooshadi, Alessandro Manzardo

Abstract:

CO₂ emissions in the atmosphere continue to rise, mostly as a result of the combustion of fossil fuels. CO₂ injection into the oceans and into geological formations, as processes of physical carbon capture, are two of the most promising emerging strategies for mitigating climate change and global warming. The purpose of this research is to evaluate these two methods of CO₂ sequestration and to assess information on previous and current advancements, limitations, and uncertainties associated with carbon sequestration, in order to identify possible prospects for ensuring the timely implementation of the technology, such as determining how governments and companies can gain a better understanding of CO₂ storage in terms of which media have the most applicable capacity, which type of injection has the smaller environmental impact, and how much carbon sequestration and storage will cost. In ocean storage, the behavior of several forms is characterized as a near field, a far field, and the sea floor; in geological formations, three media are considered: oil and gas reservoirs, saline aquifers, and coal beds. Determining the capacity of the various media requires analysis of models and practical experiments. Additionally, as a major component of sequestration, the various injection methods into diverse media and their monitoring are associated with a variety of environmental impacts and financial consequences.

Keywords: carbon sequestration, ocean storage, geologic storage, carbon transportation

Procedia PDF Downloads 94
14283 Hybrid Direct Numerical Simulation and Large Eddy Simulating Wall Models Approach for the Analysis of Turbulence Entropy

Authors: Samuel Ahamefula

Abstract:

Turbulent motion is a highly nonlinear and complex phenomenon, and its modelling is still very challenging. In this study, we developed a hybrid computational approach to accurately simulate fluid turbulence. The focus is coupling and transitioning between Direct Numerical Simulation (DNS) and Large Eddy Simulating Wall Models (LES-WM) regions. In the framework, high-order, high-fidelity fluid dynamical methods are utilized to simulate the unsteady compressible Navier-Stokes equations in the Eulerian format on unstructured moving grids. The coupling and transitioning of DNS and LES-WM are conducted through a linearly staggered Dirichlet-Neumann coupling scheme. The high-fidelity framework is verified and validated on, respectively, the ability of DNS to capture the full range of turbulent scales with accurate results, and the efficiency of LES-WM in simulating the near-wall turbulent boundary layer using wall models.

Keywords: computational methods, turbulence modelling, turbulence entropy, navier-stokes equations

Procedia PDF Downloads 93
14282 Phylogenetic Analysis of Klebsiella Species from Clinical Specimens from Nelson Mandela Academic Hospital in Mthatha, South Africa

Authors: Sandeep Vasaikar, Lary Obi

Abstract:

Rapid and discriminative genotyping methods are useful for determining the clonality of isolates in nosocomial or household outbreaks. Multilocus sequence typing (MLST) is a nucleotide sequence-based approach for characterising bacterial isolates. The genetic diversity and clinical relevance of drug-resistant Klebsiella isolates from Mthatha are largely unknown. For this reason, a prospective, experimental study of the molecular epidemiology of Klebsiella isolates from patients treated in Mthatha over a three-year period was conducted. Methodology: PCR amplification and sequencing of the drug-resistance-associated genes, and multilocus sequence typing (MLST) using the 7 housekeeping genes mdh, pgi, infB, fusAR, phoE, gapA and rpoB, were performed. A total of 32 isolates were analysed. Results: The percentage of multidrug-resistant (MDR) isolates was 65.6% (21), with no extensively drug-resistant (XDR) or pandrug-resistant (PDR) isolates. In this study, K. pneumoniae accounted for 19/32 isolates (59.4%). MLST identified 22 sequence types (STs), which were further separated by maximum parsimony into 10 clonal complexes and 12 singletons. The most dominant group was Klebsiella pneumoniae with 23/32 (71.8%) isolates, followed by Klebsiella oxytoca with 2/32 (6.25%) isolates and a single (3.1%) K. variicola, while 6 isolates were of unknown sequence. Conclusions/significance: A phylogenetic analysis of the concatenated sequences of the 7 housekeeping genes showed that strains of K. pneumoniae form a distinct lineage within the genus Klebsiella, with K. oxytoca and K. variicola as its nearest phylogenetic neighbours. Analysis of the 7 genes identified one K. variicola isolate that had been misidentified as K. pneumoniae by phenotypic methods, and two misidentifications of K. oxytoca were found when phenotypic methods were used. No significant differences were observed between the ESBL blaCTX-M, blaTEM and blaSHV groups in the distribution of sequence types (STs) or clonal complexes (CCs).
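The concatenation step behind such a phylogenetic analysis can be sketched as follows. Gene names follow the abstract; the sequences are toy stand-ins, and real MLST uses curated allele databases and proper tree inference rather than a raw site count:

```python
# Concatenate housekeeping-gene sequences per isolate, then compare
# isolates by the number of differing sites. All sequences are toy data.
genes = ["mdh", "pgi", "infB", "fusAR", "phoE", "gapA", "rpoB"]
isolates = {
    "iso1": {g: "ATGC" for g in genes},
    "iso2": {g: ("ATGA" if g == "rpoB" else "ATGC") for g in genes},
}

def concatenated(name):
    """Concatenate the housekeeping-gene sequences of one isolate."""
    return "".join(isolates[name][g] for g in genes)

def snp_distance(a, b):
    """Number of differing sites between two concatenated sequences."""
    return sum(x != y for x, y in zip(concatenated(a), concatenated(b)))

print(snp_distance("iso1", "iso2"))  # 1 differing site (in rpoB)
```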

Keywords: phylogenetic analysis, phylogeny, klebsiella phylogenetic, klebsiella

Procedia PDF Downloads 361
14281 HPA Pre-Distorter Based on Neural Networks for 5G Satellite Communications

Authors: Abdelhamid Louliej, Younes Jabrane

Abstract:

Satellites are becoming indispensable assets to fifth-generation (5G) new radio architecture, complementing wireless and terrestrial communication links. The combination of satellites and 5G architecture allows consumers to access all next-generation services anytime, anywhere, including scenarios like traveling to remote areas (without coverage). Nevertheless, this solution faces several challenges, such as a significant propagation delay, Doppler frequency shift, and high Peak-to-Average Power Ratio (PAPR), causing signal distortion due to the non-linear saturation of the High-Power Amplifier (HPA). To compensate for HPA non-linearity in 5G satellite transmission, an efficient pre-distorter scheme using Neural Networks (NN) is proposed. To assess the proposed NN pre-distorter, two types of HPA were investigated: a Travelling Wave Tube Amplifier (TWTA) and a Solid-State Power Amplifier (SSPA). The results show that the NN pre-distorter design improves EVM by 95.26%. NMSE and ACPR were reduced by -43.66 dB and 24.56 dBm, respectively. Moreover, the system suffers no degradation of the Bit Error Rate (BER) for the TWTA and SSPA amplifiers.
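The pre-distortion principle can be sketched without a neural network: invert the HPA's AM/AM curve so that the pre-distorter/HPA cascade is linear. Below, the classic Saleh TWTA model with its commonly quoted parameters stands in for the real amplifier, and a numerical inverse stands in for the trained NN; none of this reproduces the paper's actual design.

```python
def saleh_am(r, a=2.1587, b=1.1517):
    """Saleh AM/AM model of a TWTA: output amplitude for input amplitude r."""
    return a * r / (1.0 + b * r * r)

def predistort(target, lo=0.0, hi=0.9, iters=60):
    """Bisection for the input amplitude the HPA maps to `target`.

    Valid while saleh_am is monotonically increasing on [lo, hi],
    i.e. below the amplifier's saturation point.
    """
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if saleh_am(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Cascade pre-distorter + HPA: the output matches the target amplitude.
target = 0.8
x = predistort(target)
print(round(saleh_am(x), 6))  # 0.8
```

A NN pre-distorter learns this inverse mapping (including AM/PM phase rotation) from samples instead of bisecting a known model, which is what makes it applicable when the HPA characteristic is only measured, not modeled.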

Keywords: satellites, 5G, neural networks, HPA, TWTA, SSPA, EVM, NMSE, ACPR

Procedia PDF Downloads 85
14280 Digitalization, Supply Chain Integration and Financial Performance: Case of Tunisian Agro-industrial Sector

Authors: Rym Ghariani, Younes Boujelbene

Abstract:

In contemporary times, global technological advancements, particularly in the realm of digital technology, have emerged as pivotal instruments for enterprises in fostering viable partnerships and forging meaningful alliances with other firms. These digital innovations are poised to revolutionize nearly every facet and operation within corporate entities. The primary objective of this study is to explore the correlation between digitization, supply chain integration, and the financial performance of the agro-industrial sector in Tunisia. To accomplish this, data collection employed a questionnaire as the primary research instrument. The research questions were then addressed, and the hypotheses examined, by subjecting the gathered data to principal component analysis and linear regression modeling using SPSS 26 software. The findings revealed that digitalization within the supply chain, along with external supply chain integration, exerted discernible impacts on the financial performance of the organizations studied.

Keywords: digitalization, supply chain integration, financial performance, Tunisian agro-industrial sector

Procedia PDF Downloads 35
14279 DISGAN: Efficient Generative Adversarial Network-Based Method for Cyber-Intrusion Detection

Authors: Hongyu Chen, Li Jiang

Abstract:

Ubiquitous anomalies endanger the security of our systems constantly. They may bring irreversible damage to the system and cause leakage of privacy. Thus, it is of vital importance to promptly detect these anomalies. Traditional supervised methods such as Decision Trees and Support Vector Machines (SVM) are used to classify normality and abnormality. However, in some cases, abnormal samples are far rarer than normal ones, which leads to decision bias in these methods. The Generative Adversarial Network (GAN) has been proposed to handle this case. With its strong generative ability, it only needs to learn the distribution of normal samples, and it identifies abnormal ones through the gap between them and the learned distribution. Nevertheless, existing GAN-based models are not suited to processing data with discrete values, leading to immense degradation of detection performance. To cope with discrete features, in this paper, we propose an efficient GAN-based model with a specifically designed loss function. Experimental results show that our model outperforms state-of-the-art models on discrete datasets and remarkably reduces overhead.
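The Wasserstein distance listed in the keywords underlies many GAN-based detectors: a sample batch far from the learned normal distribution scores as anomalous. As a minimal, library-free sketch (not the authors' model), the 1-D empirical Wasserstein-1 distance reduces to the mean absolute difference of sorted samples; all data values below are hypothetical.

```python
def wasserstein_1d(xs, ys):
    """Empirical 1-D Wasserstein-1 distance between two equal-size samples:
    in one dimension, optimal transport pairs the sorted values, so the
    distance is the mean absolute difference after sorting."""
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

normal_batch = [0.0, 0.1, 0.2, 0.3, 0.4]   # stand-in for the learned normal data
test_ok      = [0.05, 0.15, 0.2, 0.35, 0.4]
test_anom    = [2.0, 2.1, 2.2, 2.3, 2.4]   # shifted batch: should score high

# The anomalous batch is much farther from the normal distribution
print(wasserstein_1d(normal_batch, test_ok) <
      wasserstein_1d(normal_batch, test_anom))  # True
```

A Wasserstein GAN generalizes this idea to high-dimensional data by approximating the distance with a learned critic network.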

Keywords: GAN, discrete feature, Wasserstein distance, multiple intermediate layers

Procedia PDF Downloads 122
14278 An Advanced Approach to Detect and Enumerate Soil-Transmitted Helminth Ova from Wastewater

Authors: Vivek B. Ravindran, Aravind Surapaneni, Rebecca Traub, Sarvesh K. Soni, Andrew S. Ball

Abstract:

Parasitic diseases have a devastating, long-term impact on human health and welfare. More than two billion people are infected with soil-transmitted helminths (STHs), including the roundworms (Ascaris), hookworms (Necator and Ancylostoma) and whipworm (Trichuris), with the majority occurring in the tropical and subtropical regions of the world. Despite their low prevalence in developed countries, the removal of STHs from wastewater remains crucial to allow the safe use of sludge or recycled water in agriculture. Conventional methods such as incubation and optical microscopy are cumbersome; consequently, results vary drastically from person to person when the ova (eggs) are observed under the microscope. Although PCR-based methods are an alternative to conventional techniques, they lack the ability to distinguish between viable and non-viable helminth ova. As a result, the wastewater treatment industry is in major need of radically new and innovative tools to detect and quantify STH eggs with precision, accuracy and cost-effectiveness. In our study, we focus on the following novel and innovative techniques: -Recombinase polymerase amplification and surface-enhanced Raman spectroscopy (RPA-SERS) based detection of helminth ova. -Use of metal nanoparticles and their relative nanozyme activity. -Colorimetric detection, differentiation and enumeration of genera of helminth ova using hydrolytic enzymes (chitinase and lipase). -Propidium monoazide (PMA)-qPCR to detect viable helminth ova. -A modified assay to recover and enumerate helminth eggs from fresh raw sewage. -Transcriptome analysis of Ascaris ova in fresh raw sewage. The aforementioned techniques have the potential to replace current conventional and molecular methods, thereby producing a standard protocol for the determination and enumeration of helminth ova in sewage sludge.

Keywords: colorimetry, helminth, PMA-QPCR, nanoparticles, RPA, viable

Procedia PDF Downloads 297
14277 Tyrosine Rich Fraction as an Immunomodulatory Agent from Ficus Religiosa Bark

Authors: S. A. Nirmal, G. S. Asane, S. C. Pal, S. C. Mandal

Abstract:

Objective: Ficus religiosa Linn (Moraceae) is used in traditional medicine to improve immunity; hence, the present work was undertaken to validate this use scientifically. Material and Methods: Dried, powdered bark of F. religiosa was extracted successively with petroleum ether and 70% ethanol in a Soxhlet extractor. The extracts obtained were screened for immunomodulatory activity by the delayed-type hypersensitivity (DTH) test, the neutrophil adhesion test and cyclophosphamide-induced neutropenia in Swiss albino mice at doses of 50 and 100 mg/kg, i.p. The 70% ethanol extract showed significant immunostimulant activity and was therefore subjected to column chromatography to produce a tyrosine-rich fraction (TRF). The TRF obtained was screened for immunomodulatory activity by the above methods at a dose of 10 mg/kg, i.p. Results: TRF potentiated the DTH response in terms of a significant increase in the mean difference in foot-pad thickness, and it significantly increased neutrophil adhesion to nylon fibers by 48.20%. The percentage reductions in total leukocyte count and neutrophil count produced by TRF were 43.85% and 18.72%, respectively. Conclusion: The immunostimulant activity of TRF was more pronounced, and thus it has great potential as a source for natural health products.

Keywords: Ficus religiosa, immunomodulatory, cyclophosphamide, neutropenia

Procedia PDF Downloads 439
14276 An Accurate Computation of 2D Zernike Moments via Fast Fourier Transform

Authors: Mohammed S. Al-Rawi, J. Bastos, J. Rodriguez

Abstract:

Object detection and object recognition are essential components of every computer vision system. Despite the high computational complexity and other problems related to numerical stability and accuracy, Zernike moments of 2D images (ZMs) have shown resilience when used in object recognition and have been used in various image analysis applications. In this work, we propose a novel method for computing ZMs via the Fast Fourier Transform (FFT). Notably, this is the first algorithm that can accurately generate ZMs up to extremely high orders, e.g., it can be used to generate ZMs for orders up to 1000 or even higher. Furthermore, the proposed method is also simpler and faster than other methods due to the availability of FFT software and/or hardware. The accuracy and numerical stability of ZMs computed via FFT have been confirmed using the orthogonality property. We also introduce normalizing ZMs with the Neumann factor when the image is embedded in a larger grid, and color image reconstruction based on RGB normalization of the reconstructed images. Astonishingly, higher-order image reconstruction experiments show that the proposed methods are superior, both quantitatively and subjectively, compared to the q-recursive method.
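The orthogonality property used above to validate the computed ZMs can be checked numerically on the radial Zernike polynomials themselves: the integral of R_n^m · R_n'^m · ρ over [0, 1] equals 1/(2(n+1)) when n = n' and 0 otherwise. The sketch below is not the authors' FFT algorithm; it only verifies this identity for assumed low orders.

```python
from math import factorial

def zernike_radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho) via the standard factorial formula
    (requires n >= m >= 0 and n - m even)."""
    s = 0.0
    for k in range((n - m) // 2 + 1):
        c = ((-1) ** k) * factorial(n - k) / (
            factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k))
        s += c * rho ** (n - 2 * k)
    return s

def radial_inner(n1, n2, m, steps=20000):
    """Midpoint-rule approximation of the weighted inner product
    integral_0^1 R_n1^m(rho) R_n2^m(rho) rho d(rho)."""
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        rho = (i + 0.5) * h
        total += zernike_radial(n1, m, rho) * zernike_radial(n2, m, rho) * rho * h
    return total

# Orthogonality: 1/(2(n+1)) on the diagonal, ~0 off the diagonal
print(round(radial_inner(2, 2, 0), 4))  # ~0.1667 = 1/6
print(round(radial_inner(2, 0, 0), 4))  # ~0
```

For full 2-D ZMs the angular factor exp(-i m θ) supplies the remaining orthogonality over the unit disk.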

Keywords: Chebyshev polynomial, Fourier transform, fast algorithms, image recognition, pseudo Zernike moments, Zernike moments

Procedia PDF Downloads 257
14275 A Nonlinear Feature Selection Method for Hyperspectral Image Classification

Authors: Pei-Jyun Hsieh, Cheng-Hsuan Li, Bor-Chen Kuo

Abstract:

For hyperspectral image classification, feature reduction is an important pre-processing step for avoiding the Hughes phenomenon, given the difficulty of collecting training samples. Hence, many researchers have developed feature selection methods, such as the F-score and HSIC (Hilbert-Schmidt Independence Criterion), to improve hyperspectral image classification. However, most of them only consider the class separability in the original space, i.e., a linear class separability. In this study, we propose a nonlinear class separability measure based on the kernel trick for selecting an appropriate feature subset. The proposed nonlinear class separability is formed by a generalized RBF kernel with a different bandwidth for each feature, and it considers both the within-class separability and the between-class separability. A genetic algorithm is applied to tune these bandwidths such that the within-class separability is minimized and the between-class separability is maximized simultaneously. This indicates that the corresponding feature space is more suitable for classification, and the corresponding nonlinear classification boundary can separate the classes very well. These optimal bandwidths also show the importance of the bands for hyperspectral image classification: the reciprocals of the bandwidths can be viewed as weights of the bands. The smaller the bandwidth, the larger the weight of the band, and the more important it is for classification. Hence, sorting the reciprocals of the bandwidths in descending order gives an order for selecting appropriate feature subsets. In the experiments, three hyperspectral image data sets, the Indian Pine Site data set, the PAVIA data set, and the Salinas A data set, were used to demonstrate that the feature subsets selected by the proposed nonlinear feature selection method are more appropriate for hyperspectral image classification. Only ten percent of the samples were randomly selected to form the training dataset.
All non-background samples were used to form the testing dataset. A support vector machine was applied to classify these testing samples based on the selected feature subsets. In the experiments on the Indian Pine Site data set with 220 bands, the highest accuracies obtained by the proposed method, F-score, and HSIC are 0.8795, 0.8795, and 0.87404, respectively; however, the proposed method selects 158 features, whereas F-score and HSIC select 168 and 217 features, respectively. Moreover, the classification accuracy increases dramatically using only the first few features: the accuracies for feature subsets of 10, 20, 50, and 110 features are 0.69587, 0.7348, 0.79217, and 0.84164, respectively. Furthermore, using only half of the features selected by the proposed method (110 features), the corresponding classification accuracy (0.84164) approximates the highest classification accuracy, 0.8795. Similar results were obtained for the other two hyperspectral image data sets, PAVIA and Salinas A. These results illustrate that the proposed method can efficiently find feature subsets that improve hyperspectral image classification. One can first apply the proposed method to determine a suitable feature subset for a specific purpose; researchers can then use only the corresponding sensors to obtain the hyperspectral image and classify the samples. This can not only improve classification performance but also reduce the cost of obtaining hyperspectral images.
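The generalized RBF kernel with per-feature bandwidths described above can be sketched in a few lines; the genetic-algorithm tuning is omitted, and the sample points and bandwidth values below are hypothetical.

```python
import math

def generalized_rbf(x, y, bandwidths):
    """Generalized RBF kernel with a separate bandwidth sigma_d per feature:
    k(x, y) = exp(-sum_d (x_d - y_d)^2 / (2 * sigma_d^2)).
    A small sigma_d (large reciprocal, i.e., large weight) makes feature d
    dominate the kernel value."""
    s = sum((a - b) ** 2 / (2.0 * sig ** 2)
            for a, b, sig in zip(x, y, bandwidths))
    return math.exp(-s)

x, y = [1.0, 2.0, 3.0], [1.5, 2.0, 3.0]   # differ only in the first feature

print(generalized_rbf(x, x, [1.0, 1.0, 1.0]))   # 1.0 for identical points
# A small bandwidth on the differing feature drives the kernel toward 0,
# i.e., that feature is weighted heavily in the separability measure:
print(generalized_rbf(x, y, [0.1, 1.0, 1.0]) <
      generalized_rbf(x, y, [10.0, 1.0, 1.0]))  # True
```

In the paper, these bandwidths are the variables the genetic algorithm tunes; ranking their reciprocals then orders the spectral bands by importance.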

Keywords: hyperspectral image classification, nonlinear feature selection, kernel trick, support vector machine

Procedia PDF Downloads 257
14274 The Advantages of Using DNA-Barcoding for Determining the Fraud in Seafood

Authors: Elif Tugce Aksun Tumerkan

Abstract:

Although seafood is an important part of the human diet and is among the most highly traded food commodities internationally, it generally remains overlooked from a global food-security perspective. Food product authentication is of central interest, with the aim of both avoiding commercial fraud and addressing risks that might be harmful to human health. In recent years, with increasing consumer demand for transparency about food content, instrumental analyses for detecting food fraud have emerged, based on analytical methodologies such as proteomics and metabolomics. Whereas fish and seafood were previously consumed fresh, with advancing technology the consumption of processed or packaged seafood has increased. After seafood is processed or packaged, morphological identification is impossible once some of the external features have been removed. The main fish and seafood quality-related issues concern the authentication of seafood contents, such as mislabelled products, which may be contaminated or replaced, partly or completely, by lower-quality or cheaper ones. For all of these reasons, truthful, consistent and easily applicable analytical methods are needed for assuring correct labelling and verifying seafood products. DNA-barcoding methods have become popular, robust tools used in taxonomic research on endangered or cryptic species in recent years; they are also used for determining food traceability. In this review, compared with proteomic and metabolomic analyses, DNA-based methods are shown to allow the identification of all types of food, even raw, spiced and processed products. This advantage arises because DNA is a comparatively more stable molecule than proteins and other molecules. Furthermore, its sequence variation between species and its presence in all organisms make DNA-based analysis preferable.
This review was performed to clarify the main advantages of using DNA-barcoding for determining seafood fraud over other techniques.

Keywords: DNA-barcoding, genetic analysis, food fraud, mislabelling, packaged seafood

Procedia PDF Downloads 161
14273 Microstructure and Oxidation Behaviors of Al, Y Modified Silicide Coatings Prepared on an Nb-Si Based Ultrahigh Temperature Alloy

Authors: Xiping Guo, Jing Li

Abstract:

The microstructure of a Si-Al-Y co-deposition coating prepared on an Nb-Si based ultra-high temperature alloy by a pack cementation process at 1250°C for eight hours was studied. The results showed that the coating was composed of a (Nb,X)Si₂ (X represents Ti, Cr and Hf) outer layer, a (Ti,Nb)₅Si₄ middle layer and an Al, Cr-rich inner layer. For comparison, the oxidation behaviors of the coating at 800, 1050 and 1350°C were investigated. Parabolic oxidation kinetics were found, with rate constants of 5.29×10⁻², 9×10⁻² and 5.81 mg² cm⁻⁴ h⁻¹, respectively. Catastrophic pesting oxidation was not observed at 800°C, even after 100 h. The surface of the scale was covered by a compact glassy SiO₂ film. The coating was able to effectively protect the Nb-Si based alloy from oxidation at 1350°C for at least 100 h. The formation of the scale was shown to follow an epitaxial growth mechanism. A mechanism responsible for the oxidation behavior of the Si-Al-Y co-deposition coating at 800, 1050 and 1350°C is proposed.
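Assuming the parabolic rate law (Δm)² = kₚ·t, which is consistent with the reported rate-constant units of mg² cm⁻⁴ h⁻¹, the expected mass gain after a given exposure can be sketched directly from the abstract's constants:

```python
import math

def parabolic_mass_gain(kp, t_hours):
    """Parabolic oxidation law: (delta_m)^2 = kp * t, so delta_m = sqrt(kp * t).
    With kp in mg^2 cm^-4 h^-1 and t in hours, delta_m is in mg cm^-2."""
    return math.sqrt(kp * t_hours)

# Rate constants reported at 800, 1050 and 1350 degC (mg^2 cm^-4 h^-1)
for kp in (5.29e-2, 9e-2, 5.81):
    print(round(parabolic_mass_gain(kp, 100.0), 2))  # mass gain after 100 h
```

The square-root time dependence is what distinguishes a protective, diffusion-controlled scale from the linear kinetics of an unprotective one.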

Keywords: Nb-Si based ultra high temperature alloy, oxidation resistance, pack cementation, silicide coating, Al and Y modified

Procedia PDF Downloads 394
14272 Political Economy of Electronic News Media in Pakistan

Authors: Asad Ullah Khalid

Abstract:

This paper encompasses the application of the concept of the political economy of mass media in Pakistan. The media has developed at a massive pace and is now considered vital to better administration; furthermore, it helps convey issues related to the government to the public. Although Pakistani media has gained much independence since 2003, many social, political and economic factors influence the content of the media. The study employs a triangulation of quantitative and qualitative methods: both content analysis and interviews are used. The content of Pakistani media is analyzed quantitatively and qualitatively; moreover, interviews with various journalists were conducted, and their findings are disclosed in this paper. Pakistan's communication landscape is neither well documented nor well understood, leaving its public off guard with regard to reviewing the role and impact of news inflow, correspondence and media in political, economic and social life. It was found that, on particular issues, some media channels have strong affiliations with certain political parties; moreover, reporting and coverage have also been affected by factors like terrorism, state policies (written and verbal), advertising/economic pressures, and demographic factors like the composition of the population.

Keywords: political economy, news media, Pakistan, electronic news media, journalism, mass media

Procedia PDF Downloads 324
14271 Evaluation of Diagnostic Values of Culture, Rapid Urease Test, and Histopathology in the Diagnosis of Helicobacter pylori Infection and in vitro Effects of Various Antimicrobials against Helicobacter pylori

Authors: Recep Kesli, Huseyin Bilgin, Yasar Unlu, Gokhan Gungor

Abstract:

Aim: The aim of this study was to investigate the presence of Helicobacter pylori (H. pylori) infection by culture, histology, and the rapid urease test (RUT) in gastric antrum biopsy samples taken from patients presenting with dyspeptic complaints, and to determine the resistance rates of the H. pylori strains to amoxicillin, clarithromycin, levofloxacin and metronidazole by E-test. Material and Methods: A total of 278 patients who were admitted to the Konya Education and Research Hospital Department of Gastroenterology with dyspeptic complaints between January 2011 and July 2013 were included in the study. Microbiological and histopathological examinations of biopsy specimens taken from the antrum and corpus regions were performed. The presence of H. pylori in biopsy samples was investigated by culture (Portagerm pylori-PORT PYL, Pylori agar-PYL, GENbox microaer, bioMerieux, France), histology (Giemsa, Hematoxylin and Eosin staining), and RUT (CLOtest, Kimberly-Clark, USA). Antimicrobial resistance of the isolates to amoxicillin, clarithromycin, levofloxacin, and metronidazole was determined by the E-test method (bioMerieux, France). As the gold standard for the diagnosis of H. pylori, either positivity of culture alone, or positivity of both histology and RUT together, was accepted. Sensitivity and specificity for histology and RUT were calculated by taking culture as the gold standard; sensitivity and specificity for culture were calculated by taking the co-positivity of histology and RUT as the gold standard. Results: H. pylori was detected in 140 of the 278 patients by culture and in 174 of 278 by histology; H. pylori positivity was also found in 191 patients by RUT. According to the gold standard criteria, false negative results were found in 39 cases by culture, 17 cases by histology, and 8 cases by RUT.
The sensitivity and specificity of the culture, histology, and RUT methods were 76.5 % and 88.3 %, 87.8 % and 63 %, and 94.2 % and 57.2 %, respectively. Antibiotic resistance was investigated by E-test in the 140 H. pylori strains isolated by culture. The resistance rates of the H. pylori strains to amoxicillin, clarithromycin, levofloxacin, and metronidazole were 9 (6.4 %), 22 (15.7 %), 17 (12.1 %), and 57 (40.7 %), respectively. Conclusion: In our study, RUT was found to be the most sensitive and culture the most specific of the three methods. Although the specificity of the culture method was high, its sensitivity was quite low compared with the other methods. The low sensitivity of H. pylori culture may be caused by factors that affect the chances of direct isolation, such as spoiled bacteria, the fastidiousness of the microorganism, clinical sample retrieval, and transport conditions.
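The sensitivity and specificity figures above follow the standard 2×2 contingency-table definitions; a minimal sketch with hypothetical counts (not the study's raw data):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Diagnostic test metrics against a gold standard:
    sensitivity = TP / (TP + FN)  -- proportion of true positives detected
    specificity = TN / (TN + FP)  -- proportion of true negatives detected."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration only
sens, spec = sensitivity_specificity(tp=90, fp=10, fn=10, tn=90)
print(round(100 * sens, 1), round(100 * spec, 1))  # 90.0 90.0
```

Because the study uses two different gold standards (culture for histology/RUT; histology-plus-RUT co-positivity for culture), each method's TP/FP/FN/TN counts are tallied against its respective reference.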

Keywords: antimicrobial resistance, culture, histology, H. pylori, RUT

Procedia PDF Downloads 157
14270 Studying Relationship between Local Geometry of Decision Boundary with Network Complexity for Robustness Analysis with Adversarial Perturbations

Authors: Tushar K. Routh

Abstract:

If inputs are engineered in certain manners, they can influence deep neural networks' (DNN) performance by facilitating misclassifications, a phenomenon well known as adversarial attacks, which calls networks' vulnerability into question. Recent studies have unfolded the relationship between the vulnerability of such networks and their complexity. In this paper, the distinctive influence of additional convolutional layers on the decision boundaries of several DNN architectures was investigated. Here, to engineer inputs from widely known image datasets like MNIST, Fashion MNIST, and CIFAR-10, we exercised the One Step Spectral Attack (OSSA) and Fast Gradient Method (FGM) techniques. The effects of adding layers on the robustness of the architectures were analyzed. For reasoning, the separation width from linear class partitions and the local geometry (curvature) near the decision boundary were examined. The results reveal that model complexity plays a significant role in adjusting relative distances from margins, as well as the local features of decision boundaries, which impact robustness.
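The gradient-based attack family mentioned above can be sketched on a toy logistic model with a hand-derived gradient (a sign-step FGSM-style variant of the Fast Gradient Method; the weights and input are hypothetical, not from the paper's DNNs):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_loss(w, x, y):
    """Binary cross-entropy for a linear logistic model p = sigmoid(w . x)."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def fgm_attack(w, x, y, eps):
    """Step the input along sign(grad_x loss) to increase the loss.
    For logistic loss, grad_x = (sigmoid(w . x) - y) * w."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    grad = [(p - y) * wi for wi in w]
    return [xi + eps * math.copysign(1.0, g) for xi, g in zip(x, grad)]

w = [1.0, -2.0, 0.5]           # toy model weights (hypothetical)
x, y = [0.5, -0.5, 1.0], 1     # a correctly classified input
x_adv = fgm_attack(w, x, y, eps=0.3)
print(logistic_loss(w, x_adv, y) > logistic_loss(w, x, y))  # True: the attack raises the loss
```

For a deep network the same recipe applies with the gradient obtained by backpropagation, and the attack's success depends on the margin and local curvature of the decision boundary that the paper studies.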

Keywords: DNN robustness, decision boundary, local curvature, network complexity

Procedia PDF Downloads 67
14269 Application of Electrical Resistivity, Induced Polarization and Statistical Methods in Chichak Iron Deposit Exploration

Authors: Shahrzad Maghsoodi, Hamid Reza Ranazi

Abstract:

This paper is devoted to the exploration of the Chichak (hematite) deposit using electrical resistivity, chargeability and statistical methods. The Chichak hematite deposit is located in the Chichak area, West Azerbaijan, in the northwest of Iran, where there are some outcrops of hematite bodies. The goal of this study was to identify the depth, thickness and shape of these bodies and to explore other probable hematite bodies. Nine profiles were therefore surveyed by the resistivity (RS) and induced polarization (IP) methods, utilizing an innovative electrode array called CRSP (Combined Resistivity Sounding and Profiling). RS and IP sections were completed along each profile. In addition, the RS and IP data were analyzed, and the relation between these two variables was determined by statistical tools. Finally, hematite bodies were identified in each of the sections. The results showed that the hematite bodies have a resistivity lower than 125 Ωm and a very low chargeability, lower than 8 mV/V. After the geophysical study, some points were proposed for drilling; the results obtained from drilling confirmed the geophysical results.

Keywords: hematite deposit, iron exploration, electrical resistivity, chargeability, Iran, Chichak, statistical methods, CRSP electrode array

Procedia PDF Downloads 68
14268 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification; these obstacles could prevent the diffusion of eco-design and LCA methods in the manufacturing sectors. This article addresses the research question: how can the LCA method be adapted to generalize it massively and improve its performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, the literature was analysed to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated model construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes: the calculations and the LCA report are automatically generated. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, the proposed method shows potential for a wide range of applications.

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 81
14267 Simulation Analysis and Control of the Temperature Field in an Induction Furnace Based on Various Parameters

Authors: Sohaibullah Zarghoon, Syed Yousaf, Cyril Belavy, Stanislav Duris, Samuel Emebu, Radek Matusu

Abstract:

Induction heating is extensively employed in industrial furnaces due to its swift response and high energy efficiency. Designing and optimising these furnaces necessitates the use of computer-aided simulations. This study aims to develop an accurate temperature field model for a rectangular steel billet in an induction furnace by leveraging various parameters in the COMSOL Multiphysics software. The simulation analysis incorporated the temperature dynamics, considering the skin depth and both temperature-dependent and constant parameters of the steel billet. The resulting data-driven model was transformed into a state-space model using MATLAB's System Identification Toolbox for the purpose of designing a linear quadratic regulator (LQR). This controller was successfully implemented to regulate the core temperature of the billet from 1000°C to 1200°C, utilizing the distributed-parameter system model.
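The LQR design step can be sketched for a hypothetical first-order (scalar) model of the billet core temperature; this is not the authors' identified state-space model, and all numeric values below are assumptions.

```python
def scalar_dlqr(a, b, q, r, iters=1000):
    """Discrete-time LQR for a scalar system x[k+1] = a*x[k] + b*u[k]
    with cost sum(q*x^2 + r*u^2): iterate the Riccati recursion
        P <- q + a^2*P - (a*b*P)^2 / (r + b^2*P)
    to a fixed point, then return the gain K = a*b*P / (r + b^2*P)."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

# Hypothetical first-order thermal model: slow decay, small heater input gain
a, b = 0.98, 0.05
k = scalar_dlqr(a, b, q=1.0, r=0.1)

# Closed loop x[k+1] = (a - b*k) x[k] must be stable: |a - b*k| < 1
print(abs(a - b * k) < 1.0)  # True
```

In the paper, the same design is applied to the multi-state model identified from the COMSOL simulation data, with the Riccati equation solved in matrix form.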

Keywords: induction heating, LQR controller, skin depth, temperature field

Procedia PDF Downloads 26