Search results for: throughput analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28190

28010 Impact of Node Density and Transmission Range on the Performance of OLSR and DSDV Routing Protocols in VANET City Scenarios

Authors: Yassine Meraihi, Dalila Acheli, Rabah Meraihi

Abstract:

Vehicular Ad hoc Network (VANET) is a special case of Mobile Ad hoc Network (MANET) used to establish communications and exchange information among nearby vehicles and between vehicles and nearby fixed infrastructure. VANET is seen as a promising technology for providing safety, efficiency, assistance and comfort to road users. Routing is an important issue in Vehicular Ad hoc Networks: finding and maintaining communication between vehicles is challenging due to the highly dynamic topology, frequently disconnected network and mobility constraints. This paper evaluates the performance of the two most popular proactive routing protocols, OLSR and DSDV, in a realistic city traffic scenario on the basis of three metrics, namely packet delivery ratio, throughput and average end-to-end delay, while varying vehicle density and transmission range.
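
The three evaluation metrics are straightforward to compute from a simulation packet trace. Below is a minimal sketch; the trace format (per-packet send and receive events with timestamps and payload sizes) is a hypothetical layout for illustration, not the paper's simulation setup.

```python
# Minimal sketch: computing packet delivery ratio, throughput and average
# end-to-end delay from a packet trace. The trace format is an assumption.

def evaluate(trace_sent, trace_received, sim_duration_s):
    """trace_sent / trace_received: dicts mapping packet_id ->
    (timestamp_s, size_bits)."""
    delivered = set(trace_sent) & set(trace_received)

    # Packet delivery ratio: delivered packets over generated packets.
    pdr = len(delivered) / len(trace_sent)

    # Throughput: total delivered bits over simulation time (bits/s).
    throughput = sum(trace_received[p][1] for p in delivered) / sim_duration_s

    # Average end-to-end delay over delivered packets (seconds).
    avg_delay = sum(trace_received[p][0] - trace_sent[p][0]
                    for p in delivered) / len(delivered)
    return pdr, throughput, avg_delay

sent = {1: (0.00, 8000), 2: (0.05, 8000), 3: (0.10, 8000)}
recv = {1: (0.02, 8000), 3: (0.13, 8000)}          # packet 2 was dropped
print(evaluate(sent, recv, sim_duration_s=0.2))
```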

Keywords: DSDV, OLSR, quality of service, routing protocols, VANET

Procedia PDF Downloads 471
28009 Ensuring Uniform Energy Consumption in Non-Deterministic Wireless Sensor Networks to Protract Network Lifetime

Authors: Vrince Vimal, Madhav J. Nigam

Abstract:

Wireless sensor networks have attracted much of the spotlight from researchers all around the world, owing to their extensive applicability in agricultural, industrial and military fields. Energy-conserving node deployment strategies play a notable role in the effective implementation of wireless sensor networks. Clustering is an approach that improves energy efficiency in such networks; however, the clustering algorithm needs an optimum size and number of clusters, as clustering, if not implemented properly, cannot effectively extend the life of the network. In this paper, an algorithm is proposed to address connectivity issues with the aim of ensuring uniform energy consumption of nodes in every part of the network. The results obtained after simulation showed that the proposed algorithm has an edge over existing algorithms in terms of throughput and network lifetime.

Keywords: wireless sensor network (WSN), random deployment, clustering, isolated nodes, network lifetime

Procedia PDF Downloads 337
28008 Development of Locally Fabricated Honey Extracting Machine

Authors: Akinfiresoye W. A., Olarewaju O. O., Okunola, Okunola I. O.

Abstract:

An indigenous honey-extracting machine was designed, fabricated and evaluated at the workshop of the Department of Agricultural Technology, Federal Polytechnic, Ile-Oluji, Nigeria, using locally available materials. It has the extraction unit, the presser, the honey collector and the frame. The harvested honeycomb is placed inside the cylindrical extraction unit, which has perforated holes. The press plate is then placed on the comb, while the manually operated hydraulic press of 3 tons, supported by the frame, is placed on it. The hydraulic press forces the honey out of the extraction chamber through the perforated holes into the honey collector positioned at the lowest part of the extraction chamber. The honey-extracting machine has an average throughput of 2.59 kg/min and an efficiency of about 91%. The cost of producing the machine is NGN 31,700.00 (thirty-one thousand, seven hundred naira), or about $70 at NGN 452.8 to the dollar. This cost is affordable to beekeepers and would-be honey entrepreneurs. The machine is easy to operate and maintain without any complex technical know-how.

Keywords: honey, extractor, cost, efficiency

Procedia PDF Downloads 78
28007 Avoiding Packet Drop for Improved Throughput in Multi-Hop Wireless Networks

Authors: Manish Kumar Rajak, Sanjay Gupta

Abstract:

Mobile ad hoc networks (MANETs) are infrastructure-less and intercommunicate using single-hop and multi-hop paths. Network-based congestion avoidance, which involves managing the queues in the network devices, is an integral part of any network. Quality of Service (QoS) is a set of service requirements that are met by the network while transferring a packet stream from a source to a destination. In MANETs especially, packet loss results in increased overheads. This paper presents a new algorithm to avoid congestion using one or more queues on each node, with a corresponding flow rate decided in advance for each node. When the queue at any node reaches an initial threshold value, the node sends this status to the nodes transmitting to it, which in turn apply the pre-decided flow rate for packet transfer to that node. The flow rate on each node is thus adjusted according to the status received from its downstream nodes. The proposed algorithm uses the existing infrastructure to inform other nodes about the current queue status.
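
The signalling scheme can be sketched in a few lines. The following is a minimal illustration, assuming hypothetical queue sizes, thresholds and rates rather than the parameters used in the paper.

```python
# Minimal sketch of queue-threshold congestion signalling: a node whose
# queue crosses a threshold notifies the nodes sending to it, which then
# switch to a pre-decided reduced flow rate. Values are illustrative.

class Node:
    def __init__(self, name, queue_limit=50, threshold=40,
                 normal_rate=100.0, reduced_rate=25.0):
        self.name = name
        self.queue = []
        self.queue_limit = queue_limit
        self.threshold = threshold          # initial value that triggers signalling
        self.normal_rate = normal_rate      # packets/s toward a healthy neighbor
        self.reduced_rate = reduced_rate    # pre-decided rate toward a congested one
        self.upstream = []                  # nodes that forward packets to us

    def enqueue(self, packet):
        if len(self.queue) >= self.queue_limit:
            return False                    # drop: this is what the scheme avoids
        self.queue.append(packet)
        if len(self.queue) >= self.threshold:
            # Signal congestion status to the nodes sending to us.
            for node in self.upstream:
                node.on_congestion_status(self)
        return True

    def on_congestion_status(self, congested_node):
        # Downstream node reported congestion: switch to the pre-decided rate.
        self.rate_toward = {congested_node.name: self.reduced_rate}

a, b = Node("A"), Node("B")
b.upstream.append(a)
for i in range(45):
    b.enqueue(i)                            # crossing 40 triggers A's throttling
print(getattr(a, "rate_toward", None))      # {'B': 25.0}
```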

Keywords: mesh networks, MANET, packet count, threshold, throughput

Procedia PDF Downloads 476
28006 Relay Node Selection Algorithm for Cooperative Communications in Wireless Networks

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards support multiple transmission rates. Even though the use of multiple transmission rates increases WLAN capacity, this feature leads to the performance anomaly problem. Cooperative communication was introduced to relieve the performance anomaly problem: data packets are delivered to the destination much faster through a relay node at a high rate than through direct transmission to the destination at a low rate. In the legacy cooperative protocols, a source node chooses a relay node based only on the transmission rate. They are therefore not well suited to multi-flow environments, since they do not consider the effect of other flows. To alleviate this effect, we propose a new relay node selection algorithm based on both the transmission rate and the channel contention level. Performance evaluation is conducted using simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and delay.
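
As an illustration of selection that weighs rate against contention, here is a minimal sketch. The scoring rule, inflating two-hop airtime by a contention factor, is an assumption made for exposition, not the paper's actual metric.

```python
# Minimal sketch of relay selection that weighs rate against contention.

def two_hop_airtime(rate_src_relay, rate_relay_dst, packet_bits=12_000):
    # Time to move one packet source -> relay -> destination.
    return packet_bits / rate_src_relay + packet_bits / rate_relay_dst

def select_relay(candidates, rate_src_dst, packet_bits=12_000):
    """candidates: list of (name, rate_src_relay, rate_relay_dst, contention)
    where contention in [0, 1) is the fraction of busy channel time."""
    direct_time = packet_bits / rate_src_dst
    best_name, best_time = None, direct_time
    for name, r1, r2, contention in candidates:
        t = two_hop_airtime(r1, r2, packet_bits)
        t /= (1.0 - contention)             # inflate airtime under contention
        if t < best_time:
            best_name, best_time = name, t
    return best_name or "direct", best_time

# Direct link at 2 Mb/s; relays offer 11 Mb/s hops with differing contention.
relays = [("R1", 11e6, 11e6, 0.1), ("R2", 11e6, 11e6, 0.7)]
print(select_relay(relays, rate_src_dst=2e6))   # picks R1; R2 is too contended
```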

Keywords: cooperative communications, MAC protocol, relay node, WLAN

Procedia PDF Downloads 334
28005 Fabrication of Uniform Nanofibers Using Gas Dynamic Virtual Nozzle Based Microfluidic Liquid Jet System

Authors: R. Vasireddi, J. Kruse, M. Vakili, M. Trebbin

Abstract:

Here we present a gas dynamic virtual nozzle (GDVN) based microfluidic jetting device for the spinning of nano/microfibers. The device is fabricated by soft lithography techniques and is based on the principle of a GDVN for precise three-dimensional gas focusing of the spinning solution. The nozzle device is used to produce micro/nanofibers of a perfluorinated terpolymer (THV), which were collected on an aluminum substrate for scanning electron microscopy (SEM) analysis. The influences of air pressure, polymer concentration, flow rate and nozzle geometry on the fiber properties were investigated. It was revealed that surface properties are controlled by air pressure and polymer concentration, while the diameter and shape of the fibers are influenced mostly by the concentration of the polymer solution and the pressure. Alterations of the nozzle geometry had a negligible effect on the fiber properties; however, the jetting stability was affected. Round and flat fibers with differing surface properties, from craters and grooves to smooth surfaces, could be fabricated by controlling the above-mentioned parameters. Furthermore, the formation of surface roughness was attributed to the fast evaporation rate and the velocity (mis)match between the polymer solution jet and the surrounding air stream. The diameter of the fibers could be tuned from ~250 nm to ~15 µm. Because of the simplicity of the setup, the precise control of the fiber properties, access to biocompatible nanofiber fabrication and the easy scale-up of parallel channels for high throughput, this method offers significant benefits compared to existing solution-based fiber production methods.

Keywords: gas dynamic virtual nozzle (GDVN) principle, microfluidic device, spinning, uniform nanofibers

Procedia PDF Downloads 154
28004 Multi-Sender MAC Protocol Based on Temporal Reuse in Underwater Acoustic Networks

Authors: Dongwon Lee, Sunmyeng Kim

Abstract:

Underwater acoustic networks (UANs) have become a very active research area in recent years. Compared with terrestrial wireless networks, UANs are characterized by limited bandwidth, long propagation delay and high channel dynamics in acoustic modems, which pose challenges to the design of the medium access control (MAC) protocol. These characteristics severely affect network performance. In this paper, we study an MS-MAC (Multi-Sender MAC) protocol in order to improve network performance. The proposed protocol exploits temporal reuse by learning the propagation delays to neighboring nodes. A source node locally calculates the transmission schedules of its neighboring nodes and itself based on the propagation delays, so as to avoid collisions. Performance evaluation is conducted using simulation and confirms that the proposed protocol significantly outperforms the previous protocol in terms of throughput.
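
Temporal reuse means a node can time its transmission so that, accounting for propagation delay, its packet arrives at the receiver just after a neighbor's. A minimal sketch of such delay-aware scheduling follows; the delays, durations and ordering heuristic are illustrative assumptions, not the protocol's actual schedule computation.

```python
# Minimal sketch: stagger senders' start times, using known propagation
# delays, so packets arrive back-to-back at the receiver without colliding.

def schedule_senders(prop_delay_s, tx_duration_s, guard_s=0.05):
    """prop_delay_s: dict sender -> one-way propagation delay to receiver.
    Returns dict sender -> local transmission start time."""
    schedule, next_free_arrival = {}, 0.0
    # Serve senders in order of decreasing delay so early transmissions
    # from far nodes can overlap in the water with later, nearer ones.
    for sender in sorted(prop_delay_s, key=prop_delay_s.get, reverse=True):
        start = max(0.0, next_free_arrival - prop_delay_s[sender])
        arrival = start + prop_delay_s[sender]
        schedule[sender] = start
        next_free_arrival = arrival + tx_duration_s + guard_s
    return schedule

# Acoustic delays of a second or more are realistic under water.
delays = {"S1": 2.0, "S2": 0.5, "S3": 1.2}
print(schedule_senders(delays, tx_duration_s=0.4))
# Arrivals land at 2.0, 2.45 and 2.9 s: non-overlapping at the receiver.
```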

Keywords: acoustic channel, MAC, temporal reuse, UAN

Procedia PDF Downloads 351
28003 Gene Expression Meta-Analysis of Potential Shared and Unique Pathways Between Autoimmune Diseases Under anti-TNFα Therapy

Authors: Charalabos Antonatos, Mariza Panoutsopoulou, Georgios K. Georgakilas, Evangelos Evangelou, Yiannis Vasilopoulos

Abstract:

The extended tissue damage and severe clinical outcomes of autoimmune diseases, accompanied by the high annual costs to the overall health care system, highlight the need for an efficient therapy. Increasing knowledge of the pathophysiology of specific chronic inflammatory diseases, namely Psoriasis (PsO), Inflammatory Bowel Diseases (IBD) consisting of Crohn's disease (CD) and Ulcerative colitis (UC), and Rheumatoid Arthritis (RA), has provided insights into the underlying mechanisms that lead to the maintenance of the inflammation, such as Tumor Necrosis Factor alpha (TNF-α). Hence, anti-TNFα biological agents pose as an ideal therapeutic approach. Despite the efficacy of anti-TNFα agents, several clinical trials have shown that 20-40% of patients do not respond to treatment. Nowadays, high-throughput technologies have been recruited in order to elucidate the complex interactions in multifactorial phenotypes, with the most ubiquitous ones referring to transcriptome quantification analyses. In this context, a random effects meta-analysis of available gene expression cDNA microarray datasets was performed between responders and non-responders to anti-TNFα therapy in patients with IBD, PsO, and RA. Publicly available datasets were systematically searched from inception to 10th of November 2020 and selected for further analysis if they assessed the response to anti-TNFα therapy with clinical score indexes from inflamed biopsies. Specifically, 4 IBD (79 responders/72 non-responders), 3 PsO (40 responders/11 non-responders) and 2 RA (16 responders/6 non-responders) datasets were selected. After the separate pre-processing of each dataset, 4 separate meta-analyses were conducted: three disease-specific and a single combined meta-analysis on the disease-specific results. The MetaVolcano R package (v.1.8.0) was utilized for a random-effects meta-analysis through the Restricted Maximum Likelihood (REML) method. The top 1% of the most consistently perturbed genes in the included datasets was highlighted through the TopConfects approach while maintaining a 5% False Discovery Rate (FDR). Genes were considered Differentially Expressed (DEGs) if P ≤ 0.05, |log2(FC)| ≥ log2(1.25) and they were perturbed in at least 75% of the included datasets. Over-representation analysis was performed using Gene Ontology and Reactome Pathways for both up- and down-regulated genes in all 4 performed meta-analyses. Protein-protein interaction networks were also incorporated in the subsequent analyses with STRING v11.5 and Cytoscape v3.9. Disease-specific meta-analyses detected multiple distinct pro-inflammatory and immune-related down-regulated genes for each disease, such as NFKBIA, IL36, and IRAK1, respectively. Pathway analyses revealed unique and shared pathways between the diseases, such as Neutrophil Degranulation and Signaling by Interleukins. The combined meta-analysis unveiled 436 DEGs, 86 of which were up- and 350 down-regulated, confirming the aforementioned shared pathways and genes, as well as uncovering genes that participate in anti-inflammatory pathways, namely IL-10 signaling. The identification of key biological pathways and regulatory elements is imperative for the accurate prediction of a patient's response to biological drugs. Meta-analysis of such gene expression data could aid the challenging effort to unravel the complex interactions implicated in the response to anti-TNFα therapy in patients with PsO, IBD, and RA, as well as distinguish gene clusters and pathways that are altered across this heterogeneous phenotype.
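
The stated DEG criteria translate directly into a filter over the meta-analysis table. Below is a minimal pandas sketch; the column names are assumptions for illustration, not the MetaVolcano output schema.

```python
# Minimal sketch of the reported DEG filter: P <= 0.05,
# |log2(FC)| >= log2(1.25), perturbed in >= 75% of datasets.
import numpy as np
import pandas as pd

meta = pd.DataFrame({
    "gene":        ["NFKBIA", "IRAK1", "GENE3"],
    "p_value":     [0.001,    0.04,    0.20],
    "log2_fc":     [-0.9,     -0.4,    0.1],
    "n_perturbed": [4,        3,       1],    # datasets where gene is perturbed
    "n_datasets":  [4,        4,       4],
})

is_deg = (
    (meta["p_value"] <= 0.05)
    & (meta["log2_fc"].abs() >= np.log2(1.25))
    & (meta["n_perturbed"] / meta["n_datasets"] >= 0.75)
)
print(meta.loc[is_deg, "gene"].tolist())      # ['NFKBIA', 'IRAK1']
```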

Keywords: anti-TNFα, autoimmune, meta-analysis, microarrays

Procedia PDF Downloads 183
28002 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees

Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees for the optimization of information search processes for diagnostic images hosted on a cloud server. To analyze the performance of the server, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the loading and downloading of DICOM images, hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times of diagnostic images on the server. The results show that by using the metadata in decision trees, the search times are substantially improved, the computational resources are optimized and the request management of the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% in relation to sequential search, given that, when downloading a diagnostic image, false positives are avoided in the management and acquisition processes of said information. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
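
To make the idea concrete, here is a minimal sketch of a decision tree trained on image metadata to narrow a query to the right storage location instead of scanning sequentially. The metadata fields, values and shard labels are hypothetical.

```python
# Minimal sketch: a decision tree over DICOM metadata to route queries,
# versus a sequential scan over every stored image.
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier

# (modality, body_part, study_year) -> storage partition holding the image
records = [
    ("CT", "HEAD",    2017, "shard-1"), ("CT", "CHEST", 2017, "shard-1"),
    ("MR", "HEAD",    2018, "shard-2"), ("MR", "SPINE", 2018, "shard-2"),
    ("US", "ABDOMEN", 2019, "shard-3"), ("US", "HEART", 2019, "shard-3"),
]
X_raw = [[m, b, y] for m, b, y, _ in records]
y = [shard for *_, shard in records]

enc = OrdinalEncoder()
X = enc.fit_transform(X_raw)
tree = DecisionTreeClassifier().fit(X, y)

# A query consults the tree instead of scanning every stored image.
query = enc.transform([["MR", "HEAD", 2018]])
print(tree.predict(query))                 # ['shard-2']
```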

Keywords: cloud storage, decision trees, diagnostic image, search, telemedicine

Procedia PDF Downloads 204
28001 Blockchain’s Feasibility in Military Data Networks

Authors: Brenden M. Shutt, Lubjana Beshaj, Paul L. Goethals, Ambrose Kam

Abstract:

Communication security is of particular interest to military data networks. A relatively novel approach to network security is blockchain, a cryptographically secured distributed ledger with a decentralized consensus mechanism for data transaction processing. Recent advances in blockchain technology have proposed new techniques for both data validation and trust management, as well as different frameworks for managing dataflow. The purpose of this work is to test the feasibility of different blockchain architectures as applied to military command and control networks. Various architectures are tested through discrete-event simulation, and feasibility is determined based upon a blockchain design's ability to maintain long-term stable performance at industry standards of throughput, network latency, and security. This work proposes a consortium blockchain architecture with a computationally inexpensive consensus mechanism, one that leverages a Proof-of-Identity (PoI) concept and a reputation management mechanism.

Keywords: blockchain, consensus mechanism, discrete-event simulation, fog computing

Procedia PDF Downloads 139
28000 Performance Evaluation of Vertical Handover on Silom Line BTS

Authors: Silumpa Suboonsan, Suwat Pattaramalai

Abstract:

In this paper, the performance of internet usage with Vertical Handover (VHO) between a cellular network and a wireless local area network (WLAN) on the Silom line of the Bangkok Mass Transit System (BTS) is evaluated. In the evaluation model, there is a WLAN at every BTS station and there are cellular base stations along the BTS path. The maximum data rates are 7.2, 14.4, 42, and 100 Mbps for the cellular network and 54, 150, and 300 Mbps for the WLAN. The simulations are based on users using the internet, watching videos and browsing web pages, either on the BTS train from the first station to the last station (full-time usage) or on the BTS train travelling some number of stations (random time). The results show that the VHO system achieves much higher throughput than using only the cellular network when the data rate of the WLAN exceeds that of the cellular network. Lastly, the number of HD and Full HD videos that can be watched is higher with the VHO system during both regular hours and rush hour of BTS travel.

Keywords: vertical handover, WLAN, cellular, Silom line BTS

Procedia PDF Downloads 478
27999 A Memetic Algorithm for an Energy-Costs-Aware Flexible Job-Shop Scheduling Problem

Authors: Christian Böning, Henrik Prinzhorn, Eric C. Hund, Malte Stonis

Abstract:

In this article, the flexible job-shop scheduling problem is extended by considering energy costs, which arise owing to the power peak; further decision variables such as work in process and throughput time are incorporated into the objective function. This enables a production plan to be simultaneously optimized with respect to the actually arising energy and logistics costs. The resulting energy-costs-aware flexible job-shop scheduling problem (EFJSP) is described mathematically, and a memetic algorithm (MA) is presented as a solution. In the MA, the evolutionary process is supplemented with a local search. Furthermore, repair procedures are used in order to rectify any infeasible solutions that have arisen in the evolutionary process. The potential for lowering the actual costs of a production plan through consideration of energy consumption levels is highlighted.
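
The structure of a memetic algorithm, an evolutionary loop whose offspring are improved by local search and repaired if infeasible, can be sketched as follows. The schedule encoding, objective and repair rule are placeholders, not the paper's EFJSP formulation.

```python
# Skeleton of a memetic algorithm: a genetic loop supplemented with local
# search and a repair step. Encoding and cost function are placeholders.
import random

def evaluate(schedule):
    # Placeholder for an energy-peak + WIP + throughput-time objective.
    return sum((g - i % 3) ** 2 for i, g in enumerate(schedule))

def repair(schedule, n_machines=3):
    # Clamp genes into the feasible machine range (stand-in for the
    # repair of infeasible schedules).
    return [min(max(g, 0), n_machines - 1) for g in schedule]

def local_search(schedule):
    # First-improvement neighborhood: try nudging each gene.
    best, best_cost = schedule, evaluate(schedule)
    for i in range(len(schedule)):
        for delta in (-1, 1):
            cand = repair(schedule[:i] + [schedule[i] + delta] + schedule[i+1:])
            if (c := evaluate(cand)) < best_cost:
                best, best_cost = cand, c
    return best

def memetic(n_jobs=9, pop_size=20, generations=30):
    pop = [[random.randrange(3) for _ in range(n_jobs)] for _ in range(pop_size)]
    for _ in range(generations):
        parents = random.sample(pop, 2)
        cut = random.randrange(1, n_jobs)
        child = repair(parents[0][:cut] + parents[1][cut:])
        child = local_search(child)                 # the memetic ingredient
        pop.sort(key=evaluate)
        pop[-1] = child                             # replace worst individual
    return min(pop, key=evaluate)

print(evaluate(memetic()))
```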

Keywords: energy costs, flexible job-shop scheduling, memetic algorithm, power peak

Procedia PDF Downloads 346
27998 Utilizing Hybrid File Mapping for High-Performance I/O

Authors: Jaechun No

Abstract:

As the technology of NAND flash memory rapidly advances, the SSD is becoming an excellent alternative storage solution because of its high random I/O throughput and low power consumption. These SSD potentials have drawn great attention from IT enterprises that seek better I/O performance. However, the high SSD cost per capacity makes it less desirable to construct a large-scale storage subsystem composed solely of SSD devices. An alternative is to build a hybrid storage subsystem where both HDD and SSD devices are incorporated in an economical manner, while employing the strengths of both devices. This paper presents a hybrid file system, called hybridFS, that attempts to utilize the advantages of HDD and SSD devices and to provide a single, virtual address space by integrating both devices. HybridFS not only proposes an efficient implementation for file management in the hybrid storage subsystem but also suggests an experimental framework for making use of the excellent features of existing file systems. Several performance evaluations were conducted to verify the effectiveness and suitability of hybridFS.

Keywords: hybrid file mapping, data layout, hybrid device integration, extent allocation

Procedia PDF Downloads 507
27997 Quantitative Polymerase Chain Reaction Analysis of Phytoplankton Composition and Abundance to Assess Eutrophication: A Multi-Year Study in Twelve Large Rivers across the United States

Authors: Chiqian Zhang, Kyle D. McIntosh, Nathan Sienkiewicz, Ian Struewing, Erin A. Stelzer, Jennifer L. Graham, Jingrang Lu

Abstract:

Phytoplankton plays an essential role in freshwater aquatic ecosystems and is the primary group synthesizing organic carbon and providing food sources or energy to ecosystems. Therefore, the identification and quantification of phytoplankton are important for estimating and assessing ecosystem productivity (carbon fixation), water quality, and eutrophication. Microscopy is the current gold standard for identifying and quantifying phytoplankton composition and abundance. However, microscopic analysis of phytoplankton is time-consuming, has a low sample throughput, and requires deep knowledge and rich experience in microbial morphology to implement. To improve this situation, quantitative polymerase chain reaction (qPCR) was considered for phytoplankton identification and quantification. Using qPCR to assess phytoplankton composition and abundance, however, has not been comprehensively evaluated. This study focused on: 1) conducting a comprehensive performance comparison of qPCR and microscopy techniques in identifying and quantifying phytoplankton and 2) examining the use of qPCR as a tool for assessing eutrophication. Twelve large rivers located throughout the United States were evaluated using data collected from 2017 to 2019 to understand the relation between qPCR-based phytoplankton abundance and eutrophication. This study revealed that temporal variation of phytoplankton abundance in the twelve rivers was limited within years (from late spring to late fall) and among different years (2017, 2018, and 2019). Midcontinent rivers had moderately greater phytoplankton abundance than eastern and western rivers, presumably because midcontinent rivers were more eutrophic. The study also showed that qPCR- and microscope-determined phytoplankton abundance had a significant positive linear correlation (adjusted R² 0.772, p-value < 0.001). In addition, phytoplankton abundance assessed via qPCR showed promise as an indicator of the eutrophication status of those rivers, with oligotrophic rivers having low phytoplankton abundance and eutrophic rivers having (relatively) high phytoplankton abundance. This study demonstrated that qPCR could serve as an alternative tool to traditional microscopy for phytoplankton quantification and eutrophication assessment in freshwater rivers.
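
The reported agreement between the two methods is an ordinary linear fit; a minimal sketch of computing such a fit and its adjusted R² follows, with synthetic numbers rather than the study's data (which reported adjusted R² = 0.772).

```python
# Minimal sketch of the qPCR-vs-microscopy comparison: an ordinary
# least-squares fit and its adjusted R^2. The numbers are synthetic.
import numpy as np

qpcr  = np.array([2.1, 3.0, 3.8, 4.6, 5.2, 6.1])   # log10 gene copies/mL
micro = np.array([1.8, 2.9, 3.5, 4.9, 5.0, 6.3])   # log10 cells/mL

slope, intercept = np.polyfit(qpcr, micro, 1)
pred = slope * qpcr + intercept
ss_res = np.sum((micro - pred) ** 2)
ss_tot = np.sum((micro - micro.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

n, p = len(qpcr), 1                                 # samples, predictors
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"slope={slope:.2f}, adjusted R^2={adj_r2:.3f}")
```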

Keywords: phytoplankton, eutrophication, river, qPCR, microscopy, spatiotemporal variation

Procedia PDF Downloads 101
27996 Meanings and Concepts of Standardization in Systems Medicine

Authors: Imme Petersen, Wiebke Sick, Regine Kollek

Abstract:

In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expression, metabolic pathways and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the emerging problems of systematizing, standardizing and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization, which rather emphasizes the dynamics, contexts and negotiations of standard operating procedures. Based on empirical work on research consortia in Germany that explore the molecular profile of diseases to establish systems medical approaches in the clinic, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and what consequences for knowledge production (e.g., modeling) arise from them. Hence, different concepts and meanings of standardization are explored to get a deeper insight into standard operating procedures, not only in systems medicine but also beyond.

Keywords: data, science and technology studies (STS), standardization, systems medicine

Procedia PDF Downloads 342
27995 Assessing Significance of Correlation with Binomial Distribution

Authors: Vijay Kumar Singh, Pooja Kushwaha, Prabhat Ranjan, Krishna Kumar Ojha, Jitendra Kumar

Abstract:

Present-day high-throughput genomic technologies, NGS and microarrays, are producing large volumes of data that require improved analysis methods to make sense of the data. The correlation between genes and samples has been regularly used to gain insight into many biological phenomena including, but not limited to, co-expression/co-regulation, gene regulatory networks, clustering and pattern identification. However, the presence of outliers and violation of the assumptions underlying Pearson correlation are frequent and may distort the actual correlation between genes, leading to spurious conclusions. Here, we report a method to measure the strength of association between genes. The method assumes that the expression values of a gene are Bernoulli random variables whose outcome depends on the sample being probed. The method considers two genes as uncorrelated if the number of samples with the same outcome for both genes (Ns) is equal to the expected number (Es). The extent of correlation depends on how far Ns deviates from Es. The method does not assume normality for the parent population, is fairly unaffected by the presence of outliers, can be applied to qualitative data, and uses the binomial distribution to assess the significance of association. At this stage, we would not claim superiority of the method over other existing correlation methods, but our method could be another way of calculating correlation in addition to existing methods. The method uses the binomial distribution, which has not been used for this purpose so far, to assess the significance of association between two variables. We are evaluating the performance of our method on NGS/microarray data, which is noisy and riddled with outliers, to see if our method can differentiate between spurious and actual correlation. While working with the method, it has not escaped our notice that the method could also be generalized to measure the association of more than two variables, which has proven difficult with existing methods.
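
The test can be sketched compactly: dichotomize each gene, count agreements Ns, compute the agreement count Es expected under independence, and evaluate Ns against a binomial null. Below is a minimal illustration; dichotomizing at the median is an assumption made here for exposition, not a choice prescribed by the authors.

```python
# Minimal sketch: treat each gene's dichotomized expression as Bernoulli,
# count samples where both genes agree (Ns), compare with the count
# expected under independence (Es), and assess significance with the
# binomial distribution.
import numpy as np
from scipy.stats import binomtest

def binomial_association(x, y):
    a = x > np.median(x)                     # dichotomize gene expression
    b = y > np.median(y)
    n = len(a)
    ns = int(np.sum(a == b))                 # samples with the same outcome

    p1, p2 = a.mean(), b.mean()
    p_agree = p1 * p2 + (1 - p1) * (1 - p2)  # P(agreement) if independent
    es = n * p_agree
    pval = binomtest(ns, n, p_agree).pvalue
    return ns, es, pval

rng = np.random.default_rng(0)
g1 = rng.normal(size=60)
g2 = g1 + rng.normal(scale=0.5, size=60)     # correlated with g1
print(binomial_association(g1, g2))          # Ns well above Es, small p-value
```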

Keywords: binomial distribution, correlation, microarray, outliers, transcriptome

Procedia PDF Downloads 416
27994 A Rapid and Greener Analysis Approach Based on a Carbon Fiber Column System and MS Detection for Urine Metabolomic Study after Oral Administration of Food Supplements

Authors: Zakia Fatima, Liu Lu, Donghao Li

Abstract:

The analysis of biological fluid metabolites holds significant importance in various areas, such as medical research, food science, and public health. Investigating the levels and distribution of nutrients and their metabolites in biological samples allows researchers and healthcare professionals to determine nutritional status, find hypovitaminosis or hypervitaminosis, and monitor the effectiveness of interventions such as dietary supplementation. Moreover, the analysis of nutrient metabolites provides insight into their metabolism, bioavailability, and physiological processes, aiding in the clarification of their health roles. Hence, the exploration of a distinct, efficient, eco-friendly, and simpler methodology is of great importance for evaluating the metabolic content of complex biological samples. In this work, a green and rapid analytical method based on an automated online two-dimensional microscale carbon fiber/activated carbon fiber fractionation system and time-of-flight mass spectrometry (2DμCFs-TOF-MS) was used to evaluate metabolites in urine samples after oral administration of food supplements. The automated 2DμCFs instrument consisted of a microcolumn system with bare carbon fibers and modified carbon fiber coatings. Carbon fibers and modified carbon fibers exhibit different surface characteristics and accordingly retain different compounds. Three kinds of mobile-phase solvents were used to elute compounds of varied chemical heterogeneity. The 2DμCFs separation system is able to effectively separate different compounds based on their polarity and solubility characteristics. No complicated sample preparation method was used prior to analysis, which makes the strategy more eco-friendly, practical, and faster than traditional analysis methods. For optimum analysis results, the mobile phase composition, flow rate, and sample diluent were optimized. The screening covered water-soluble vitamins, fat-soluble vitamins, and amino acids; 22 vitamin metabolites and 11 vitamin metabolic pathway-related metabolites were found in the urine samples. All water-soluble vitamins except vitamin B12 and vitamin B9 were detected in the urine samples; however, no fat-soluble vitamin was detected, and only one metabolite of vitamin A was found. The comparison with a blank urine sample showed a considerable difference in metabolite content; for example, vitamin metabolites and three related metabolites were not detected in blank urine. The complete single-run screening was carried out in 5.5 minutes with minimal consumption of toxic organic solvent (0.5 ml). The analytical method was evaluated in terms of greenness, with an analytical greenness (AGREE) score of 0.72. The method's practicality was investigated using the Blue Applicability Grade Index (BAGI) tool, obtaining a score of 77. The findings in this work illustrate that the 2DμCFs-TOF-MS approach could emerge as a fast, sustainable, practical, high-throughput, and promising analytical tool for the screening and accurate detection of various metabolites, pharmaceuticals, and ingredients in dietary supplements as well as biological fluids.

Keywords: metabolite analysis, sustainability, carbon fibers, urine

Procedia PDF Downloads 29
27993 On the Other Side of Shining Mercury: In Silico Prediction of Cold Stabilizing Mutations in Serine Endopeptidase from Bacillus lentus

Authors: Debamitra Chakravorty, Pratap K. Parida

Abstract:

Cold-adapted proteases enhance wash performance in low-temperature laundry, resulting in a reduction in energy consumption and wear of textiles, and are also used in the dehairing process in leather industries. The possible drawback of using cold-adapted proteases is their instability at higher temperatures, which gives them low shelf lives; proteases with broad temperature stability are therefore required. Attempts to engineer cold-adapted proteases were made previously by directed evolution and random mutagenesis. The lacuna is that the time, capital, and labour involved in obtaining these variants are very demanding and challenging. Therefore, rational engineering for cold stability without compromising an enzyme's optimum pH and temperature for activity is the current requirement. In this work, mutations were rationally designed for Savinase from Bacillus lentus with the aid of a high-throughput computational methodology of network analysis, evolutionary conservation scores, and molecular dynamics simulations, with the intention of rendering the mutants cold-stable without affecting their temperature and pH optimum for activity. Further, an attempt was made to rationally incorporate a mutation into the most stable mutant obtained by this method to introduce oxidative stability; such enzymes are desired in detergents with bleaching agents. In silico analysis, performing 300 ns molecular dynamics simulations at 5 different temperatures, revealed three mutants that were better in cold stability than the wild-type Savinase from Bacillus lentus. Conclusively, this work shows that cold adaptation without losing optimum temperature and pH stability, and additionally stability against oxidative damage, can be rationally designed by in silico enzyme engineering. The key findings of this work are: first, the in silico data for H5 (cold-stable Savinase), used as a control in this work, corroborated its reported wet-lab temperature stability data; second, three cold-stable mutants of Savinase from Bacillus lentus were rationally identified; and last, a mutation that stabilizes Savinase against oxidative damage was additionally identified.

Keywords: cold stability, molecular dynamics simulations, protein engineering, rational design

Procedia PDF Downloads 140
27992 Cas9-Assisted Direct Cloning and Refactoring of a Silent Biosynthetic Gene Cluster

Authors: Peng Hou

Abstract:

Natural products produced by marine bacteria serve as an immense reservoir for anti-infective drugs and therapeutic agents. Nowadays, heterologous expression of gene clusters of interest has been widely adopted as an effective strategy for natural product discovery. Briefly, the heterologous expression flowchart is: biosynthetic gene cluster identification, pathway construction and expression, and product detection. However, gene cluster capture using the traditional transformation-associated recombination (TAR) protocol is inefficient (0.5% positive colony rate). To make things worse, most putative new natural products are only predicted by bioinformatics analysis such as antiSMASH, and their corresponding biosynthetic pathways are either not expressed or expressed at very low levels under laboratory conditions. Those setbacks have inspired us to seek new technologies to efficiently edit and refactor biosynthetic gene clusters. Recently, two cutting-edge techniques have attracted our attention: CRISPR-Cas9 and Gibson Assembly. We have pretreated Brevibacillus laterosporus strain genomic DNA with CRISPR-Cas9 nucleases that specifically generate breaks near the gene cluster of interest. This trial increased the efficiency of gene cluster capture to 9%. Moreover, using Gibson Assembly to add or delete certain operons and tailoring enzymes regardless of end compatibility, the silent construct (~80 kb) has been successfully refactored into an active one, yielding a series of expected analogs. With the appearance of these novel molecular tools, we are confident that the development of a mature high-throughput pipeline for DNA assembly, transformation, and product isolation and identification is no longer a daydream for marine natural product discovery.

Keywords: biosynthesis, CRISPR-Cas9, DNA assembly, refactor, TAR cloning

Procedia PDF Downloads 283
27991 Automated Buffer Box Assembly Cell Concept for the Canadian Used Fuel Packing Plant

Authors: Dimitrie Marinceu, Alan Murchison

Abstract:

The Canadian Used Fuel Container (UFC) is a mid-size, hemispherical-headed, copper-coated steel container measuring 2.5 meters in length and 0.5 meters in diameter and containing 48 used fuel bundles. The contained used fuel produces significant gamma radiation, requiring automated processes to complete the assembly. The design throughput of 2,500 UFCs per year places constraints on equipment and hot cell design with regard to repeatability, speed of processing, robustness and recovery from upset conditions. After UFC assembly, the UFC is inserted into a Buffer Box (BB). The BB is made from adequately pre-shaped lower and upper blocks of Highly Compacted Bentonite (HCB) material. The blocks effectively sandwich the UFC between them after assembly. This paper identifies one possible approach for the BB automatic assembly cell and its processes. Automation of the BB assembly will have a significant positive impact on nuclear safety, quality, productivity, and reliability.

Keywords: used fuel packing plant, automatic assembly cell, used fuel container, buffer box, deep geological repository

Procedia PDF Downloads 275
27990 Experimental Research on Neck Thinning Dynamics of Droplets in Cross Junction Microchannels

Authors: Yilin Ma, Zhaomiao Liu, Xiang Wang, Yan Pang

Abstract:

Microscale droplets play an increasingly important role in various applications, including medical diagnostics, material synthesis, chemical engineering, and cell research, owing to their high surface-to-volume ratio and tiny scale, which can significantly improve reaction rates, enhance heat transfer efficiency, enable high-throughput parallel studies, and reduce reagent usage. As a mature technique for manipulating small amounts of liquid, droplet microfluidics achieves precise control of droplet parameters such as size, uniformity and structure, and has thus been widely adopted in engineering and scientific research in multiple fields. The necking processes of droplets in cross-junction microchannels are experimentally and theoretically investigated, and the dynamic mechanisms of neck thinning in two different regimes are revealed. According to the evolutions of the minimum neck width and the thinning rate, the necking process is further divided into different stages, and the main driving force during each stage is identified. The effects of the flow rates and the cross-sectional aspect ratio on the necking process, as well as the neck profile at different stages, are provided in detail. The distinct features of the two regimes in the squeezing stage are well captured by theoretical estimations of the effective flow rate, and the variations of the actual flow rates in different channels are reasonably reflected by the channel width ratio. In the collapsing stage, a quantitative relation between the minimum neck width and the remaining time is constructed to identify the physical mechanism.

Keywords: cross junction, neck thinning, force analysis, inertial mechanism

Procedia PDF Downloads 110
27989 Development of a Methodology for Surgery Planning and Control: A Management Approach to Handle the Conflict of High Utilization and Low Overtime

Authors: Timo Miebach, Kirsten Hoeper, Carolin Felix

Abstract:

In times of competitive pressure and demographic change, hospitals have to reconsider their strategies as companies. Because operations are one of the main sources of income and, at the same time, one of the primary cost drivers, a process-oriented approach and an efficient use of resources seem to be the right way to achieve a consistent market position. Efficient operating room occupancy planning is thus an important variable for the success and continued existence of these institutions. A high utilization of resources is essential: a very high, but nevertheless sensible, capacity-oriented utilization of working systems can be realized by avoiding downtimes and by thoughtful occupancy planning. This engineering approach should help hospitals reach their break-even point. The first aim is to establish a strategy point that can be used for the generation of a planned throughput time. The second is to facilitate surgery planning and control and to implement it accurately through the generation of time modules. More than 100,000 data records of the Hannover Medical School were analyzed. The data records contain information about the type of operation conducted, the duration of the individual process steps, and other organization-specific data such as the operating room. Based on the aforementioned database, a generally valid model was developed to define a strategy point that takes the conflict between high capacity utilization and low overtime into account. Furthermore, time modules were generated, which allow simplified and flexible surgery planning and control for the operating room manager. The time modules make it possible to reduce the high average idle times of the operating rooms and to minimize the idle-time spread.

Keywords: capacity, operating room, surgery planning and control, utilization

Procedia PDF Downloads 253
27988 Design of a Telemetry, Tracking, and Command Radio-Frequency Receiver for Small Satellites Based on Commercial Off-The-Shelf Components

Authors: A. Lovascio, A. D’Orazio, V. Centonze

Abstract:

For several years now, the aerospace industry has been developing more and more small satellites for Low-Earth Orbit (LEO) missions. Such satellites have low manufacturing and launch costs, since their size and weight are smaller than those of other types of satellites. However, because of size limitations, small satellites need integrated electronic equipment based on digital logic. Moreover, LEO missions require telecommunication modules with high throughput to transmit a large amount of data to earth in a short time. In order to meet such requirements, in this paper we propose a Telemetry, Tracking & Command module optimized through the use of Commercial Off-The-Shelf components. The proposed approach exploits the great flexibility offered by these components to reduce costs and optimize performance. The method has been applied in detail to the design of the front-end receiver, which has a low noise figure (1.5 dB) and low DC power consumption (smaller than 2 W). Such performance is particularly attractive since it allows fulfilling the stringent energy budget constraints that are typical of LEO small platforms.

Keywords: COTS, LEO, small-satellite, TT&C

Procedia PDF Downloads 131
27987 Metagenomics-Based Molecular Epidemiology of Viral Diseases

Authors: Vyacheslav Furtak, Merja Roivainen, Olga Mirochnichenko, Majid Laassri, Bella Bidzhieva, Tatiana Zagorodnyaya, Vladimir Chizhikov, Konstantin Chumakov

Abstract:

Molecular epidemiology and environmental surveillance are parts of a rational strategy to control infectious diseases. They have been widely used in the worldwide campaign to eradicate poliomyelitis, which otherwise would be complicated by the inability to rapidly respond to outbreaks and determine sources of the infection. The conventional scheme involves isolation of viruses from patients and the environment, followed by their identification and nucleotide sequence analysis to determine phylogenetic relationships. This is a tedious and time-consuming process that yields definitive results when it may be too late to implement countermeasures. Because of the difficulty of high-throughput full-genome sequencing, most such studies are conducted by sequencing only capsid genes or parts of them. Therefore, important information about the contribution of other parts of the genome, and of inter- and intra-species recombination, to viral evolution is not captured. Here we propose a new approach based on rapid concentration of sewage samples with tangential flow filtration, followed by deep sequencing and reconstruction of the nucleotide sequences of the viruses present in the samples. The entire nucleic acid content of each sample is sequenced, thus preserving in digital format the complete spectrum of viruses. A set of rapid algorithms was developed to separate deep-sequencing reads into discrete populations corresponding to each virus, assemble them into full-length consensus contigs, and generate a complete profile of the sequence heterogeneities in each of them. This provides an effective approach to studying the molecular epidemiology and evolution of natural viral populations.
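
The consensus-building step can be illustrated in miniature: once reads have been binned to a single virus and aligned to a common coordinate system, the consensus is a per-position majority vote and the heterogeneity profile is the per-position base frequencies. The sketch below assumes binning and alignment have already been done; it is not the authors' actual algorithm.

```python
# Minimal sketch of consensus and heterogeneity profiling from reads
# already binned to one virus and aligned to a common coordinate.
from collections import Counter

def consensus_and_profile(aligned_reads):
    """aligned_reads: list of (start_position, sequence) tuples."""
    columns = {}                            # position -> Counter of bases
    for start, seq in aligned_reads:
        for offset, base in enumerate(seq):
            columns.setdefault(start + offset, Counter())[base] += 1

    consensus, profile = [], {}
    for pos in sorted(columns):
        counts = columns[pos]
        consensus.append(counts.most_common(1)[0][0])
        total = sum(counts.values())
        profile[pos] = {b: c / total for b, c in counts.items()}
    return "".join(consensus), profile

reads = [(0, "ACGTAC"), (2, "GTACGT"), (4, "ACGTTT"), (4, "AGGTTT")]
cons, prof = consensus_and_profile(reads)
print(cons)          # majority-vote consensus: ACGTACGTTT
print(prof[5])       # heterogeneity at position 5: {'C': 0.75, 'G': 0.25}
```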

Keywords: poliovirus, eradication, environmental surveillance, laboratory diagnosis

Procedia PDF Downloads 282
27986 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker

Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán

Abstract:

The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, encoding methods such as one-hot encoding or k-mers have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine whether they can provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, were used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes; specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods. The performance of the models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies vary between encoding methods by up to approximately 15%, with the Fourier transform obtaining the best results for the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
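
Three of the encodings compared here can be sketched in a few lines. The numeric base mapping used for the Fourier variant below (A/C/G/T mapped to 0-3) is one common choice, assumed for illustration and not necessarily the mapping used in this work.

```python
# Minimal sketch of three DNA encodings: one-hot, k-mer counts, and the
# magnitude spectrum of a Fourier transform over a numeric base mapping.
import numpy as np
from itertools import product

BASES = "ACGT"

def one_hot(seq):
    m = np.zeros((len(seq), 4))
    for i, b in enumerate(seq):
        m[i, BASES.index(b)] = 1.0
    return m.ravel()                       # flatten for classical classifiers

def kmer_counts(seq, k=3):
    vocab = ["".join(p) for p in product(BASES, repeat=k)]
    counts = dict.fromkeys(vocab, 0)
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    return np.array([counts[v] for v in vocab], dtype=float)

def fourier_features(seq):
    numeric = np.array([BASES.index(b) for b in seq], dtype=float)
    return np.abs(np.fft.rfft(numeric))    # magnitude spectrum

seq = "ACGTGACCTGAAACGT"
print(one_hot(seq).shape, kmer_counts(seq).shape, fourier_features(seq).shape)
```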

Keywords: DNA encoding, machine learning, Fourier transform, wavelet transform

Procedia PDF Downloads 28
27985 Smart Production Planning: The Case of Aluminium Foundry

Authors: Samira Alvandi

Abstract:

In the context of the circular economy, production planning aims to eliminate waste and emissions and maximize resource efficiency. Historically, production planning has been challenged by arrays of uncertainty and complexity arising from the interdependence and variability of products, processes, and systems. Manufacturers worldwide are facing new challenges in tackling various environmental issues such as climate change, resource depletion, and land degradation. In managing the inherent complexity and uncertainty while maintaining profitability, the manufacturing sector is in need of a holistic framework that supports energy efficiency and carbon emission reduction schemes. The proposed framework addresses the current challenges and integrates simulation modeling with optimization for finding the optimal machine-job allocation that maximizes throughput while minimizing total energy consumption and lead time. The aluminium refinery facility in western Sydney, Australia, is used as an exemplar to validate the proposed framework.
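
One way to express such a machine-job allocation is as a small mixed-integer program with a weighted objective. The sketch below uses PuLP with illustrative processing times, energy figures and weights; it is a stand-in for the framework's simulation-optimization model, not a reproduction of it.

```python
# Minimal sketch: machine-job allocation as a weighted single objective
# (energy + makespan as a lead-time proxy), solved as a MILP with PuLP.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value, PULP_CBC_CMD

jobs, machines = ["J1", "J2", "J3", "J4"], ["M1", "M2"]
proc_time = {("J1","M1"): 4, ("J1","M2"): 6, ("J2","M1"): 3, ("J2","M2"): 2,
             ("J3","M1"): 5, ("J3","M2"): 4, ("J4","M1"): 2, ("J4","M2"): 3}
energy    = {(j, m): 1.5 * t if m == "M1" else 1.0 * t
             for (j, m), t in proc_time.items()}   # M1 is the hungrier machine

x = LpVariable.dicts("assign", (jobs, machines), cat="Binary")
makespan = LpVariable("makespan", lowBound=0)

prob = LpProblem("allocation", LpMinimize)
w_energy, w_time = 1.0, 2.0
prob += w_energy * lpSum(energy[j, m] * x[j][m] for j in jobs for m in machines) \
        + w_time * makespan
for j in jobs:                              # each job runs on exactly one machine
    prob += lpSum(x[j][m] for m in machines) == 1
for m in machines:                          # machine load bounds the makespan
    prob += lpSum(proc_time[j, m] * x[j][m] for j in jobs) <= makespan

prob.solve(PULP_CBC_CMD(msg=0))
print({j: next(m for m in machines if value(x[j][m]) > 0.5) for j in jobs})
```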

Keywords: smart production planning, simulation-optimisation, energy aware capacity planning, energy intensive industries

Procedia PDF Downloads 77
27984 A Hybrid Derivative-Free Optimization Method for Pass Schedule Calculation in Cold Rolling Mill

Authors: Mohammadhadi Mirmohammadi, Reza Safian, Hossein Haddad

Abstract:

This paper presents an innovative solution to a complex multi-objective optimization problem, as part of efforts toward maximizing rolling mill throughput and minimizing processing costs in tandem cold rolling. This computational-intelligence-based optimization has been applied to the rolling schedules of a tandem cold rolling mill. The method involves the combination of two derivative-free optimization procedures in the form of nested loops. The first optimization loop is based on the Improving Hit-and-Run method, which focuses on the balance of power, force and reduction distribution in rolling schedules. The second loop is a real-coded genetic-algorithm-based optimization procedure which optimizes energy consumption and productivity. An experimental result of the application to a five-stand tandem cold rolling mill is presented.
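
The Improving Hit-and-Run step in the inner loop is simple to sketch: pick a uniformly random direction, take a random step along it, and accept the candidate only if the objective improves. The quadratic objective below is a placeholder for the mill's power/force/reduction balance criterion, used only to make the sketch runnable.

```python
# Minimal sketch of an Improving Hit-and-Run iteration.
import numpy as np

def improving_hit_and_run(objective, x0, bounds, iters=2000, seed=1):
    rng = np.random.default_rng(seed)
    x, fx = np.array(x0, dtype=float), objective(x0)
    lo, hi = bounds
    for _ in range(iters):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)              # uniform random direction
        step = rng.uniform(-1.0, 1.0)
        cand = np.clip(x + step * d, lo, hi)
        if (fc := objective(cand)) < fx:    # "improving": accept only if better
            x, fx = cand, fc
    return x, fx

# Placeholder objective: distance from a target reduction distribution.
target = np.array([0.32, 0.28, 0.22, 0.12, 0.06])
obj = lambda r: float(np.sum((np.asarray(r) - target) ** 2))
print(improving_hit_and_run(obj, x0=[0.2] * 5, bounds=(0.0, 0.5)))
```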

Keywords: derivative-free optimization, Improving Hit and Run method, real-coded genetic algorithm, rolling schedules of tandem cold rolling mill

Procedia PDF Downloads 700
27983 Single Cell and Spatial Transcriptomics: A Beginner's Viewpoint from the Conceptual Pipeline

Authors: Leo Nnamdi Ozurumba-Dwight

Abstract:

Messenger ribonucleic acid (mRNA) molecules carry protein-coding information. These protein-encoding mRNA molecules (which collectively constitute the transcriptome), when analyzed by RNA sequencing (RNAseq), unveil the nature of gene expression in the RNA. The obtained gene expression provides clues about cellular traits and their dynamics. These can be studied in relation to function and responses. RNAseq is a practical concept in genomics, as it enables the detection and quantitative analysis of mRNA molecules. Single cell and spatial transcriptomics both present avenues for exploring the genomic characteristics of single cells and pooled cells in disease conditions such as cancer, autoimmune diseases and hematopoietic diseases, among others, from investigated biological tissue samples. Single cell transcriptomics permits a direct assessment of each building unit of tissues (the cell) during diagnosis and molecular gene expression studies. A typical technique to achieve this is single-cell RNA sequencing (scRNAseq), which enables high-throughput gene expression studies. However, this technique generates gene expression data for many cells while lacking the cells' positional coordinates within the tissue. As science is developmental, the use of complementary pre-established tissue reference maps built with molecular and bioinformatics techniques has innovatively sprung forth and is now used to resolve this setback, producing both levels of data in one shot of scRNAseq analysis. This is an emerging conceptual approach in methodology for integrative and progressively dependable transcriptomics analysis. It can support in-situ analysis for a better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases and for therapeutic targets in drug development, and expose the nature of cell-to-cell interactions. These are vital genomic signatures and characterizations for clinical applications. Over the past decades, RNAseq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. On the other side, spatial transcriptomics operates at the tissue level and is utilized to study biological specimens with heterogeneous features. It exposits the gross identity of the investigated mammalian tissues, which can then be used to study cell differentiation, track cell lineage trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to build up the genomic signatures assayed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, with their differing cell-level scope and avenues for appropriate resolution, the study of gene expression from mRNA molecules has become interesting, progressive and developmental, helping to tackle health challenges head-on.

Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression

Procedia PDF Downloads 123
27982 The Scenario Analysis of Shale Gas Development in China by Applying Natural Gas Pipeline Optimization Model

Authors: Meng Xu, Alexis K. H. Lau, Ming Xu, Bill Barron, Narges Shahraki

Abstract:

As an emerging unconventional energy source, shale gas has been an economically viable step towards a cleaner energy future in the U.S. China also has shale resources that are estimated to be potentially the largest in the world. In addition, China has an enormous unmet demand for a clean alternative to substitute for coal. Nonetheless, the geological complexity of China's shale basins and issues of water scarcity potentially impose serious constraints on shale gas development in China. Further, even if China could replicate the U.S. shale gas boom to a significant degree, it faces the problem of transporting the gas efficiently overland with its pipeline network's limited throughput capacity and coverage. The aim of this study is to identify the potential bottlenecks in China's gas transmission network, as well as to examine how shale gas development affects particular supply locations and demand centers. We examine this through application of three scenarios of projected domestic shale gas supply by 2020: optimistic, medium and conservative, taking as references the International Energy Agency's (IEA's) projections and China's shale gas development plans. Separately, we project gas demand at the provincial level, since shale gas will have a more significant impact regionally than nationally. To quantitatively assess each shale gas development scenario, we formulated a gas pipeline optimization model. We used ArcGIS to generate the connectivity parameters and pipeline segment lengths; other parameters are collected from provincial "twelfth five-year" plans and the "China Oil and Gas Pipeline Atlas". The multi-objective optimization model uses GAMS and Matlab. It aims to minimize the demand that cannot be met, while simultaneously seeking to minimize total gas supply and transmission costs. The results indicate that, even if the primary objective is to meet the projected gas demand rather than to minimize cost, there is a shortfall of 9% in meeting total demand under the medium scenario. Comparing the results between the optimistic and medium shale gas supply scenarios, almost half of the shale gas produced in Sichuan province and Chongqing would not be able to be transmitted out by pipeline. On the demand side, the gas demand gaps of Henan province and Shanghai could be filled by as much as 82% and 39%, respectively, with increased shale gas supply. To conclude, the pipeline network in China is currently not sufficient to meet the projected natural gas demand in 2020 under the medium and optimistic scenarios, indicating the need for substantial capacity expansion of parts of the existing network and the importance of constructing new pipelines from particular supply sites to demand sites. If the pipeline constraint were overcome, the gas demand gaps of Beijing, Shanghai, Jiangsu and Henan could potentially be filled, and China could thereby reduce its dependency on LNG imports by almost 25% under the optimistic scenario.
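
The model's core trade-off, first meet demand and then minimize supply and transmission cost, can be sketched as a linear program in which unmet demand carries a dominating penalty. The toy network, capacities, costs and demands below are illustrative assumptions, not the study's GAMS/Matlab model or data.

```python
# Minimal sketch with scipy's linprog: heavily penalize unmet demand,
# then minimize transmission cost, on a toy two-supply / two-demand network.
import numpy as np
from scipy.optimize import linprog

# Arcs: (supply, demand, capacity, unit_cost)
arcs = [("S1", "D1", 60, 1.0), ("S1", "D2", 20, 3.0),
        ("S2", "D1", 10, 2.5), ("S2", "D2", 40, 1.2)]
supply = {"S1": 70, "S2": 45}
demand = {"D1": 65, "D2": 55}
BIG_M = 1e4                                   # unmet demand dominates cost

n = len(arcs)
# Variables: n arc flows, then one unmet-demand slack per demand node.
c = np.array([cost for *_, cost in arcs] + [BIG_M] * len(demand))

A_eq, b_eq = [], []
for k, d in enumerate(demand):                # inflow + unmet == demand
    row = [1.0 if a[1] == d else 0.0 for a in arcs] + [0.0] * len(demand)
    row[n + k] = 1.0
    A_eq.append(row); b_eq.append(demand[d])

A_ub = [[1.0 if a[0] == s else 0.0 for a in arcs] + [0.0] * len(demand)
        for s in supply]                      # outflow <= supply
b_ub = list(supply.values())

bounds = [(0, cap) for *_, cap, _ in arcs] + [(0, None)] * len(demand)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
flows, unmet = res.x[:n], res.x[n:]
print("unmet demand:", dict(zip(demand, unmet.round(1))))  # reveals the shortfall
```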

Keywords: energy policy, energy systematic analysis, scenario analysis, shale gas in China

Procedia PDF Downloads 288
27981 Novel Ultrasensitive Point of Care Device for Diagnosis of Human Schistosomiasis Mansoni

Authors: Ibrahim Aly, Waleed Elawamy, Hanan Taher, Amira Matar

Abstract:

Schistosomiasis is an infection with blood flukes of the genus Schistosoma, acquired transcutaneously by swimming or wading in contaminated freshwater. The present study proposes an ultra-sensitive, field-friendly, high-throughput rapid immunochromatography diagnostic device for the accurate detection of asymptomatic parasite carriers in schistosomiasis pre-elimination settings. To assess the diagnostic potential of the rapid device, 50 blood samples from patients with schistosomiasis mansoni, 29 from patients with other proven parasitic diseases, and 25 blood samples from healthy individuals as negative controls were used. The sensitivity of the quantitative antigen-capture nano-ELISA was 82% and its specificity 87.1%, whereas the sensitivity of the nano dot-ELISA was 86% and its specificity 90.7%. The sensitivity of the diagnostic device was 78% and its specificity 85.2%, with a PPV and NPV of 86.2% and 83.1%, respectively. The point-of-care device performed well for the diagnosis of low-intensity infections: it was able to identify 19 out of 25 (76%) individuals with ⩽7 eggs, 10 out of 14 (71.4%) individuals with 11–99 eggs, and 100% of individuals with 100–399 eggs.
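
The four reported figures are linked by the standard 2x2 confusion-matrix formulas, sketched below. The counts are assumptions chosen to reproduce the reported sensitivity (78%) and specificity (85.2%) for 50 cases and 54 controls; the PPV and NPV printed here follow from this assumed case mix, so they differ slightly from the paper's reported values, which depend on its actual tabulation.

```python
# Minimal sketch: sensitivity, specificity, PPV and NPV from a 2x2
# confusion matrix. Counts below are assumptions, not the paper's tables.

def diagnostic_metrics(tp, fn, fp, tn):
    sens = tp / (tp + fn)       # true positives among the diseased
    spec = tn / (tn + fp)       # true negatives among the healthy
    ppv  = tp / (tp + fp)       # positive predictive value
    npv  = tn / (tn + fn)       # negative predictive value
    return sens, spec, ppv, npv

# 50 schistosomiasis cases, 54 controls (29 other parasitoses + 25 healthy):
# 39/50 detected (78% sensitivity), 46/54 correctly negative (85.2%).
sens, spec, ppv, npv = diagnostic_metrics(tp=39, fn=11, fp=8, tn=46)
print(f"sens={sens:.1%} spec={spec:.1%} ppv={ppv:.1%} npv={npv:.1%}")
```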

Keywords: schistosomiasis, immunochromatography, nano-dot-ELISA, diagnostic device

Procedia PDF Downloads 76