Search results for: real number sequences
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14605

14365 Theoretical Approach and Proof of Concept Implementation of Adaptive Partition Scheduling Module for Linux

Authors: Desislav Andreev, Veselin Stanev

Abstract:

The Linux operating system continues to gain popularity with every passing year. This is due to its open-source license and the great number of distributions covering users' needs. At first glance it seems that Linux can be integrated into every type of system: it is already present in personal computers, smartphones and even in embedded systems like the Raspberry Pi. However, Linux still does not meet the performance and security requirements to run effectively in a real-time system. Real-time systems are strictly time-constrained: their processes have to execute and finish within fixed time intervals. The Completely Fair Scheduler present in Linux does not have such scheduling capabilities and cannot guarantee that time-critical processes will execute on time. One way to solve this problem is to implement an Adaptive Partition Scheduler similar to the one present in the QNX Neutrino operating system. This type of scheduling divides the CPU into multiple adaptive partitions, where each partition holds a percentage of CPU usage called a budget; this allows optimal usage of CPU resources and also provides protection against cyber attacks such as Denial of Service. The approach will also benefit systems where functional safety is in high demand, such as instrument clusters in the automotive industry. The purpose of this paper is to present a concept for an Adaptive Partition Scheduler designed for Linux-based operating systems.
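
As an illustration of the budget idea described above, the following sketch selects which partition may run next based on how much of its per-window CPU budget it has consumed. All names, numbers and the windowing policy are hypothetical simplifications for illustration, not the authors' implementation:

```python
# Minimal illustration of budget-based adaptive partition selection.
# The class names, the 100 ms window and the tie-breaking rule are
# illustrative assumptions; a real scheduler (e.g. QNX APS) accounts
# for budgets over averaged sliding windows inside the kernel.

class Partition:
    def __init__(self, name, budget_pct):
        self.name = name              # e.g. "critical"
        self.budget_pct = budget_pct  # guaranteed share of CPU per window
        self.used_ms = 0              # CPU time consumed in current window

WINDOW_MS = 100  # accounting window length

def pick_partition(partitions, runnable):
    """Choose the runnable partition that is furthest under its budget."""
    candidates = [p for p in partitions if p.name in runnable]
    if not candidates:
        return None
    under = [p for p in candidates
             if p.used_ms < p.budget_pct / 100 * WINDOW_MS]
    # Partitions still under budget win; otherwise free CPU time goes
    # to whichever runnable partition has used the least of its share.
    pool = under or candidates
    return min(pool, key=lambda p: p.used_ms * 100 / p.budget_pct)

parts = [Partition("critical", 70), Partition("best-effort", 30)]
parts[0].used_ms = 65   # critical has nearly exhausted its 70 ms share
parts[1].used_ms = 10
print(pick_partition(parts, {"critical", "best-effort"}).name)
```

Note that when only the critical partition is runnable, it receives the free CPU time even though it is close to its budget, which is the "optimal usage of CPU resources" property the abstract mentions.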

Keywords: adaptive partitions, Linux kernel modules, real-time systems, scheduling

Procedia PDF Downloads 71
14364 Legal Warranty in Real Estate Registry in Albania

Authors: Elona Saliaj

Abstract:

The registration of real estate in Albania after the 1990s has been a long and costly process for the country. The transition of the registration system from a centralized one to a free-market private system was accompanied by legal uncertainties that have led to economic instability. The reforms undertaken in the area of property rights have been numerous and continuous over the years. Despite these reforms, however, the real estate registration system has failed to meet the standards required by the European Union. The completion of initial registration of real estate, the legal treatment of previous owners and the legalization of illegal constructions remain among the main problems preventing the development of the country's economy. The performance of the real estate registration system, and the issues that have appeared before the civil section of the Albanian Court of First Instance, form the core of this analysis. This paper presents a detailed analysis of the registration system chosen for real estate in our country. It also identifies the institution that administers these properties, its management techniques and the law that determines its functioning. Devising a strategy for creating a modern and functional registration system remains a challenge for the country. Identifying practical problems and providing solutions to them are also in the focus of this work, in order to improve and modernize this important system in a state that aims to become a member of the European Union.

Keywords: real estates registration system, comparative aspects, cadastral area, property certificate, legal reform

Procedia PDF Downloads 465
14363 Real Time Detection, Prediction and Reconstitution of Rain Drops

Authors: R. Burahee, B. Chassinat, T. de Laclos, A. Dépée, A. Sastim

Abstract:

The purpose of this paper is to propose a solution to detect, predict and reconstitute rain drops in real time, at night, using embedded hardware with an infrared camera. To keep the required hardware resources low, simple models are used in an efficient image-processing algorithm that considerably reduces computation time in OpenCV. Drops are matched across two consecutive frames, implementing a tracking system; the computed trajectory of each drop then provides the information needed to predict its future location, which in turn reduces the processing workload. The hardware system, built around a Raspberry Pi, is optimized to host this code efficiently for real-time execution.
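
The two-frame matching step described above can be sketched as a nearest-neighbour association between drop centroids, followed by linear extrapolation for prediction. The detection step (which the paper performs with OpenCV on infrared images) is omitted here, and the distance threshold is an illustrative assumption:

```python
import math

def match_drops(prev, curr, max_dist=15.0):
    """Greedily match drop centroids between two consecutive frames.

    prev, curr: lists of (x, y) centroids, as would come out of a
    blob/contour detection step (omitted).  Returns (prev_index,
    curr_index) pairs for drops closer than max_dist pixels.
    """
    pairs, taken = [], set()
    for i, p in enumerate(prev):
        best, best_d = None, max_dist
        for j, c in enumerate(curr):
            if j in taken:
                continue
            d = math.dist(p, c)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            taken.add(best)
            pairs.append((i, best))
    return pairs

def predict(p_prev, p_curr):
    """Linear extrapolation of a matched drop's next position."""
    return (2 * p_curr[0] - p_prev[0], 2 * p_curr[1] - p_prev[1])

frame1 = [(10, 10), (50, 80)]
frame2 = [(12, 18), (51, 90)]
print(match_drops(frame1, frame2))   # each drop matched to its neighbour
print(predict(frame1[0], frame2[0]))
```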

Keywords: reconstitution, prediction, detection, rain drop, real time, raspberry, infrared

Procedia PDF Downloads 383
14362 Predicting Potential Protein Therapeutic Candidates from the Gut Microbiome

Authors: Prasanna Ramachandran, Kareem Graham, Helena Kiefel, Sunit Jain, Todd DeSantis

Abstract:

Microbes that reside inside the mammalian GI tract, commonly referred to as the gut microbiome, have been shown to have therapeutic effects in animal models of disease. We hypothesize that specific proteins produced by these microbes are responsible for this activity and may be used directly as therapeutics. To speed up the discovery of these key proteins from large metagenomic datasets, we have applied machine learning techniques. Using amino acid sequences of known epitopes and their corresponding binding partners, protein interaction descriptors (PIDs) were calculated, forming a positive interaction set. A negative interaction dataset was built from sequences of proteins known not to interact with the same binding partners. A Random Forest model was trained on the positive and negative PIDs and used to predict interacting versus non-interacting proteins. Furthermore, a continuous variable, the cosine similarity between interaction descriptors, was used to rank bacterial therapeutic candidates. Laboratory binding assays were conducted to test the candidates for their potential as therapeutics. Results from the binding assays reveal the accuracy of the machine learning predictions and are subsequently used to further improve the model.
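
The cosine-similarity ranking step can be illustrated as follows; the three-dimensional descriptors and protein names are toy assumptions, since real protein interaction descriptors are much longer vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def rank_candidates(candidates, reference):
    """Rank candidate proteins by descriptor similarity to a known
    positive interaction profile (highest similarity first)."""
    return sorted(candidates,
                  key=lambda item: cosine(item[1], reference),
                  reverse=True)

# Toy 3-dimensional descriptors; real PIDs are far longer.
known_binder = [1.0, 0.2, 0.0]
cands = [("protA", [0.9, 0.3, 0.1]),
         ("protB", [0.0, 1.0, 0.9]),
         ("protC", [1.0, 0.1, 0.0])]
print([name for name, _ in rank_candidates(cands, known_binder)])
```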

Keywords: protein-interactions, machine-learning, metagenomics, microbiome

Procedia PDF Downloads 340
14361 Mixed Convective Heat Transfer in Water-Based Al2O3 Nanofluid in Horizontal Rectangular Duct

Authors: Nur Irmawati, H. A. Mohammed

Abstract:

In the present study, mixed convection of a water-based Al2O3 nanofluid in a horizontal rectangular duct is numerically investigated. The effects of different Rayleigh numbers, Reynolds numbers and radiation on the flow and heat transfer characteristics were studied in detail. The study covers Rayleigh numbers in the range 2×10⁶ ≤ Ra ≤ 2×10⁷ and Reynolds numbers in the range 100 ≤ Re ≤ 1100. Results reveal that the Nusselt number increases as the Reynolds and Rayleigh numbers increase. It was also found that the dimensionless temperature distribution increases as the Rayleigh number increases.

Keywords: numerical simulation, mixed convection, horizontal rectangular duct, nanofluids

Procedia PDF Downloads 342
14360 Design of a Real Time Closed Loop Simulation Test Bed on a General Purpose Operating System: Practical Approaches

Authors: Pratibha Srivastava, Chithra V. J., Sudhakar S., Nitin K. D.

Abstract:

A closed-loop system comprises a controller, a response system and an actuating system. The controller, which is the system under test here, excites the actuators periodically based on feedback from the sensors. The sensors should provide feedback to the System Under Test (SUT) within a deterministic time after excitation of the actuators. Any delay or miss in the generation of responses or in the acquisition of excitation pulses may lead to computation errors in the control loop, which can be catastrophic in certain cases. Such systems are categorised as hard real-time systems and need special strategies. The real-time operating systems available on the market may be the best solution for such simulations, but they come with limitations, such as restricted availability of the X Window System, graphical interfaces and other user tools. In this paper, we present strategies that can be used on a general-purpose operating system (a bare Linux kernel) to achieve deterministic deadlines, and hence retain the advantages of a GPOS together with real-time features. We discuss techniques for running the time-critical application at the highest priority without interruption, reducing network latency in a distributed architecture, and handling real-time data acquisition, data storage and retrieval, user interaction, etc.
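
One of the techniques mentioned, running the time-critical application at the highest priority, can be sketched on Linux via the POSIX SCHED_FIFO policy. This is a hedged Python illustration (a production test bed would more likely do this in C, together with memory locking); it degrades gracefully when the process lacks the required privileges:

```python
import os

def try_realtime_priority(prio=80):
    """Attempt to switch the calling process to the SCHED_FIFO
    real-time scheduling policy at the given priority.

    Returns True on success, False when the OS refuses: typically
    when not running as root / without CAP_SYS_NICE, or on a
    platform without POSIX real-time scheduling support.
    """
    try:
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(prio))
        return True
    except (PermissionError, AttributeError, OSError):
        return False

ok = try_realtime_priority()
print("real-time priority acquired:", ok)
```

Run as an unprivileged user this prints False; under root on Linux the process becomes a FIFO real-time task that preempts all normal (SCHED_OTHER) processes.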

Keywords: real time data acquisition, real time kernel preemption, scheduling, network latency

Procedia PDF Downloads 108
14359 Virtualization and Visualization Based Driver Configuration in Operating System

Authors: Pavan Shah

Abstract:

In embedded systems, virtualization and visualization technologies can provide an effective and measurable workflow in a software development environment. Virtualization in particular enables resource sharing between real-time applications and the host environment, whether in a software development environment (SDE) or in the runtime environment of a real-time operating system (RTOS). A notable cost of virtualization, however, is I/O overhead. In this paper, we focus on the network I/O overhead generated by virtualization and on an implementation of the standardized paravirtual I/O interface (virtio) as a front-end network driver in an RTOS. Several studies of virtio exist for virtualized general-purpose operating systems; our implementation ports the open-source virtio driver to a real-time operating system. The measurement results show that this implementation improves the bandwidth and latency of networking in the real-time environment.

Keywords: virtualization, visualization, network driver, operating system

Procedia PDF Downloads 106
14358 Real-Time Control of Grid-Connected Inverter Based on labVIEW

Authors: L. Benbaouche, H. E., F. Krim

Abstract:

In this paper we propose a flexible and efficient real-time control scheme for a grid-connected single-phase inverter. The first step is devoted to the study and design of the controller through simulation, conducted with the LabVIEW software on the 'host' computer. The second step is running the application on the PXI 'target'. LabVIEW, combined with NI-DAQmx, provides the tools to easily build applications that use the digital-to-analog converter to generate the PWM control signals. Experimental results show the effectiveness of LabVIEW applied to power electronics.
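
The PWM generation mentioned above can be illustrated, independently of LabVIEW, by the classic sine-triangle comparison used for single-phase inverters. All frequencies and the modulation index below are illustrative assumptions, not values from the paper:

```python
import math

def spwm_samples(f_ref=50.0, f_carrier=2000.0, m=0.8, fs=20000.0, n=40):
    """Naive sinusoidal PWM: compare a sine reference against a
    triangular carrier and emit the gate signal (1 = switch on).

    f_ref: fundamental (grid) frequency, f_carrier: switching
    frequency, m: modulation index, fs: sample rate, n: sample count.
    """
    out = []
    for k in range(n):
        t = k / fs
        ref = m * math.sin(2 * math.pi * f_ref * t)
        # Triangular carrier in [-1, 1]
        phase = (t * f_carrier) % 1.0
        carrier = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        out.append(1 if ref >= carrier else 0)
    return out

print(spwm_samples())
```

The duty cycle of the resulting pulse train tracks the sine reference, so a low-pass output filter recovers a sinusoidal voltage.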

Keywords: real-time control, labview, inverter, PWM

Procedia PDF Downloads 475
14357 Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task, since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains, which depends heavily on the spatial arrangement of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck over long input sequences. In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R² accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
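
The attention mechanism that the abstract credits with capturing long-range context can be sketched as scaled dot-product attention. This toy version works on plain Python lists with a two-token sequence; it is a generic single-head illustration, not the DNABERT implementation:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention for a single head.

    Q, K, V: lists of d-dimensional token vectors.  Each query
    attends to all keys at once, so distant tokens contribute
    directly to the output instead of through a sequential state.
    """
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

# Two-token toy sequence: the first query attends mostly to the
# first key, so its output stays closest to V[0].
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))
```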

Keywords: transformers, generative ai, gene expression design, classification

Procedia PDF Downloads 30
14356 An Analysis of Non-Elliptic Curve Based Primality Tests

Authors: William Wong, Zakaria Alomari, Hon Ching Lai, Zhida Li

Abstract:

Modern-day information security depends on implementing protocols such as Diffie-Hellman, which require the generation of prime numbers. Because the number of primes is infinite, it is impractical to store prime numbers for later use, and therefore primality tests are indispensable in modern information security. A primality test determines whether a number is prime or composite. There are two types of primality tests: deterministic tests and probabilistic tests. Deterministic tests use algorithms that give a definite answer as to whether a given number is prime or composite, while probabilistic tests give a probabilistic result that carries a degree of uncertainty. In this paper, we review three probabilistic tests, the Fermat primality test, the Miller-Rabin test and the Baillie-PSW test, as well as one deterministic test, the Agrawal-Kayal-Saxena (AKS) test, and analyze them. None of the reviewed tests is based on elliptic curves. The analysis demonstrates that, in the majority of real-world scenarios, the Baillie-PSW test is favoured because of its typical operational complexity of O(log³ n) and its capacity to deliver accurate results for numbers below 2⁶⁴.
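
As an illustration of the probabilistic tests reviewed, a compact Miller-Rabin implementation is sketched below. With the fixed bases shown it is known to be deterministic for all n below roughly 3.3×10²⁴; with randomly chosen bases it is the classic probabilistic test:

```python
def miller_rabin(n, bases=(2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)):
    """Miller-Rabin primality test.

    With the first 12 primes as fixed bases the result is known to be
    deterministic for n < 3.3×10^24; random bases give the usual
    probabilistic variant with error probability at most 4^-k.
    """
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is composite
    return True

print([n for n in range(2, 30) if miller_rabin(n)])
print(miller_rabin(2**61 - 1))  # a Mersenne prime
```

Note how the test correctly rejects Carmichael numbers such as 561, which fool the plain Fermat test for every coprime base.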

Keywords: primality tests, Fermat’s primality test, Miller-Rabin primality test, Baillie-PSW primality test, AKS primality test

Procedia PDF Downloads 55
14355 Timetabling for Interconnected LRT Lines: A Package Solution Based on a Real-world Case

Authors: Huazhen Lin, Ruihua Xu, Zhibin Jiang

Abstract:

In this real-world case, timetabling the LRT network as a whole is rather challenging for the operator: planners are supposed to create a timetable that manually avoids various route conflicts while satisfying a given interval and number of rolling stock, but the outcome has not been satisfying. The operator therefore adopted a computerised timetabling tool, the Train Plan Maker (TPM), to cope with this problem. However, with the various constraints in the dual-line network, it is still difficult to find an adequate combination of turnback time, interval and number of rolling stock, which requires extra manual intervention. Aiming at these problems, a one-off timetabling model is presented in this paper to simplify the procedure. Before the timetabling procedure starts, the paper shows how the dual-line system, a ring with several branches, is reduced to a simpler structure. A non-linear programming model is then presented in two stages. In the first stage, the model sets a series of constraints to calculate a proper timing for coordinating the two lines by adjusting the turnback times at the termini. Then, based on the result of the first stage, the model introduces a series of inequality constraints to avoid route conflicts. With this model, an analysis is conducted to reveal the relation between the ratio of trains in different directions and the minimum possible interval, observing that the more imbalanced the ratio, the harder it becomes to provide frequent service under such strict constraints.

Keywords: light rail transit (LRT), non-linear programming, railway timetabling, timetable coordination

Procedia PDF Downloads 37
14354 Efficiency of DMUs in Presence of New Inputs and Outputs in DEA

Authors: Esmat Noroozi, Elahe Sarfi, Farha Hosseinzadeh Lotfi

Abstract:

Examining the impact of data modification is known as sensitivity analysis. Many studies have considered modifications of the input and output data in DEA. An issue that has not heretofore been considered in DEA sensitivity analysis is modification of the number of inputs and/or outputs and its impact on the efficiency status of DMUs. This paper presents systems that show the impact of adding one or more inputs or outputs on the efficiency status of DMUs. Furthermore, a model is presented for recognizing the minimum number of inputs and/or outputs, from among specified inputs and outputs, that can be added so that an inefficient DMU becomes efficient. Finally, the presented systems and model are applied to a set of real data and the results are reported.

Keywords: data envelopment analysis, efficiency, sensitivity analysis, input, output

Procedia PDF Downloads 419
14353 Determining Optimal Number of Trees in Random Forests

Authors: Songul Cinaroglu

Abstract:

Background: Random Forest is an efficient multi-class machine learning method used for classification, regression and other tasks. The method operates by constructing each tree from a different bootstrap sample of the data. Determining the number of trees in a random forest is an open question in the literature on improving the classification performance of random forests. Aim: The aim of this study is to analyze whether there is an optimal number of trees in a Random Forest, and how its performance changes as the number of trees increases, using sample health datasets in R. Method: We analyzed the performance of Random Forests as the number of trees grows, doubling the number of trees at every iteration, using the 'randomForest' package in R. To determine the minimum and optimal numbers of trees we performed McNemar's test and computed the Area Under the ROC Curve, respectively. Results: The analysis found that a growing number of trees does not always mean that the forest performs better than forests with fewer trees. In other words, a larger number of trees may only increase computational cost without improving performance. Conclusion: Although the general practice with random forests is to generate a large number of trees to obtain high performance, this study shows that increasing the number of trees does not always improve performance. Future studies can compare different kinds of datasets and different performance measures to test whether Random Forest performance changes as the number of trees increases.

Keywords: classification methods, decision trees, number of trees, random forest

Procedia PDF Downloads 373
14352 Activity-Based Safety Assessment of Real Estate Projects in Western India

Authors: Patel Parul, Harsh Ganvit

Abstract:

The construction industry is the second largest provider of employment in India after agriculture. In developing countries like India, many construction projects are being launched to meet demand. While the number of construction projects is increasing, construction companies are still struggling with many problems, one of the major ones being to ensure safe working conditions on site. Due to a lack of safety awareness and ignorance of safety aspects, fatal accidents are common on construction sites in India. One of the key success factors for construction projects is achieving accident-free construction. Construction projects can be divided into various categories, such as infrastructure projects, industrial construction and real estate construction; real estate projects mainly comprise commercial and residential developments. In the construction industry, the private sector plays a huge role in urban and rural development and contributes significantly to the growth of the nation. Infrastructure and industrial projects are mainly executed by well-qualified contractors. For such projects, ensuring safety on site is inevitable and usually one of the major clauses of the contract documents, and these projects are monitored regularly by national agencies and researchers. Real estate projects, however, are rarely monitored for safety: no systematic contract system is followed for them, and safety is their most neglected aspect. In the current research, an attempt is made to carry out safety audits of about 75 real estate projects. The objective of this work is to collect an activity-based safety survey of real estate projects in western India; the analysis of activity-based safety implementation for these projects is discussed in the present work. The activities are divided into three categories based on the data collected. The findings of this work will help local monitoring authorities to implement safety management plans for real estate projects.

Keywords: construction safety, safety assessment, activity-based safety, real estate projects

Procedia PDF Downloads 23
14351 Source-Detector Trajectory Optimization for Target-Based C-Arm Cone Beam Computed Tomography

Authors: S. Hatamikia, A. Biguri, H. Furtado, G. Kronreif, J. Kettenbach, W. Birkfellner

Abstract:

Nowadays, three-dimensional Cone Beam CT (CBCT) has become a widespread clinical routine imaging modality for interventional radiology. In conventional CBCT, a circular source-detector trajectory is used to acquire a high number of 2D projections in order to reconstruct a 3D volume. However, the accumulated radiation dose due to the repetitive use of CBCT needed during interventional procedures, as well as for daily pretreatment patient alignment in radiotherapy, has become a concern. It is of great importance for both health care providers and patients to decrease the radiation dose required for these interventional images. It is therefore desirable to find optimized source-detector trajectories with a reduced number of projections, which could lead to dose reduction. In this study we investigate source-detector trajectories with optimal arbitrary orientations that maximize the quality of the reconstructed image at particular regions of interest. To this end, we developed a box phantom containing several small polytetrafluoroethylene target spheres at regular distances throughout the phantom. Each of these spheres serves as a target inside a particular region of interest. We use the 3D Point Spread Function (PSF) as a measure to evaluate the performance of the reconstructed image, measuring the spatial variance in terms of the Full-Width-Half-Maximum (FWHM) of the local PSF of each target. A lower FWHM value indicates better spatial resolution of the reconstruction at the target area. One important feature of interventional radiology is that the imaging targets are very well known, since prior knowledge of the patient anatomy (e.g. a preoperative CT) is usually available for interventional imaging. Therefore, we use a CT scan of the box phantom as the prior knowledge and treat it as a digital phantom in our simulations to find the optimal trajectory for a specific target. Based on the simulation phase, the optimal trajectory can then be applied to the device in a real situation. We used the geometry of a Philips Allura FD20 Xper C-arm for both the simulations and real data acquisition. Our experimental results on both simulated and real data show that the proposed optimization scheme has the capacity to find optimized trajectories that localize the targets with a minimal number of projections. The proposed optimized trajectories localize the targets as well as a standard circular trajectory while using only one third of the projections. In conclusion, we demonstrate that a minimal dedicated set of projections with optimized orientations is sufficient to localize targets and may minimize the radiation dose.
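
The FWHM measurement used to score each target's local PSF can be sketched as follows: find the half-maximum level of a sampled profile and locate the two crossings by linear interpolation. The 1D Gaussian profile below is a stand-in for a real measured PSF, used here only to check the estimator against the analytic value 2·sqrt(2·ln 2)·σ:

```python
import math

def fwhm(xs, ys):
    """Full-Width-Half-Maximum of a sampled single-peaked profile.

    Assumes the profile falls below half its maximum on both sides of
    the peak; the crossings are located by linear interpolation
    between the two bracketing samples.
    """
    ymax = max(ys)
    half = ymax / 2.0
    i_peak = ys.index(ymax)

    def crossing(start, step):
        i = start
        while ys[i + step] >= half:
            i += step
        x0, x1 = xs[i], xs[i + step]
        y0, y1 = ys[i], ys[i + step]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    return crossing(i_peak, 1) - crossing(i_peak, -1)

# Sanity check on a unit-sigma Gaussian: FWHM should be ~2.3548.
sigma = 1.0
xs = [i * 0.01 - 5 for i in range(1001)]
ys = [math.exp(-x * x / (2 * sigma ** 2)) for x in xs]
print(round(fwhm(xs, ys), 3))  # ≈ 2.355
```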

Keywords: CBCT, C-arm, reconstruction, trajectory optimization

Procedia PDF Downloads 112
14350 Human Papillomavirus Type 16 E4 Gene Variation as Risk Factor for Cervical Cancer

Authors: Yudi Zhao, Ziyun Zhou, Yueting Yao, Shuying Dai, Zhiling Yan, Longyu Yang, Chuanyin Li, Li Shi, Yufeng Yao

Abstract:

The HPV16 E4 gene plays an important role in viral genome amplification and release; variation in the E4 nucleic acid sequence may therefore affect the carcinogenicity of HPV16. To understand the relationship between variation in the HPV16 E4 gene and cervical cancer, in this study we amplified and sequenced the E4 genes of 118 HPV16-positive cervical cancer patients and 151 HPV16-positive asymptomatic individuals. After obtaining the E4 gene sequences, phylogenetic trees were constructed by the neighbor-joining method for variation analysis. The results showed that: 1) the distribution of HPV16 variants differed significantly between the case group and the control group (P = 0.015), and the Asian-American (AA) variant was likely related to the occurrence of cervical cancer; 2) DNA sequence analysis showed significant differences in the distribution of 8 variants between the case group and the control group (P < 0.05); and 3) in the European (EUR) variant, two variations, C3384T (L18L) and A3449G (P39P), were associated with the initiation and development of cervical cancer. These results suggest that variation in the HPV16 E4 gene may affect both the occurrence and the development of cervical cancer, and that different HPV16 variants may differ in carcinogenic capability.

Keywords: cervical cancer, HPV16, E4 gene, variations

Procedia PDF Downloads 142
14349 IT-Aided Business Process Enabling Real-Time Analysis of Candidates for Clinical Trials

Authors: Matthieu-P. Schapranow

Abstract:

Recruiting participants for clinical trials requires the screening of a large number of potential candidates, i.e. testing them against trial-specific inclusion and exclusion criteria, which is a time-consuming and complex task. Today, a significant amount of time is spent identifying adequate trial participants, as their selection may affect the overall study results. We introduce a unique patient eligibility metric, which allows systematic ranking and classification of candidates based on trial-specific filter criteria. Our web application enables real-time analysis of patient data and assessment of candidates using freely definable inclusion and exclusion criteria. As a result, the overall time required to identify eligible candidates is reduced tremendously, while our contribution introduces additional degrees of freedom for evaluating the relevance of individual candidates.
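
A toy version of such an eligibility metric might score each candidate by the fraction of inclusion criteria met, vetoed by any exclusion criterion. The criteria, field names and patient records below are invented for illustration; the paper's actual metric is not specified in this abstract:

```python
def eligibility(patient, inclusion, exclusion):
    """Toy eligibility metric: fraction of inclusion criteria met,
    zeroed out if any exclusion criterion fires.  Criteria are
    predicates over the patient record."""
    if any(rule(patient) for rule in exclusion):
        return 0.0
    if not inclusion:
        return 1.0
    met = sum(1 for rule in inclusion if rule(patient))
    return met / len(inclusion)

# Hypothetical trial criteria (e.g. an adult type-2 diabetes trial).
inclusion = [lambda p: p["age"] >= 18,
             lambda p: p["hba1c"] > 6.5]
exclusion = [lambda p: p["pregnant"]]

patients = [
    {"id": 1, "age": 45, "hba1c": 7.1, "pregnant": False},
    {"id": 2, "age": 17, "hba1c": 8.0, "pregnant": False},
    {"id": 3, "age": 30, "hba1c": 7.5, "pregnant": True},
]
ranked = sorted(patients,
                key=lambda p: eligibility(p, inclusion, exclusion),
                reverse=True)
print([(p["id"], eligibility(p, inclusion, exclusion)) for p in ranked])
```

A partial score (patient 2 above) is what allows ranking rather than a binary eligible/ineligible split.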

Keywords: in-memory technology, clinical trials, screening, eligibility metric, data analysis, clustering

Procedia PDF Downloads 464
14348 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task, since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains, which depends heavily on the spatial arrangement of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck over long input sequences. In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous works on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R² accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.

Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers

Procedia PDF Downloads 33
14347 Improved Multi–Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation

Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta

Abstract:

Recently, nature-inspired algorithms have found widespread use in tough and time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, where they minimize the adverse four-wave mixing (FWM) crosstalk effect. The simulation results conclude that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms for finding OGRs, in terms of ruler length, total optical channel bandwidth and computation time.
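A Golomb ruler is a set of integer marks whose pairwise differences are all distinct; in the WDM application, channels are placed at spacings proportional to the marks so that no two channel separations coincide, which suppresses FWM products landing on active channels. Validity is easy to check programmatically (an illustrative checker, not the paper's firefly algorithm):

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """True iff every pairwise difference between marks is distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

# The optimal 4-mark ruler has length 6:
print(is_golomb_ruler([0, 1, 4, 6]))   # True
print(is_golomb_ruler([0, 1, 2, 4]))   # False: the difference 1 repeats
```

An optimization algorithm such as the firefly variants described here searches the space of mark sets for rulers that pass this test with minimal length.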

Keywords: channel allocation, conventional computing, four-wave mixing, nature-inspired algorithm, optimal Golomb ruler, Lévy flight distribution, optimization, improved multi-objective firefly algorithms, Pareto optimal

Procedia PDF Downloads 287
14346 Risk and Impact of the COVID-19 Crisis on Real Estate

Authors: Tahmina Akhter

Abstract:

In the present work, we study the repercussions of the COVID-19 pandemic on the real estate market. This disease has affected almost all sectors of the economy across countries around the world, including real estate markets. This documentary research focuses on the years 2021 and 2022, the strongest period of the pandemic. We carried out the study taking into account repercussions throughout the world, which is why the data we analyze draw on information from all continents where possible, particularly the US, Europe and China, where the impact of COVID-19 has been of such proportions that it has fundamentally affected the housing market for middle-class housing. In addition, a risk to investment in this market has arisen, because companies in the sector have in certain cases incurred losses; in the Chinese case, Evergrande, one of the largest companies in the sector, fell into default.

Keywords: COVID-19, real estate market, statistics, pandemic

Procedia PDF Downloads 61
14345 Debris' Effect on Bearing Capacity of Defective Piles in Sand

Authors: A. M. Nasr, W. R. Azzam, K. E. Ebeed

Abstract:

For bored piles, careful cleaning must be used to reduce the amount of material trapped in the drilled hole; otherwise, the presence of debris might cause the soft-toe effect, which reduces the axial resistance. There is little comprehensive research on bored piles with debris. To investigate the behavior of a single pile, a pile composite foundation, a two-pile group, a three-pile group and a four-pile group, forty-eight numerical tests were conducted in which the debris is simulated using foam rubber. A pile diameter of 1 m and length of 10 m, with a spacing of 3D and a foundation depth of 1 m, were used in this study. It is found that the existence of debris reduces the bearing capacity by 64.58% and 33.23% for the single pile and the pile composite foundation, respectively. For the two-pile group, the reduction is 23.27% and 24.24% for ratios of defective piles to total piles of 1/2 and 1, respectively. For the three-pile group, it is 10.23%, 19.42% and 28.47% for ratios of 1/3, 2/3 and 1, respectively, and the reduction increases with the ratio of defective piles to total piles. For the four-pile group, it is 7.1%, 13.32%, 19.02% and 26.36% for ratios of 1/4, 2/4, 3/4 and 1, respectively, and the reduction decreases as the number of piles increases, due to the interaction effect.

Keywords: debris, foundation, defective piles, interaction, bored pile

Procedia PDF Downloads 60
14344 Rhizosphere Microbial Communities in Fynbos Endemic Legumes during Wet and Dry Seasons

Authors: Tiisetso Mpai, Sanjay K. Jaiswal, Felix D. Dakora

Abstract:

The South African Cape fynbos biome is a global biodiversity hotspot. This biome contains a diversity of endemic shrub legumes, including Polhillia, Wiborgia, and Wiborgiella species, which are important for ecotourism as well as for improving soil fertility status, due to their proven N₂-fixing abilities when in association with compatible soil bacteria. In fact, Polhillia, Wiborgia, and Wiborgiella species have been reported to derive over 61% of their needed nitrogen through biological nitrogen fixation and to exhibit acid and alkaline phosphatase activity in their rhizospheres. Thus, their interactions with soil microbes may explain their survival mechanisms under the continued summer droughts and acidic, nutrient-poor soils of this region. However, information regarding their rhizosphere microbiome is still unavailable, yet it is important for fynbos biodiversity management. Therefore, the aim of this study was to assess the microbial community structures associated with rhizosphere soils of Polhillia pallens, Polhillia brevicalyx, Wiborgia obcordata, Wiborgia sericea, and Wiborgiella sessilifolia growing at different locations in the South African Cape fynbos during the wet and dry seasons. The hypothesis is that the microbial communities in these legume rhizospheres are of the same type and are not affected by the growing season, due to the restricted habitat of these wild fynbos legumes. DNA was extracted from 0.5 g of each rhizosphere soil using the PowerSoil™ DNA Isolation Kit, and sequences were obtained using 16S rDNA MiSeq Illumina technology. The results showed that in both seasons, bacteria were the most abundant microbial taxa in the rhizosphere soils of all five legume species, with Actinobacteria showing the highest number of sequences (about 30%). However, over 19.91% of the inhabitants in all five legume rhizospheres were unclassified.
In terms of genera, Mycobacterium and Conexibacter were common in rhizosphere soils of all legumes in both seasons except for W. obcordata soils sampled during the dry season, which had Dehalogenimonas as the major inhabitant (6.08%). In conclusion, plant species and season were found to be the main drivers of microbial community structure in Cape fynbos, with the wet season being more dominant in shaping microbial diversity relative to the dry season. Wiborgia obcordata had a greater influence on microbial community structure than the other four legume species.

Keywords: 16S rDNA, Cape fynbos, endemic legumes, microbiome, rhizosphere

Procedia PDF Downloads 123
14343 An ALM Matrix Completion Algorithm for Recovering Weather Monitoring Data

Authors: Yuqing Chen, Ying Xu, Renfa Li

Abstract:

The development of matrix completion theory provides new approaches for data gathering in Wireless Sensor Networks (WSNs). Existing matrix completion algorithms for WSNs mainly consider how to reduce the number of samples, without considering real-time performance when recovering the data matrix. In order to guarantee recovery accuracy and reduce recovery time simultaneously, we propose a new ALM (augmented Lagrange multiplier) algorithm to recover weather monitoring data. Extensive experiments have been carried out to investigate the performance of the proposed ALM algorithm using different parameter settings, sampling rates and sampling models. In addition, we compare the proposed ALM algorithm with existing algorithms in the literature. Experimental results show that the ALM algorithm obtains better overall recovery accuracy with less computing time, which demonstrates that it is an effective and efficient approach for recovering real-world weather monitoring data in WSNs.
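The abstract does not spell out the ALM iteration itself. As a sketch of the underlying idea, the following recovers a partially observed low-rank matrix by alternating an SVD truncation with re-imposing the observed entries; this iterative-imputation scheme is a simpler relative of the ALM/singular-value-thresholding family, and it assumes the target rank is known:

```python
import numpy as np

def complete_matrix(M_obs, mask, rank, iters=100):
    """Fill unobserved entries of a low-rank matrix by alternating a
    rank-`rank` SVD truncation with re-imposing the observed entries."""
    X = M_obs * mask
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # best rank-r approximation
        X = np.where(mask, M_obs, low)               # keep observed, fill the rest
    return low

# Demo: recover a rank-1 20x20 "sensor grid" from ~60% of its entries
rng = np.random.default_rng(0)
A = np.outer(rng.standard_normal(20), rng.standard_normal(20))
mask = rng.random(A.shape) < 0.6
X = complete_matrix(A * mask, mask, rank=1)
print(np.linalg.norm(X - A) / np.linalg.norm(A))   # small relative error
```

A full ALM scheme would instead minimize the nuclear norm with a Lagrangian penalty on the observed-entry constraint, avoiding the need to fix the rank in advance.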

Keywords: wireless sensor network, matrix completion, singular value thresholding, augmented Lagrange multiplier

Procedia PDF Downloads 357
14342 Number Sense Proficiency and Problem Solving Performance of Grade Seven Students

Authors: Laissa Mae Francisco, John Rolex Ingreso, Anna Krizel Menguito, Criselda Robrigado, Rej Maegan Tuazon

Abstract:

This study aims to determine and describe the relationship between the number sense proficiency and problem-solving performance of grade seven students from Victorino Mapa High School, Manila. A paper-and-pencil exam consisting of a 50-item number sense test and a 5-item problem-solving test, adapted from McIntosh, Reys, and Bana, was used as the research instrument to measure number sense proficiency and problem-solving performance. The data obtained from this study were interpreted and analyzed using the Pearson product-moment correlation coefficient to determine the relationship between the two variables. It was found that students low in number sense proficiency tended to be the students with poor problem-solving performance, and students with medium number sense proficiency were most likely to have average problem-solving performance. Likewise, students with high number sense proficiency were those who performed excellently in problem solving.
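The Pearson product-moment coefficient used here is straightforward to compute; a sketch with hypothetical score pairs (the study's actual scores are not reproduced in the abstract):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Hypothetical pairs: number sense score (x) vs problem-solving score (y)
x = [12, 25, 33, 41, 48]
y = [1, 2, 3, 4, 5]
print(round(pearson_r(x, y), 3))  # 0.992 — a strong positive correlation
```

A coefficient near +1, as in this toy example, is the pattern the study reports: higher number sense proficiency goes with better problem-solving performance.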

Keywords: number sense, performance, problem solving, proficiency

Procedia PDF Downloads 397
14341 Environmental Sustainability and Energy Consumption: The Role of Financial Development in OPEC-1 Countries

Authors: Isah Wada

Abstract:

The current research investigates the role of financial development in the environmental sustainability-energy consumption nexus for OPEC-1 member countries. The empirical findings suggest that financial development increases environmental sustainability, but energy consumption and real output expansion generally diminish it. Thus, whilst real output and financial development accelerate energy consumption, diminished environmental sustainability quality undermines clean energy initiatives. Even more so, energy consumption and financial development stimulate real output growth. The results empirically demonstrate that policy advocates must address broader issues relating to financial development whilst seeking to achieve environmental sustainability, due largely to energy consumption.

Keywords: energy consumption, environmental sustainability, financial development, OPEC, real output

Procedia PDF Downloads 150
14340 Detailed Depositional Resolutions in Upper Miocene Sands of HT-3X Well, Nam Con Son Basin, Vietnam

Authors: Vo Thi Hai Quan

Abstract:

Nam Con Son sedimentary basin is one of the most important oil and gas basins in offshore Vietnam. The Hai Thach field of block 05-2 contains mostly gas accumulations in a fine-grained, sand/mud-rich turbidite system, which was deposited in a turbidite channel and fan environment. The major Upper Miocene reservoir of HT-3X lies above a well-developed unconformity. The main objectives of this study are to reconstruct the depositional environment and to assess reservoir quality using data from 14 meters of core samples and digital wireline data of well HT-3X. The wireline log and core data showed that the vertical facies sequences of the well mainly range from the Tb to Te divisions of the Bouma sequence, with Tb and Tc predominating over Td and Te. Sediments in this well were deposited in a submarine fan association as very fine- to fine-grained, homogeneous sandstones with high porosity and permeability, by high-density turbidity currents with a long transport route from the sediment source to the basin, indicating good reservoir quality. The sediments mainly comprise the following sedimentary structures: massive and laminated sandstones, convoluted bedding, laminated ripples, cross-laminated ripples, deformed sandstones, and contorted bedding.

Keywords: Hai Thach field, Miocene sand, turbidite, wireline data

Procedia PDF Downloads 274
14339 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification

Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos

Abstract:

Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancers in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist's visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, so histopathological examination of biopsy samples is currently considered the gold standard for obtaining a definite diagnosis. Machine learning is defined as the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Accordingly, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are, or could potentially be, useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. Moreover, a future purpose is to present an alternative way of achieving a quick and accurate diagnosis, in order to save time and resources in the daily medical workflow.
Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas and 20 healthy children. The MR sequences used for every patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used, which was manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed in a number of steps, including noise reduction, bias-field correction, thresholding, coregistration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of candidate features was chosen, including age, tumor shape characteristics, image intensity characteristics and texture features. After selecting the features that achieve the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA and MatLab platforms, respectively. Results-Conclusions: The classification results and accuracy for each type of glioma under the four different algorithms are still being processed.
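Of the four classifiers, k-Nearest Neighbour is the simplest to sketch: a new case is assigned the majority label of its k closest training cases in feature space. The 2-D features and labels below are illustrative stand-ins, not the study's actual feature set:

```python
from collections import Counter
from math import dist

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    vectors under Euclidean distance."""
    ranked = sorted(range(len(train)), key=lambda i: dist(train[i], query))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D features (e.g. mean intensity, texture contrast) per case
train = [(0.2, 0.1), (0.25, 0.15), (0.8, 0.9), (0.85, 0.8), (0.5, 0.5)]
labels = ["ependymoma", "ependymoma", "astrocytoma",
          "astrocytoma", "medulloblastoma"]
print(knn_predict(train, labels, (0.22, 0.12), k=3))
```

In the study, the same idea operates on the selected shape, intensity and texture features rather than on a hand-picked pair of values.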

Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology

Procedia PDF Downloads 124
14338 New Concept for Real Time Selective Harmonics Elimination Based on Lagrange Interpolation Polynomials

Authors: B. Makhlouf, O. Bouchhida, M. Nibouche, K. Laidi

Abstract:

A variety of methods for selective harmonics elimination pulse width modulation have been developed, the most frequently used for real-time implementation being based on look-up tables. To address real-time requirements, an approach based on a modified carrier signal is proposed in the present work, with a general formulation for real-time harmonics control/elimination in switched inverters. Firstly, the proposed method has been demonstrated for a single value of the modulation index. In reality, however, this parameter varies as a consequence of voltage (amplitude) variability. In this context, a simple interpolation method for calculating the modified sine carrier signal is proposed. The method allows a continuous adjustment in both amplitude and frequency of the fundamental. To assess the performance of the proposed method, software simulations and hardware experiments have been carried out for a single-phase inverter. The obtained results are very satisfactory.
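Lagrange interpolation evaluates the unique polynomial passing through a set of tabulated points; in this setting, it would allow carrier data precomputed at a few modulation indices to be interpolated at run time for intermediate index values. A sketch with hypothetical tabulated values (the table entries below are illustrative, not the paper's data):

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolation polynomial through the
    points (xs[i], ys[i]) at abscissa x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)   # i-th Lagrange basis factor
        total += term
    return total

# Hypothetical table: switching angle (degrees) at three modulation indices
m = [0.8, 0.9, 1.0]
alpha = [23.6, 20.1, 16.4]
print(lagrange_interp(m, alpha, 0.85))   # 21.875
```

Because the polynomial reproduces the tabulated values exactly at the nodes, a coarse off-line table suffices, and the on-line cost per sample is only a handful of multiplications.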

Keywords: harmonic elimination, Particle Swarm Optimisation (PSO), polynomial interpolation, pulse width modulation, real-time harmonics control, voltage inverter

Procedia PDF Downloads 476
14337 Entropy Production in Mixed Convection in a Horizontal Porous Channel Using Darcy-Brinkman Formulation

Authors: Amel Tayari, Atef Eljerry, Mourad Magherbi

Abstract:

The paper reports a numerical investigation of entropy generation due to mixed convection in laminar flow through a channel filled with porous media. The second law of thermodynamics is applied to investigate the entropy generation rate. The Darcy-Brinkman model is employed. The entropy generation due to heat transfer and friction dissipation has been determined in mixed convection by solving numerically the continuity, momentum and energy equations, using a control-volume finite element method. The effects of the Darcy number, the modified Brinkman number and the Rayleigh number on averaged entropy generation and averaged Nusselt number are investigated. The Rayleigh number is varied over 10³ ≤ Ra ≤ 10⁵ and the modified Brinkman number over 10⁻⁵ ≤ Br ≤ 10⁻¹, with porosity and Reynolds number fixed at 0.5 and 10, respectively. The Darcy number is varied over 10⁻⁶ ≤ Da ≤ 10.
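The local entropy generation being averaged here is conventionally split into a thermal and two frictional contributions; a commonly used form for a Darcy-Brinkman medium (shown as a sketch, with T₀ the reference temperature, K the permeability, μ_eff the effective viscosity, and Φ the viscous dissipation function; the paper's exact nondimensionalization may differ) is:

```latex
S_{\mathrm{gen}}''' \;=\;
\underbrace{\frac{k}{T_0^{2}}\,\lvert\nabla T\rvert^{2}}_{\text{heat transfer}}
\;+\;
\underbrace{\frac{\mu_{\mathrm{eff}}}{T_0}\,\Phi}_{\text{Brinkman friction}}
\;+\;
\underbrace{\frac{\mu}{K\,T_0}\,\lvert \mathbf{V}\rvert^{2}}_{\text{Darcy friction}}
```

The Brinkman number weighting the frictional terms against the thermal term is what the study sweeps over 10⁻⁵ to 10⁻¹.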

Keywords: entropy generation, porous media, heat transfer, mixed convection, numerical methods, Darcy, Brinkman

Procedia PDF Downloads 373
14336 A Unified Deep Framework for Joint 3d Pose Estimation and Action Recognition from a Single Color Camera

Authors: Huy Hieu Pham, Houssam Salmane, Louahdi Khoudour, Alain Crouzil, Pablo Zegers, Sergio Velastin

Abstract:

We present a deep learning-based multitask framework for joint 3D human pose estimation and action recognition from color video sequences. Our approach proceeds in two stages. In the first, we run a real-time 2D pose detector to determine the precise pixel locations of important keypoints of the body. A two-stream neural network is then designed and trained to map the detected 2D keypoints into 3D poses. In the second, we deploy the Efficient Neural Architecture Search (ENAS) algorithm to find an optimal network architecture that is used for modeling the spatio-temporal evolution of the estimated 3D poses via an image-based intermediate representation and performing action recognition. Experiments on the Human3.6M, Microsoft Research Redmond (MSR) Action3D, and Stony Brook University (SBU) Kinect Interaction datasets verify the effectiveness of the proposed method on the targeted tasks. Moreover, we show that our method requires a low computational budget for training and inference.

Keywords: human action recognition, pose estimation, D-CNN, deep learning

Procedia PDF Downloads 115