Search results for: Minimum data set
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25504


25384 Evaluation of Social Studies Curriculum Implementation of Bachelor of Education Degree in Colleges of Education in Southwestern Nigeria

Authors: F. A. Adesoji, A. A. Ayandele

Abstract:

There has been concern that educational programmes in Nigeria’s higher institutions do not adequately respond to social needs. The study, therefore, investigated the effectiveness of the basic elements of the Social Studies curriculum and the contributions of Teacher-Related Variables (TRV) such as qualification, area of specialization, teaching experience, teaching methods, gender, and teaching facilities to the implementation of the curriculum (IOC) in the Colleges of Education (COEs). The study adopted the descriptive survey design. Four COEs in Oyo, Osun, Ondo and Lagos States were purposively selected. A stratified sampling technique was used to select 455 Social Studies students and 47 Social Studies lecturers. The Stakeholders’ Perception of Social Studies Curriculum scale (r = 0.86), Social Studies Curriculum Resources scale (r = 0.78) and Social Studies Basic Concepts Test (r = 0.78) were used for data collection. Data were analysed using descriptive statistics, multiple regression, and t-test at the 0.05 level of significance. COE teachers and students rated the elements of the curriculum as effective, with mean scores of x̄ = 3.02 and x̄ = 2.80 respectively (x̄ = 5.00 and x̄ = 2.50 being the maximum and minimum mean scores). The findings showed average levels of availability (x̄ = 1.60), adequacy (x̄ = 1.55) and utilization (x̄ = 1.64) of teaching materials (x̄ = 3.00 and x̄ = 1.50 being the maximum and minimum mean scores, respectively). Academic performance of the students was average, with a mean score of x̄ = 51.4775 out of a maximum of x̄ = 100. The TRV and teaching facilities made a significant composite contribution to IOC (F(6,45) = 3.92; R² = 0.26), accounting for 39% of the variance of IOC. Area of specialization (β = 29, t = 2.05) and teaching facilities (β = -25, t = 1.181) contributed significantly. The implementation of the bachelor’s degree Social Studies curriculum was effective in the colleges of education. There is a need to improve the provision of facilities to strengthen the implementation of the curriculum.

Keywords: bachelor degree in social studies, colleges of education in southwestern Nigeria, curriculum implementation, social studies curriculum

Procedia PDF Downloads 358
25383 Applying Spanning Tree Graph Theory for Automatic Database Normalization

Authors: Chetneti Srisa-an

Abstract:

In the knowledge and data engineering field, the relational database is the standard repository for storing real-world data and has been in use around the world for decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and normalization is rarely done automatically rather than manually. Moreover, for the large and complex databases common today, manual normalization is even harder. This paper presents a new, fully automated relational database normalization method. It first produces a directed graph and a spanning tree, and then proceeds to generate the 2NF, 3NF and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.
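
The paper's spanning-tree construction is not reproduced in the abstract, but the core subroutine of any normalization algorithm, computing the closure of an attribute set under a set of functional dependencies, can be sketched as follows; the relation and dependencies below are illustrative, not taken from the paper:

```python
def closure(attrs, fds):
    """Closure of a set of attributes under functional dependencies.

    fds: list of (lhs, rhs) pairs, each a set of attribute names.
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If the left side is already implied, absorb the right side.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# Hypothetical relation R(A, B, C, D) with A -> B and B -> C
fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
print(closure({"A"}, fds))       # A determines A, B and C but not D
print(closure({"A", "D"}, fds))  # all of R, so {A, D} is a candidate key
```

Testing which attribute sets are keys this way is what lets an automated tool decide whether a given decomposition satisfies 2NF, 3NF or BCNF.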

Keywords: relational database, functional dependency, automatic normalization, primary key, spanning tree

Procedia PDF Downloads 328
25382 An Eigen-Approach for Estimating the Direction-of-Arrival of Unknown Number of Signals

Authors: Dia I. Abu-Al-Nadi, M. J. Mismar, T. H. Ismail

Abstract:

A technique for estimating the direction-of-arrival (DOA) of an unknown number of source signals is presented using the eigen-approach. The eigenvector corresponding to the minimum eigenvalue of the autocorrelation matrix yields the minimum output power of the array, and the array polynomial formed with this eigenvector possesses roots on the unit circle. Therefore, the pseudo-spectrum is found by perturbing the phases of the roots one by one and calculating the corresponding array output power. The results indicate that the DOAs and the number of source signals are estimated accurately in the presence of a wide range of input noise levels.
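
Only the first step of this procedure, extracting the minimum-eigenvalue eigenvector of the array autocorrelation matrix, is sketched below with NumPy; the root-phase perturbation search is not reproduced, and the single-source scenario is illustrative:

```python
import numpy as np

def min_eigenvector(R):
    """Eigenvector of the autocorrelation matrix R belonging to the smallest
    eigenvalue; it gives the minimum output power of the array."""
    _, V = np.linalg.eigh(R)   # eigh: Hermitian input, eigenvalues ascending
    return V[:, 0]             # column for the minimum eigenvalue

# Illustrative 4-element array, one source at broadside plus white noise
n = 4
a = np.ones(n, dtype=complex)                    # steering vector, 0 degrees
R = np.outer(a, a.conj()) + 0.01 * np.eye(n)     # autocorrelation matrix
v = min_eigenvector(R)
print(abs(a.conj() @ v))   # effectively 0: orthogonal to the source steering vector
```

That orthogonality is exactly why the roots of the associated array polynomial carry the DOA information that the phase-perturbation step then extracts.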

Keywords: array signal processing, direction-of-arrival, antenna arrays, Eigenvalues, Eigenvectors, Lagrange multiplier

Procedia PDF Downloads 306
25381 Infant and Young Child-Feeding Practices in Mongolia

Authors: Otgonjargal Damdinbaljir

Abstract:

Background: Infant feeding practices play a major role in determining the nutritional status of children and are associated with household socioeconomic and demographic factors. In 2010, Mongolia used the WHO 2008 edition of the Indicators for Assessing Infant and Young Child Feeding Practices for the first time. Objective: To evaluate the feeding status of infants and young children under 2 years of age in Mongolia. Materials and Methods: The study was conducted by cluster random sampling. Data on breastfeeding and complementary feeding of 350 infants and young children aged 0-23 months in 21 provinces of the 4 economic regions of the country and the capital, Ulaanbaatar, were collected through questionnaires. Feeding status was analyzed according to the WHO 2008 edition of the indicators. Analysis of data: Survey data were analysed using PASW Statistics 18.0 and EPI INFO 2000. For calculation of overall measures for the entire survey sample, analyses were stratified by region. Age-specific feeding patterns were described using frequencies, proportions and survival analysis. Logistic regression was performed with feeding practice as the dependent variable and socio-demographic factors as independent variables. Simple proportions were calculated for each IYCF indicator, and differences in feeding practices between sexes and age groups were tested using the chi-square test. Ethics: The Ethics Committee under the auspices of the Ministry of Health approved the study. Results: A total of 350 children aged 0-23 months were investigated. The rate of ever breastfeeding among children aged 0-23 months reached 98.2%, while the percentage of early initiation of breastfeeding was only 85.5%. The rates of exclusive breastfeeding under 6 months, continued breastfeeding at 1 year, and continued breastfeeding at 2 years were 71.3%, 74% and 54.6%, respectively.
The median age of introduction of complementary food was the 6th month, and the median weaning age was the 9th month. The rate of timely introduction of complementary food at 6-8 months was 80.3%. The rates of minimum dietary diversity, minimum meal frequency, and consumption of iron-rich or iron-fortified foods among children aged 6-23 months were 52.1%, 80.8% (663/813) and 30.1%, respectively. Conclusions: The main problems revealed by the study were the inadequate variety and frequency of complementary food and the low rate of consumption of iron-rich or iron-fortified foods. Our findings highlight the need to encourage mothers to enrich their traditional wheat-based complementary foods by adding more animal-source foods and vegetables.

Keywords: complementary feeding, early initiation of breastfeeding, exclusive breastfeeding, minimum meal frequency

Procedia PDF Downloads 456
25380 A Comparative Analysis of Global Minimum Variance and Naïve Portfolios: Performance across Stock Market Indices and Selected Economic Regimes Using Various Risk-Return Metrics

Authors: Lynmar M. Didal, Ramises G. Manzano Jr., Jacque Bon-Isaac C. Aboy

Abstract:

This study analyzes the performance of global minimum variance (GMV) and naive portfolios across different economic periods, using monthly stock returns from the Philippine Stock Exchange Index (PSEI), S&P 500, and Dow Jones Industrial Average (DOW). The performance is evaluated through the Sharpe ratio, Sortino ratio, Jensen’s Alpha, Treynor ratio, and Information ratio. Additionally, the study investigates the impact of short selling on portfolio performance. Six time periods are defined for analysis, encompassing events such as the global financial crisis and the COVID-19 pandemic. Findings indicate that the naive portfolio generally outperforms the GMV portfolio in the S&P 500, signifying higher returns with increased volatility. Conversely, in the PSEI and DOW, the GMV portfolio shows more efficient risk-adjusted returns. Short selling significantly impacts the GMV portfolio during the mid-GFC and mid-COVID periods. The study offers insights for investors, suggesting the naive portfolio for higher risk tolerance and the GMV portfolio as a conservative alternative.
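
The unconstrained GMV portfolio (short selling allowed) has a well-known closed form, w = Σ⁻¹1 / (1ᵀΣ⁻¹1), which may help clarify how such weights are computed; the covariance matrix below is illustrative, not estimated from PSEI, S&P 500 or DOW data:

```python
import numpy as np

def gmv_weights(cov):
    """Global minimum variance weights: w = inv(Cov) 1 / (1' inv(Cov) 1).
    Short selling is allowed, so weights may be negative."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def sharpe(returns, rf=0.0):
    """Sharpe ratio of a (monthly) return series against a risk-free rate."""
    excess = np.asarray(returns) - rf
    return excess.mean() / excess.std(ddof=1)

# Illustrative 3-asset covariance matrix of monthly returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = gmv_weights(cov)
print(w, w.sum())    # weights sum to 1
print(w @ cov @ w)   # portfolio variance: no larger than any single asset's
```

The naive benchmark in the study is simply the equal-weight vector 1/n, so the comparison reduces to evaluating these two weight rules under the listed risk-adjusted metrics.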

Keywords: portfolio performance, global minimum variance, naïve portfolio, risk-adjusted metrics, short-selling

Procedia PDF Downloads 58
25379 Develop a Software to Hydraulic Redesign a Depropanizer Column to Minimize Energy Consumption

Authors: Mahdi Goharrokhi, Rasool Shiri, Eiraj Naser

Abstract:

A depropanizer column of a particular refinery was redesigned in this work. That is, the minimum reflux ratio, minimum number of trays, feed tray location and hydraulic characteristics of the tower were calculated and compared with the actual values of the existing tower. To review the design of the tower, fundamental equations were used to develop software whose results were compared with those of two commercial software packages; in each case the Peng-Robinson equation of state (PR EOS) was used. Based on the total energy consumption in the reboiler and condenser, the feed tray location was also determined using a case-study definition for the tower.
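
The paper's software is not shown, but one of the quantities it lists, the minimum number of trays, has a classical shortcut estimate: the Fenske equation for a binary split at total reflux. The compositions and relative volatility below are illustrative, not the depropanizer's actual values:

```python
import math

def fenske_nmin(x_dist, x_bot, alpha):
    """Fenske equation: minimum number of theoretical stages at total reflux
    for a binary split with constant relative volatility alpha:
    Nmin = ln[(xD / (1 - xD)) * ((1 - xB) / xB)] / ln(alpha)."""
    return math.log((x_dist / (1 - x_dist)) * ((1 - x_bot) / x_bot)) / math.log(alpha)

# Illustrative split: 95% light key overhead, 5% in the bottoms, alpha = 2.5
print(round(fenske_nmin(0.95, 0.05, 2.5), 1))  # roughly 6-7 theoretical stages
```

A rigorous tray-by-tray design, as in the paper, refines such shortcut values with the chosen equation of state and the column hydraulics.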

Keywords: column, hydraulic design, pressure drop, energy consumption

Procedia PDF Downloads 390
25378 A Hybrid Based Algorithm to Solve the Multi-objective Minimum Spanning Tree Problem

Authors: Boumesbah Asma, Chergui Mohamed El-amine

Abstract:

Since it has been shown that the multi-objective minimum spanning tree problem (MOST) is NP-hard even with two criteria, we propose in this study a hybrid NSGA-II algorithm with an exact mutation operator, used only with low probability, to find an approximation of the Pareto front of the problem. In a connected graph G, a spanning tree T of G is a connected and cycle-free subgraph; if k edges of G\T are added to T, we obtain a partial graph H of G inducing a multi-objective spanning tree problem of reduced size compared to the initial one. With a low probability for the mutation operator, an exact method for solving the reduced MOST problem on the graph H is then used to generate several mutated solutions from a spanning tree T. Then, the selection operator of NSGA-II is applied to obtain the Pareto front approximation. Finally, an adaptation of the VNS metaheuristic is called for further improvement of this front; it finds good individuals that balance diversification and intensification during the optimization search. Experimental comparisons with an exact method show promising results and indicate that the proposed algorithm is efficient.
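
The NSGA-II machinery is not reproduced here, but its central selection idea, keeping only non-dominated solutions, can be sketched as follows; the (cost, weight) pairs stand in for objective vectors of candidate spanning trees and are purely illustrative:

```python
def pareto_front(points):
    """Return the non-dominated points, minimizing every objective; this is
    the selection criterion at the heart of NSGA-II.  Assumes no duplicate
    points, so q != p suffices to require strict dominance."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Illustrative (cost, weight) values of candidate spanning trees
trees = [(3, 7), (4, 4), (5, 3), (6, 6), (4, 8)]
print(pareto_front(trees))  # (6, 6) and (4, 8) are dominated and dropped
```

In the hybrid algorithm above, this filter is applied to the union of parents and mutated offspring at every generation before the VNS refinement.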

Keywords: minimum spanning tree, multiple objective linear optimization, combinatorial optimization, non-sorting genetic algorithm, variable neighborhood search

Procedia PDF Downloads 64
25377 Reliability Analysis: A Case Study in Designing Power Distribution System of Tehran Oil Refinery

Authors: A. B. Arani, R. Shojaee

Abstract:

Electrical power distribution system is one of the vital infrastructures of an oil refinery, which requires wide area of study and planning before construction. In this paper, power distribution reliability of Tehran Refinery’s KHDS/GHDS unit has been taken into consideration to investigate the importance of these kinds of studies and evaluate the designed system. In this regard, the authors chose and evaluated different configurations of electrical power distribution along with the existing configuration with the aim of finding the most suited configuration which satisfies the conditions of minimum cost of electrical system construction, minimum cost imposed by loss of load, and maximum power system reliability.

Keywords: power distribution system, oil refinery, reliability, investment cost, interruption cost

Procedia PDF Downloads 842
25376 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand

Authors: Esma Birisci, Ronald McGarvey

Abstract:

One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and avoiding shortfalls (leaving some customers hungry) must be treated as two conflicting objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food with minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.

Keywords: environmental studies, food waste, production planning, uncertain and correlated demand

Procedia PDF Downloads 341
25375 Inbreeding Study Using Runs of Homozygosity in Nelore Beef Cattle

Authors: Priscila A. Bernardes, Marcos E. Buzanskas, Luciana C. A. Regitano, Ricardo V. Ventura, Danisio P. Munari

Abstract:

The best linear unbiased predictor (BLUP) is a method commonly used in the genetic evaluations of breeding programs. However, this approach can lead to higher inbreeding coefficients in the population due to the intensive use of a few bulls with higher genetic potential, usually presenting some degree of relatedness. High levels of inbreeding are associated with low genetic viability, fertility, and performance for some economically important traits, and should therefore be constantly monitored. Unreliable pedigree data can also lead to misleading results. Genomic information (i.e., single nucleotide polymorphisms, SNPs) is a useful tool to estimate the inbreeding coefficient. Runs of homozygosity have been used to evaluate homozygous segments inherited due to direct or collateral inbreeding and allow inferring population selection history. This study aimed to evaluate runs of homozygosity (ROH) and inbreeding in a population of Nelore beef cattle. A total of 814 animals were genotyped with the Illumina BovineHD BeadChip, and quality control was carried out excluding SNPs located in non-autosomal regions or with unknown position, with a Hardy-Weinberg equilibrium p-value lower than 10⁻⁵ or a call rate lower than 0.98, and samples with a call rate lower than 0.90. After quality control, 809 animals and 509,107 SNPs remained for analyses. For the ROH analysis, the PLINK software was used, considering segments with at least 50 SNPs and a minimum length of 1 Mb in each animal. The inbreeding coefficient was calculated as the ratio between the sum of all ROH sizes and the size of the whole genome (2,548,724 kb). A total of 25,711 ROH were observed, with mean, median, minimum, and maximum lengths of 3.34 Mb, 2 Mb, 1 Mb, and 80.8 Mb, respectively. The number of SNPs present in ROH segments varied from 50 to 14,954. The greatest total extent of ROH was observed in a single animal, which presented 634 Mb (24.88% of the genome).
Four bulls were among the 10 animals with the largest extent of ROH, presenting 11% of ROH with length greater than 10 Mb. Segments longer than 10 Mb indicate recent inbreeding; therefore, the results indicate an intensive use of few sires in the studied data. The distribution of ROH along the chromosomes showed that chromosomes 5 and 6 presented a large number of segments compared to the other chromosomes. The mean, median, minimum, and maximum inbreeding coefficients were 5.84%, 5.40%, 0.00%, and 24.88%, respectively. Although the mean inbreeding was considered low, the ROH indicate a recent and intensive use of few sires, which should be avoided for the genetic progress of the breed.
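
The F_ROH computation described above is a simple ratio; a minimal sketch, using the study's autosomal genome length of 2,548,724 kb but illustrative segment lengths, is:

```python
AUTOSOME_KB = 2_548_724  # autosomal genome length used in the study (kb)

def f_roh(roh_lengths_kb, genome_kb=AUTOSOME_KB):
    """Genomic inbreeding coefficient: total ROH length / genome length."""
    return sum(roh_lengths_kb) / genome_kb

# Illustrative animal carrying three ROH segments (lengths in kb)
segments = [1_200, 3_400, 80_800]
print(round(100 * f_roh(segments), 2))  # F_ROH expressed as a percentage
```

An animal whose entire autosome lay in ROH would score 1.0 (100%); the study's maximum of 24.88% corresponds to the 634 Mb individual mentioned above.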

Keywords: autozygosity, Bos taurus indicus, genomic information, single nucleotide polymorphism

Procedia PDF Downloads 126
25374 Prospective Study on the Efficacy of Bio Absorbable Screws in Treatment of Osteochondral Fractures

Authors: S. Anwar Sathik, K. Manoj Deepak, K. Venkatachalam

Abstract:

Our study is a prospective study on the use of bioabsorbable screws for the treatment of osteochondral fractures after patellar dislocation. 22 patients who presented with osteochondral fractures were treated in our institution and followed for a minimum of 12 months with regular radiological evaluation. Of the 22 patients, 2 had fragments that detached from the fracture site, which were treated arthroscopically. All the patients underwent open reduction and fixation of the fragments using bioabsorbable screws. They were immobilized in a cast for a minimum of 6 weeks, after which mobilization was started according to our protocol. Fracture consolidation occurred in 20 of the 22 patients. Thus, bioabsorbable screws can be used as a reliable method of fixation of osteochondral fragments.

Keywords: osteochondral fracture, bio absorbable pins, patella dislocation, physiotherapy

Procedia PDF Downloads 279
25373 Hybridized Approach for Distance Estimation Using K-Means Clustering

Authors: Ritu Vashistha, Jitender Kumar

Abstract:

Clustering with the K-means algorithm is a very common way to understand and analyze output data; grouping similar objects is the basis of clustering. Given n objects to be partitioned into K clusters, K is always supposed to be less than n, and each cluster has its own centroid; the major problem is to identify whether the clustering is correct based on the data. Forming the clusters is not a single pass over the records but an iterative process: each record, tuple or entity is checked and examined, and its similarity or dissimilarity to each centroid is evaluated. This iterative process can be very lengthy and may fail to give an optimal clustering in acceptable time. To overcome this drawback, we propose a formula to find the clusters at run time, so that the approach can give optimal results. The proposed approach uses the Euclidean distance formula, together with a heuristic, to find the minimum distance between slots (which we technically call clusters); we have also applied the same approach to the Ant Colony Optimization (ACO) algorithm, which results in the production of two- and multi-dimensional matrices.
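
The assignment step described above, measuring the Euclidean distance from a record to each centroid and picking the minimum, can be sketched as follows; the points and centroids are illustrative:

```python
import math

def euclidean(p, q):
    """Euclidean distance between two points of equal dimension."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def nearest_centroid(point, centroids):
    """Index of the centroid at minimum Euclidean distance from the point:
    the assignment step of one K-means iteration."""
    return min(range(len(centroids)),
               key=lambda i: euclidean(point, centroids[i]))

centroids = [(0.0, 0.0), (5.0, 5.0)]
print(nearest_centroid((1.0, 1.0), centroids))  # 0
print(nearest_centroid((4.0, 6.0), centroids))  # 1
```

A full K-means iteration alternates this assignment with recomputing each centroid as the mean of its assigned points, repeating until assignments stop changing.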

Keywords: ant colony optimization, data clustering, centroids, data mining, k-means

Procedia PDF Downloads 101
25372 Assessment of Hargreaves Equation for Estimating Monthly Reference Evapotranspiration in the South of Iran

Authors: Ali Dehgan Moroozeh, B. Farhadi Bansouleh

Abstract:

Evapotranspiration is one of the most important components of the hydrological cycle. Reference evapotranspiration (ETo) is an important variable in water and energy balances on the earth’s surface, and knowledge of the distribution of ET is a key factor in hydrology, climatology, agronomy and ecology studies. Many researchers have proposed relationships, expressed as functions of climatic factors, to estimate potential evapotranspiration and thereby help prevent plant water stress. The FAO-Penman method (PM) is recommended as the standard method, but it requires many data that are not available in every area of the world, so other methods should be evaluated for such conditions. When sufficient or reliable data to solve the PM equation are not available, the Hargreaves equation can be used. The Hargreaves equation (HG) requires only daily mean, maximum and minimum air temperature and extraterrestrial radiation. In this study, the Hargreaves method (HG) and a modified version (M.HG) were evaluated at 12 stations in the northwest region of Iran, and their results were compared with those of the PM method. Statistical analysis of this comparison showed that the calibration process had a significant effect on the efficiency of the Hargreaves method.
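
The Hargreaves-Samani form of the HG equation, ETo = 0.0023 · Ra · (Tmean + 17.8) · (Tmax − Tmin)^0.5, uses only the inputs listed above; a minimal sketch with illustrative values (Ra expressed as equivalent evaporation in mm/day) is:

```python
def hargreaves_et0(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evapotranspiration (mm/day).

    t_mean, t_max, t_min: daily air temperatures (deg C)
    ra: extraterrestrial radiation as equivalent evaporation (mm/day)
    """
    return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# Illustrative summer day: Ra = 17.5 mm/day, Tmax = 34 C, Tmin = 18 C
print(round(hargreaves_et0(26.0, 34.0, 18.0, 17.5), 2))  # about 7 mm/day
```

Calibration of the kind the study describes typically adjusts the 0.0023 coefficient (or the exponent) to local PM-derived values, which is why it changes the method's efficiency so markedly.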

Keywords: evapotranspiration, hargreaves equation, FAO-Penman method

Procedia PDF Downloads 371
25371 Efficient Antenna Array Beamforming with Robustness against Random Steering Mismatch

Authors: Ju-Hong Lee, Ching-Wei Liao, Kun-Che Lee

Abstract:

This paper deals with the problem of using antenna sensors for adaptive beamforming in the presence of random steering mismatch, and presents an efficient adaptive array beamformer that is robust against this mismatch. The robustness of the proposed beamformer comes from the efficient designation of the steering vector. Using the received array data vector, we construct an appropriate correlation matrix associated with the received array data vector and a correlation matrix associated with the signal sources. Then, the eigenvector associated with the largest eigenvalue of the constructed signal correlation matrix is designated as an appropriate estimate of the steering vector. Finally, the adaptive weight vector required for adaptive beamforming is obtained from the estimated steering vector and the constructed correlation matrix of the array data vector. Simulation results confirm the effectiveness of the proposed method.
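
A minimal NumPy sketch of the two steps described, designating the principal eigenvector of a signal correlation matrix as the steering estimate and then forming minimum-variance (MVDR-style) weights from the data correlation matrix, is given below; the array geometry and noise level are illustrative, and the construction of the two matrices is not the authors' exact procedure:

```python
import numpy as np

def estimate_steering(Rs):
    """Designate the eigenvector of the signal correlation matrix Rs that
    belongs to the largest eigenvalue as the steering-vector estimate."""
    _, V = np.linalg.eigh(Rs)   # eigh: Hermitian input, eigenvalues ascending
    return V[:, -1]

def mvdr_weights(R, steer):
    """Minimum-variance distortionless-response weights:
    w = inv(R) a / (a^H inv(R) a)."""
    Ri_a = np.linalg.solve(R, steer)
    return Ri_a / (steer.conj() @ Ri_a)

# Illustrative 4-element array: one source plus white noise
n = 4
a = np.exp(1j * np.pi * np.arange(n) * np.sin(0.3))   # true steering vector
Rs = np.outer(a, a.conj())                            # signal correlation matrix
R = Rs + 0.1 * np.eye(n)                              # array-data correlation matrix
a_hat = estimate_steering(Rs)
w = mvdr_weights(R, a_hat)
print(abs(w.conj() @ a_hat))  # approximately 1: distortionless toward the estimate
```

Because the weights are anchored to the eigenvector estimate rather than an assumed nominal steering vector, random steering mismatch degrades the response far less.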

Keywords: adaptive beamforming, antenna array, linearly constrained minimum variance, robustness, steering vector

Procedia PDF Downloads 170
25370 Analysis of Arthroscopic Rotator Cuff Repair

Authors: Prakash Karrun, M. Manoj Deepak, Mathivanan, K. Venkatachalam

Abstract:

Our study aims to evaluate the healing rates and the efficacy of arthroscopic repair of rotator cuff tears. 40 patients who had rotator cuff tears were taken up for the study, and arthroscopic repair was performed with a double-row technique. They were followed up for a minimum of 2 years. Functional status, range of motion and healing rates were compared post-operatively. All the patients were followed up with serial questionnaires and MRI at the end of 2 years. There was significant improvement in the functional status of the patients, and MRI showed good rates of healing. Thus our study supports the efficacy of this operative technique.

Keywords: rotator cuff tear, arthroscopic repair, double stitch, healing

Procedia PDF Downloads 322
25369 Some Codes for Variants in Graphs

Authors: Sofia Ait Bouazza

Abstract:

We consider the problem of finding a minimum identifying code in a graph. This problem was introduced in 1998 and has since been fundamentally connected to a wide range of applications (fault diagnosis, location detection, etc.). Suppose we have a building in which we need to place fire alarms, and each alarm is designed so that it can detect any fire that starts either in the room in which it is located or in any room that shares a doorway with that room. We want the sounding alarms not only to detect any fire but also to tell exactly where the fire is located in the building, and for reasons of cost we want to use as few alarms as necessary. The first version of this problem involves finding a minimum dominating set of a graph. If the alarms are three-state alarms capable of distinguishing between a fire in the same room as the alarm and a fire in an adjacent room, we are trying to find a minimum locating-dominating set. If the alarms are two-state alarms that can only sound if there is a fire somewhere nearby, we are looking for a differentiating-dominating set of the graph. These three areas are the subject of much active research; we primarily focus on the third problem. An identifying code of a graph G is a dominating set C such that every vertex x of G is distinguished from the other vertices by the set of vertices in C that are at distance at most r ≥ 1 from x. When only vertices outside the code are required to be identified, we get the related concept of a locating-dominating set. The problem of finding an identifying code (resp. a locating-dominating code) of minimum size is NP-hard, even when the input graph belongs to a number of specific graph classes. Therefore, we study this problem in some restricted classes of undirected graphs, such as split graphs and line graphs, and in paths in directed graphs.
We then present some results on identifying codes, giving exact values of the upper total locating-domination number and of a total 2-identifying code in directed and undirected graphs. Moreover, we determine exact values of the locating-dominating code and edge identifying code of the thin headless spider, and of the locating-dominating code of complete suns.
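
For r = 1, whether a candidate set is an identifying code can be checked directly from the definition: every vertex must receive a nonempty and distinct set of codewords within its closed neighbourhood. The path P4 below is an illustrative example, not one of the graph classes studied:

```python
def is_identifying_code(graph, code):
    """Check whether `code` is a 1-identifying code of `graph`: each vertex
    must get a nonempty, pairwise-distinct signature N[v] intersected with
    the code.  graph: dict mapping each vertex to the set of its neighbours."""
    signatures = {}
    for v in graph:
        sig = frozenset(({v} | graph[v]) & code)
        if not sig or sig in signatures.values():
            return False     # undominated vertex, or two vertices confounded
        signatures[v] = sig
    return True

# Path P4:  1 - 2 - 3 - 4
p4 = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(is_identifying_code(p4, {2, 3, 4}))  # True: all signatures distinct
print(is_identifying_code(p4, {1, 3}))     # False: vertices 3 and 4 clash
```

Finding a *minimum* such set is the NP-hard part; the check above only verifies a given candidate.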

Keywords: identifying codes, locating dominating set, split graphs, thin headless spider

Procedia PDF Downloads 437
25368 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because resources are insufficient to re-execute all the test cases in a time-constrained environment, efforts are ongoing to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural food-searching behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm to optimize the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code under test. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC) and Ant Colony Optimization (ACO). The output of this study will be useful to testers, as they can achieve 100% path coverage with a minimum number of test cases.
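
The bat algorithm itself is a general-purpose metaheuristic; a heavily simplified one-dimensional sketch follows (loudness and pulse-rate scheduling are omitted, and the objective is an illustrative function rather than a test-data fitness function):

```python
import random

def bat_minimize(f, lo, hi, n_bats=20, n_iter=200, seed=1):
    """Simplified 1-D bat algorithm: frequency-tuned velocities pull bats
    toward the best-known solution, plus a small random walk around it."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_bats)]
    v = [0.0] * n_bats
    best = min(x, key=f)
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = rng.random()                       # random frequency in [0, 1]
            v[i] += (best - x[i]) * freq              # pull toward the best bat
            cand = min(hi, max(lo, x[i] + v[i]))
            if rng.random() < 0.5:                    # local walk near the best
                cand = min(hi, max(lo, best + 0.01 * rng.gauss(0.0, 1.0)))
            if f(cand) < f(x[i]):                     # greedy acceptance
                x[i] = cand
                if f(cand) < f(best):
                    best = cand
    return best

# Illustrative objective: minimize (t - 3)^2 over [-10, 10]
sol = bat_minimize(lambda t: (t - 3.0) ** 2, -10.0, 10.0)
print(sol)  # converges near 3
```

In test-suite optimization, f would instead score a candidate test-data vector by coverage and redundancy, with the same search loop left unchanged.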

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 336
25367 Clutter Suppression Based on Singular Value Decomposition and Fast Wavelet Algorithm

Authors: Ruomeng Xiao, Zhulin Zong, Longfa Yang

Abstract:

Aiming at the problem that the target signal is difficult to detect in a strong ground clutter environment, this paper proposes a clutter suppression algorithm based on the combination of singular value decomposition and the Mallat fast wavelet algorithm. The method first performs singular value decomposition on the radar echo data matrix and achieves an initial separation of target and clutter through threshold processing of the singular values. It then performs wavelet decomposition on the echo data to find the target location and uses a discard strategy to select the appropriate decomposition level for reconstructing the target signal, which ensures minimum loss of target information while suppressing the clutter. Verification with measured data shows that the method has a significant effect on target extraction under low SCR; target reconstruction can be achieved without prior information about the target position, and the method also offers some improvement in output SCR compared with the traditional single-wavelet processing method.
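
The initial SVD separation step can be sketched as follows: strong clutter concentrates in the largest singular values, so zeroing them and reconstructing leaves an approximation of the target signal. The matrix sizes, clutter model and number of discarded components are illustrative, and the subsequent wavelet stage is not reproduced:

```python
import numpy as np

def suppress_clutter(data, n_clutter):
    """Zero the n_clutter largest singular values of the echo data matrix and
    reconstruct; the dominant components carry the strong ground clutter."""
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    s[:n_clutter] = 0.0
    return U @ np.diag(s) @ Vt

# Illustrative echoes: rank-1 clutter 100x stronger than a weak target
rng = np.random.default_rng(0)
clutter = 100.0 * np.outer(rng.standard_normal(16), rng.standard_normal(32))
target = rng.standard_normal((16, 32))
cleaned = suppress_clutter(clutter + target, n_clutter=1)
print(np.linalg.norm(cleaned - target) / np.linalg.norm(target))  # relative residual
```

In practice the paper thresholds the singular values rather than fixing n_clutter, which is what allows the separation to adapt to the actual clutter strength.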

Keywords: clutter suppression, singular value decomposition, wavelet transform, Mallat algorithm, low SCR

Procedia PDF Downloads 81
25366 Multiple Version of Roman Domination in Graphs

Authors: J. C. Valenzuela-Tripodoro, P. Álvarez-Ruíz, M. A. Mateos-Camacho, M. Cera

Abstract:

In 2004, the concept of Roman domination in graphs was introduced. This concept was initially inspired by and related to the defensive strategy of the Roman Empire. An undefended place is a city on which no legions are stationed, whereas a strong place is a city in which two legions are deployed. This situation may be modeled by labeling the vertices of a finite simple graph with labels {0, 1, 2}, satisfying the condition that any 0-vertex must be adjacent to at least one 2-vertex. Roman domination in graphs is a variant of classic domination. Clearly, the main aim is to obtain such a labeling of the vertices of the graph with minimum cost, that is to say, minimum weight (the sum of all vertex labels). Formally, a function f: V(G) → {0, 1, 2} is a Roman dominating function (RDF) in the graph G = (V, E) if f(u) = 0 implies that f(v) = 2 for at least one vertex v adjacent to u. The weight of an RDF is the positive integer w(f) = Σ_{v∈V} f(v). The Roman domination number, γ_R(G), is the minimum weight among all Roman dominating functions. Obviously, the set of vertices with a positive label under an RDF f is a dominating set of the graph, and hence γ(G) ≤ γ_R(G). In this work, we start the study of a generalization of RDFs in which any undefended place should be defended from a sudden attack by at least k legions, which can be deployed in the city or in any of its neighbours. A function f: V → {0, 1, . . . , k + 1} such that f(N[u]) ≥ k + |AN(u)| for every vertex u with f(u) < k, where AN(u) denotes the set of active neighbours (i.e., those with a positive label) of vertex u, is called a [k]-multiple Roman dominating function and is denoted by [k]-MRDF. The minimum weight of a [k]-MRDF on the graph G is the [k]-multiple Roman domination number ([k]-MRDN) of G, denoted by γ_[kR](G). First, we prove that the [k]-multiple Roman domination decision problem is NP-complete even when restricted to bipartite and chordal graphs, a question that had been resolved for other variants and that we wanted to generalize.
Given the difficulty of calculating the exact value of the [k]-MRD number, even for particular families of graphs, we present several upper and lower bounds that allow it to be estimated with as much precision as possible. Finally, some graphs for which the exact value of this parameter is known are characterized.
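
For the classic RDF defined above, the condition is easy to verify programmatically; the star graph below is an illustrative example:

```python
def is_rdf(graph, f):
    """Check the classic Roman domination condition: every vertex labelled 0
    must have a neighbour labelled 2.  graph: vertex -> set of neighbours."""
    return all(f[u] != 0 or any(f[v] == 2 for v in graph[u]) for u in graph)

def weight(f):
    """Weight of a labeling: the sum of all vertex labels."""
    return sum(f.values())

# Star K_{1,4}: one centre (vertex 0) adjacent to four leaves
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
f = {0: 2, 1: 0, 2: 0, 3: 0, 4: 0}     # two legions on the centre
print(is_rdf(star, f), weight(f))       # True 2: gamma_R of a star is 2
```

Computing γ_R(G) itself means minimizing this weight over all valid labelings, which is the NP-hard part that the [k]-multiple generalization inherits.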

Keywords: multiple roman domination function, decision problem np-complete, bounds, exact values

Procedia PDF Downloads 70
25365 Processing Big Data: An Approach Using Feature Selection

Authors: Nikat Parveen, M. Ananthi

Abstract:

Big data is one of the emerging technologies: data are collected from various sensors and used in many fields. Data retrieval is a major issue, since the exact data must be extracted as needed. In this paper, a large data set is processed using feature selection, which helps to choose only the data that are actually needed to process and execute the task. The key value points to the exact data available in the storage space. Here, the available data are streamed, and R-Center is proposed to achieve this task.
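The R-Center method itself is not detailed in the abstract, so as a hedged stand-in, the following sketch illustrates the general idea of filter-style feature selection — keeping only the columns worth processing — using a simple variance criterion:

```python
# Illustrative sketch of filter-style feature selection on a large dataset.
# The paper's R-Center method is not specified here; a variance threshold is
# used purely as a stand-in for choosing the features worth processing.
import numpy as np

def select_by_variance(X, k):
    """Keep the k columns (features) with the highest variance."""
    variances = X.var(axis=0)
    keep = np.sort(np.argsort(variances)[-k:])   # indices of the top-k features
    return keep, X[:, keep]

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
X[:, 3] *= 5.0       # feature 3 carries much more spread
X[:, 7] *= 3.0       # feature 7 as well
idx, X_small = select_by_variance(X, 2)
print(idx)           # [3 7]
```

Downstream processing then operates on `X_small` only, which is the practical payoff of feature selection on large data: less storage traffic and less computation per task.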

Keywords: big data, key value, feature selection, retrieval, performance

Procedia PDF Downloads 308
25364 Drag Reduction of Base Bleed at Various Flight Conditions

Authors: Man Chul Jeong, Hyoung Jin Lee, Sang Yoon Lee, Ji Hyun Park, Min Wook Chang, In-Seuck Jeung

Abstract:

This study focuses on the drag reduction effect of base bleed in supersonic flow. Base bleed is a method that bleeds gas at the tail of a flight vehicle to reduce the base drag, which accounts for over 50% of the total drag at any flight speed. Base bleed can therefore reduce the total drag significantly and extend the total flight range. The drag reduction ratio of base bleed is strongly related to the mass flow rate of the bleeding gas, so selecting an appropriate mass flow rate is important. However, since a flight vehicle passes through a range of flight speeds, the same mass flow rate can have a different drag reduction effect over the course of the flight. This study therefore investigates the drag reduction effect as a function of flight speed by numerical analysis using STAR-CCM+. The analysis model is a 155 mm diameter projectile with a boat-tailed base; the boat-tail angle was chosen beforehand for minimum drag coefficient. Numerical analysis is conducted at Mach 2 and Mach 3 for various mass flow rates of the bleeding gas, expressed as the injection parameter I, with the temperature of the bleeding gas fixed at 300 K. The results showed that I = 0.025 gives the minimum drag at Mach 2 and I = 0.014 gives the minimum drag at Mach 3. Thus, the higher the Mach number, the lower the base bleed mass flow rate that is most effective for drag reduction.

Keywords: base bleed, supersonic, drag reduction, recirculation

Procedia PDF Downloads 389
25363 Autism Spectrum Disorder Classification Algorithm Using Multimodal Data Based on Graph Convolutional Network

Authors: Yuntao Liu, Lei Wang, Haoran Xia

Abstract:

Machine learning has shown extensive application in the development of classification models for autism spectrum disorder (ASD) using neural imaging data. This paper proposes a fused multimodal classification network based on a graph neural network. First, the brain is segmented into 116 regions of interest using a medical segmentation template (AAL, Anatomical Automatic Labeling). The image features of sMRI and the signal features of fMRI are extracted, and these build the node and edge embedding representations of the brain graph. Then, we construct a dynamically updated brain graph neural network and propose a method based on a dynamic brain graph adjacency matrix update mechanism and a learnable graph to further improve the accuracy of autism diagnosis and recognition. On the Autism Brain Imaging Data Exchange I dataset (ABIDE I), we reach a prediction accuracy of 74% between ASD and TD subjects. In addition, to study biomarkers that can help doctors analyze the disease and to improve interpretability, we extracted the features with the top five maximum and minimum ROI weights. This work provides a meaningful way forward for brain disorder identification.
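The paper's dynamic-adjacency network is not reproduced here, but a minimal sketch of a single graph convolution step over a 116-region brain graph (with random illustrative data) conveys the core propagation rule such models build on:

```python
# Minimal sketch of one graph convolution step (Kipf-Welling style) over a
# brain graph with N regions of interest. The 116-ROI AAL setup is kept, but
# the data and the paper's dynamic adjacency update are illustrative stand-ins.
import numpy as np

def gcn_layer(A, X, W):
    """One propagation step: H = ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    d = A_hat.sum(axis=1)                      # degrees (>= 1, so no div by 0)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))     # symmetric normalization
    H = D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W
    return np.maximum(H, 0.0)                  # ReLU

N, F_in, F_out = 116, 8, 4                     # e.g. 116 AAL regions
rng = np.random.default_rng(1)
A = (rng.random((N, N)) > 0.9).astype(float)
A = np.maximum(A, A.T)                         # symmetric adjacency (edge embeddings)
X = rng.normal(size=(N, F_in))                 # node features (sMRI/fMRI derived)
W = rng.normal(size=(F_in, F_out))             # learnable layer weights
H = gcn_layer(A, X, W)
print(H.shape)                                 # (116, 4)
```

In the dynamic variant described above, `A` itself would be a learnable quantity updated during training rather than a fixed matrix.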

Keywords: autism spectrum disorder, brain map, supervised machine learning, graph network, multimodal data, model interpretability

Procedia PDF Downloads 19
25362 Surface Flattening Assisted with 3D Mannequin Based on Minimum Energy

Authors: Shih-Wen Hsiao, Rong-Qi Chen, Chien-Yu Lin

Abstract:

The topic of surface flattening plays a vital role in computer-aided design and manufacture. Surface flattening enables the production of 2D patterns and can be used in design and manufacturing to develop a 3D surface onto a 2D platform, especially in fashion design. This study describes surface flattening based on minimum-energy methods that account for the properties of different fabrics. Firstly, using the geometric features of a 3D surface, the less-deformed area can be flattened onto a 2D platform by geodesics. Then, the strain energy accumulated in the mesh is stably released by an approximate implicit method and a revised error function. In some cases, cutting the mesh to further release energy is a common way to handle the situation and enhance the accuracy of the flattening; this causes the obtained 2D pattern to naturally develop significant cracks. When this methodology is applied to a 3D mannequin constructed with feature lines, it raises the level of computer-aided fashion design. Moreover, when different fabrics are used in fashion design, the shape of a 2D pattern must be revised according to the properties of the fabric. With this model, the outline of 2D patterns can be revised by distributing the strain energy, with different results for different fabric properties. Finally, this research uses some common design cases to illustrate and verify the feasibility of the methodology.
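As an illustrative sketch (not the paper's solver), the strain-energy objective behind minimum-energy flattening can be written as a sum of spring energies over mesh edges, comparing 2D edge lengths against the original 3D rest lengths:

```python
# Sketch of the strain-energy idea behind minimum-energy flattening: mesh edges
# are treated as springs, and the energy measures how far a 2D layout deviates
# from the original 3D edge lengths. The paper's solver (approximate implicit
# method, revised error function, fabric-dependent stiffness) is not reproduced.
import numpy as np

def strain_energy(pts2d, edges, rest_len, k=1.0):
    """Sum of 0.5*k*(|edge| - rest_length)^2 over all mesh edges.
    The stiffness k would vary with fabric properties."""
    E = 0.0
    for (i, j), L0 in zip(edges, rest_len):
        L = np.linalg.norm(pts2d[i] - pts2d[j])
        E += 0.5 * k * (L - L0) ** 2
    return E

# Toy example: a triangle whose 3D rest lengths are all 1.0
edges = [(0, 1), (1, 2), (2, 0)]
rest = [1.0, 1.0, 1.0]
flat = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
print(strain_energy(flat, edges, rest))  # ~0: a flat triangle flattens exactly
```

A developable patch reaches energy near zero, while a doubly curved patch cannot; minimizing (or cutting the mesh to release) this residual energy is what produces the final 2D pattern.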

Keywords: surface flattening, strain energy, minimum energy, approximate implicit method, fashion design

Procedia PDF Downloads 309
25361 Combined Localization, Beamforming, and Interference Threshold Estimation in Underlay Cognitive System

Authors: Omar Nasr, Yasser Naguib, Mohamed Hafez

Abstract:

This paper aims at providing an innovative solution for blind interference threshold estimation in an underlay cognitive network, to be used in adaptive beamforming by the secondary-user transmitter and receiver. For threshold estimation, blind detection of the modulation and SNR is used. For beamforming, several localization algorithms are compared in order to settle on the best one for a cognitive environment. Beamforming algorithms such as LCMV (Linearly Constrained Minimum Variance) and MVDR (Minimum Variance Distortionless Response) are also proposed and compared. The idea of simply nulling the primary user once its location is known is discussed against the idea of operating under an interference threshold.
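As a hedged illustration of the MVDR beamformer mentioned above (the array geometry and scenario values are assumptions, not from the paper), the distortionless weights w = R⁻¹a / (aᴴR⁻¹a) can be computed as:

```python
# Sketch of MVDR (Capon) beamformer weights for a uniform linear array:
# w = R^-1 a / (a^H R^-1 a), minimizing output power subject to a
# distortionless response toward the desired direction.
import numpy as np

def steering_vector(n_ant, theta_deg, d=0.5):
    """ULA steering vector; element spacing d in wavelengths."""
    n = np.arange(n_ant)
    return np.exp(-2j * np.pi * d * n * np.sin(np.deg2rad(theta_deg)))

def mvdr_weights(R, a):
    Ri_a = np.linalg.solve(R, a)       # R^-1 a without forming the inverse
    return Ri_a / (a.conj() @ Ri_a)

n_ant = 8
a = steering_vector(n_ant, 20.0)       # desired (secondary) user at 20 degrees
jam = steering_vector(n_ant, -40.0)    # interferer (primary user) at -40 degrees
R = np.outer(jam, jam.conj()) * 10.0 + np.eye(n_ant) * 0.1  # interference + noise covariance
w = mvdr_weights(R, a)
print(abs(w.conj() @ a))               # = 1: distortionless toward desired user
print(abs(w.conj() @ jam))             # << 1: interferer strongly suppressed
```

This is the trade-off the abstract discusses: an MVDR/LCMV solution suppresses the primary user adaptively, whereas hard nulling requires accurate prior localization of that user.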

Keywords: cognitive radio, underlay, beamforming, MUSIC, MVDR, LCMV, threshold estimation

Procedia PDF Downloads 555
25360 Assessment of Soil Quality Indicators in Rice Soil of Tamil Nadu

Authors: Kaleeswari R. K., Seevagan L .

Abstract:

Soil quality in an agroecosystem is influenced by the cropping system and by water and soil fertility management. A valid soil quality index would help to assess soil and crop management practices for the desired productivity and soil health. Soil quality indices also provide an early indication of soil degradation and of needed remedial and rehabilitation measures. Imbalanced fertilization and inadequate organic carbon dynamics deteriorate soil quality in an intensive cropping system. The rice soil ecosystem differs from other arable systems since rice is grown under submergence, which requires a different set of key soil attributes for enhancing soil quality and productivity. Assessment of a soil quality index involves indicator selection, indicator scoring, and aggregation of the scores into one index. The most appropriate indicators for evaluating soil quality can be selected by establishing the minimum data set, which can be screened by linear and multiple regression, factor analysis, and score functions. This investigation was carried out in the intensive rice-cultivating regions (each having >1.0 lakh hectares) of Tamil Nadu, viz., Thanjavur, Thiruvarur, Nagapattinam, Villupuram, Thiruvannamalai, Cuddalore and Ramanathapuram districts. In each district, an intensive rice-growing block was identified, and in each block, two sampling grids (10 x 10 sq. km) were used with a sampling depth of 10 - 15 cm. Soil sampling was carried out at various locations in the study area using GIS coordinates. The numbers of soil sampling points were 41, 28, 28, 32, 37, 29 and 29 in Thanjavur, Thiruvarur, Nagapattinam, Cuddalore, Villupuram, Thiruvannamalai and Ramanathapuram districts, respectively. Principal Component Analysis is a data reduction tool used to select potential indicators: each Principal Component is a linear combination of the variables that captures the maximum remaining variance of the dataset.
Principal Components with eigenvalues equal to or higher than 1.0 were taken as the minimum data set. Principal Component Analysis was used to select the representative soil quality indicators in rice soils based on factor loading values and percent contributions. Variables having significant differences within the production system were used for the preparation of the minimum data set. Each Principal Component explained a certain proportion (%) of the variation in the total dataset, and this percentage provided the weight for its variables. The final PCA-based soil quality equation is SQI = Σᵢ (Wᵢ × Sᵢ), where Sᵢ is the score for the subscripted variable and Wᵢ is the weighting factor derived from PCA. Higher index scores indicate better soil quality. Soil respiration, soil available nitrogen and potentially mineralizable nitrogen were identified as soil quality indicators for the rice soils of the Cauvery Delta zone, covering the Thanjavur, Thiruvarur and Nagapattinam districts. Soil available phosphorus could be used as a soil quality indicator for the rice soils of the Cuddalore district. In the rain-fed rice ecosystems of coastal sandy soils, DTPA-Zn could be used as an effective soil quality indicator. Among the soil parameters selected by Principal Component Analysis, microbial biomass nitrogen could be used as a quality indicator for the rice soils of the Villupuram district. The Cauvery Delta zone has a better SQI than the other intensive rice-growing zones of Tamil Nadu.
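The PCA-based scoring described above can be sketched as follows, using synthetic data in place of the Tamil Nadu measurements and an assumed "more is better" linear scoring:

```python
# Sketch of the PCA-based soil quality index: PCs with eigenvalue >= 1 form
# the minimum data set, the variance explained by each retained PC provides
# the weight W_i, and SQI = sum(W_i * S_i). Data are synthetic stand-ins for
# the measured soil attributes.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(120, 6))                  # 120 samples x 6 soil attributes
Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each attribute

cov = np.cov(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]              # sort PCs by decreasing eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals >= 1.0                          # Kaiser criterion: minimum data set
weights = eigvals[keep] / eigvals[keep].sum()  # W_i = share of explained variance

# Each retained PC is represented by the attribute with the highest loading;
# its min-max scaled score S_i enters the index as W_i * S_i.
indicators = np.argmax(np.abs(eigvecs[:, keep]), axis=0)
scores = Xs[:, indicators] - Xs[:, indicators].min(axis=0)
scores /= scores.max(axis=0)                   # linear "more is better" scoring
sqi = scores @ weights                         # one SQI value per sample
print(sqi.shape)                               # (120,)
```

Because the weights sum to one and each score lies in [0, 1], the resulting SQI is bounded in [0, 1], with higher values indicating better soil quality.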

Keywords: soil quality index, soil attributes, soil mapping, and rice soil

Procedia PDF Downloads 52
25359 Screening the Best Integrated Pest Management Treatments against Helicoverpa armigera

Authors: Ajmal Khan Kassi, Humayun Javed, Tariq Mukhtar

Abstract:

The research was conducted to screen okra varieties for resistance and susceptibility to Helicoverpa armigera under field conditions in 2016. In this experiment, different management practices, viz. release of Trichogramma chilonis, hoeing and weeding, clipping, and lufenuron, were tested individually and in all possible combinations for the control of American bollworm at three localities, viz. the University Research Farm Koont, the National Agricultural Research Centre (NARC), and a farmer's field at Taxila, using the resistant variety Arka Anamika. All treatment combinations showed significant results for shoot and fruit damage. The minimum fruit infestation, i.e., 3.20% and 3.58%, was recorded with the combined treatment (T. chilonis + hoeing + weeding + lufenuron) at two of the localities. The minimum shoot infestation, i.e., 7.18%, 7.08%, and 6.85%, was also observed with this combined treatment at all three localities. The same combined treatment likewise produced the maximum yields at NARC and Taxila, i.e., 57.67 and 62.66 q/ha, respectively. On the basis of the combined treatment at the three localities, the Arka Anamika variety proved comparatively resistant to H. armigera. This variety is therefore recommended for cultivation in the Pothwar region to obtain maximum yield with minimum losses to H. armigera.

Keywords: okra, screening, combine treatment, Helicoverpa armigera

Procedia PDF Downloads 127
25358 Performance Evaluation and Economic Analysis of Minimum Quantity Lubrication with Pressurized/Non-Pressurized Air and Nanofluid Mixture

Authors: M. Amrita, R. R. Srikant, A. V. Sita Rama Raju

Abstract:

Water-miscible cutting fluids are conventionally used to lubricate and cool the machining zone. However, issues related to health hazards, maintenance, and disposal costs have limited their usage, leading to the application of Minimum Quantity Lubrication (MQL). To increase the effectiveness of MQL, nano-cutting fluids are proposed. In the present work, water-miscible nanographite cutting fluids of varying concentration are applied at the cutting zone by two systems, A and B. System A utilizes high-pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low-pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their machining performance is evaluated by measuring cutting temperatures, tool wear, cutting forces, and surface roughness, and is compared with dry machining and flood machining. Application of nano-cutting fluid by both systems showed better performance than dry machining. Cutting temperatures and cutting forces obtained by both techniques are higher than in flood machining, but tool wear and surface roughness showed improvement compared to flood machining. An economic analysis has been carried out in all cases to decide the applicability of the techniques.

Keywords: economic analysis, machining, minimum quantity lubrication, nanofluid

Procedia PDF Downloads 357
25357 Optimal Number of Reconfigurable Robots in a Transport System

Authors: Mari Chaikovskaia, Jean-Philippe Gayon, Alain Quilliot

Abstract:

We consider a fleet of elementary robots that can be connected in different ways to transport loads of different types. For instance, a single robot can transport a small load, and the association of two robots can transport either a large load or two small loads. We seek to determine the optimal number of robots to transport a set of loads in a given time interval, with or without reconfiguration. We show that the problem with reconfiguration is strongly NP-hard by a reduction from the bin-packing problem. Then, we study a special case with unit capacities and derive simple formulas for the minimum number of robots, for up to 3 types of loads. For this special case, we compare the minimum number of robots with and without reconfiguration and show that the gain is limited in absolute value but may be significant for small fleets.

Keywords: fleet sizing, reconfigurability, robots, transportation

Procedia PDF Downloads 56
25356 Evolution of Fluvial-Deltaic System Recorded in Accumulation of Organic Material: From the Example of the Kura River in the South Caspian Basin

Authors: Dadash Huseynov, Elmira Aliyeva, Robert Hoogendoorn, Salomon Kroonenberg

Abstract:

The study of organic material in bottom sediments, together with lithologic and biostratigraphic data, improves our understanding of the evolution of fluvial and deltaic systems. The modern Kura River delta is located in the southwestern Caspian Sea and is fluvial-dominated. The river distributes its sediment load through three channels oriented northeast, southeast, and southwest. The offshore modern delta consists of thinly bedded or laminated silty clays and dark grey clays; locally, sand- and shell-rich horizons occur. The onshore delta is composed of channel-levee sands and floodplain silts and clays. Overall sedimentation rates in the delta, determined by the 210Pb method, range between 1.5 and 3.0 cm/yr. We investigated the distribution of organic material in the deltaic sediments in 300 samples selected from 3 m deep piston cores. Studies of transparent sections demonstrate that the deltaic sediments are enriched in terrestrial debris. This debris is non-transparent and has an irregular, isometric, or elongated shape, angular edges, a black or dark-brown colour, and a clearly expressed fabric. It is partially dissolved at the edges and replaced by iron sulphides. Fragments of marine algae have smoother edges and a brown colour; they are transparent, their fabric is rarely preserved, evidence of dissolution and gelification is well observed, and iron sulphides are common. A third type of organic material has a round, drop-like, or oval shape and belongs to planktonic organisms. Its initial organic material is strongly transformed or replaced by dark organic compounds, probably neoplasms; these particles are red-brown and transparent, and iron sulphides are not observed. The amount of Corg in the uppermost portion of sediments accumulated in the offshore Kura River delta varies from 0.2 to 1.22%, with median values of 0.6-0.8%.
In poorly sorted sediments, Corg content ranges from 0.24 to 0.97% (average 0.69%); in silty-sandy clay, from 0.45 to 1.22% (average 0.77%); in sandy-silty clay, from 0.5 to 0.97% (average 0.67%); and in silty clay, from 0.52 to 0.95% (average 0.70%). The data demonstrate that in sediments deposited during the Caspian Sea high stand in 1929, the minimum of Corg content is localised near the mouth of the main southeastern distributary channel and coincides with the minimum of the clay fraction. At the same time, the maximum of organic matter content lies near the mouth of the eastern channel, which was inactive at that time. In sediments accumulated during the last Caspian Sea low stand in 1977, the area of the Corg minimum is attached to the mouth of the northeastern distributary, which indicates the high activity of this distributary during the fall of the Caspian Sea level. The area of the Corg minimum is also recorded around the mouth of the main channel and in the eastern part of the delta, while the maxima of Corg and of the clay fraction shift towards the basin. During the Caspian Sea high stand in 1995, the minimum of Corg content is again observed at the mouth of the main southeastern channel. The distribution of organic matter in the modern sediments of the Kura River delta thus displays a strong time dependence and reflects the progradational-retrogradational cycles of evolution of this fluvial-deltaic system.

Keywords: high and low stands, Kura River delta, South Caspian Sea, organic matter

Procedia PDF Downloads 103
25355 Multimodal Employee Attendance Management System

Authors: Khaled Mohammed

Abstract:

This paper presents novel face recognition and identification approaches for the real-time attendance management problem in large companies/factories and government institutions. The proposed system uses the Minimum Ratio (MR) approach for employee identification. Capturing authentic face variability from a sequence of video frames is used for face recognition, which makes the system robust against variability of facial features. Experimental results indicated an improvement in the performance of the proposed system over previous approaches of between 2% and 5%. In addition, it halved the processing time compared with previous techniques such as the Extreme Learning Machine (ELM) and the Multi-Scale Structural Similarity index (MS-SSIM). Finally, it achieved an accuracy of 99%.

Keywords: attendance management system, face detection and recognition, live face recognition, minimum ratio

Procedia PDF Downloads 130