Search results for: rough sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1429


1399 Enhancement in Bactericidal Activity of Hydantoin Based Microsphere from Smooth to Rough

Authors: Rajani Kant Rai, Jayakrishnan Athipet

Abstract:

There have been several attempts to prepare polymers with antimicrobial properties by doping with various N-halamines. Hydantoins (cyclic N-halamines) are of particular importance due to their stability, rechargeable chloramide function, broad-spectrum antimicrobial action, and ability to prevent resistance in organisms. Polymerizable hydantoins are synthesized by tethering vinyl moieties to 5,5-dialkyl hydantoin, sacrificing the imide hydrogen in the molecule and thereby restricting halogen capture to the amide nitrogen alone, which results in compromised antibacterial activity. In order to increase the activity of the antimicrobial polymer, we have developed a scheme to maximize the attachment of chlorine to both the amide and imide moieties of hydantoin. A vinyl hydantoin monomer, (Z)-5-(4-((3-methylbuta-1,3-dien-2-yl)oxy)benzylidene)imidazolidine-2,4-dione (MBBID), was synthesized and copolymerized with a commercially available monomer, methyl methacrylate, by free radical polymerization. The antimicrobial activity of hydantoins is strongly dependent on their surface area; hence their activity increases when they are incorporated in microspheres or nanoparticles compared to their bulk counterparts. In this regard, smooth- and rough-surface microspheres of the vinyl monomer (MBBID) with the commercial monomer were synthesized. The oxidative chlorine content of the copolymer ranged from 1.5 to 2.45%. Further, to demonstrate the water purification potential, a thin column was packed with smooth or rough microspheres and challenged with simulated contaminated water; this exhibited a 6 log kill (total kill) of the bacteria within 20 minutes of exposure with smooth (25 mg/ml) and rough (15.0 mg/ml) microspheres.

Keywords: cyclic N-halamine, vinyl hydantoin monomer, rough surface microsphere, simulated contaminated water

Procedia PDF Downloads 145
1398 Variation of Streamwise and Vertical Turbulence Intensity in a Smooth and Rough Bed Open Channel Flow

Authors: M. Abdullah Al Faruque, Ram Balachandar

Abstract:

An experimental study with four different types of bed conditions was carried out to understand the effect of roughness in open channel flow at two different Reynolds numbers. The bed conditions include a smooth surface and three different roughness conditions which were generated using sand grains with a median diameter of 2.46 mm. The three rough conditions include a surface with distributed roughness, a surface with continuously distributed roughness and a sand bed with a permeable interface. A commercial two-component fibre-optic LDA system was used to conduct the velocity measurements. The variables of interest include the mean velocity, turbulence intensity, the correlation between the streamwise and the wall normal turbulence, Reynolds shear stress and velocity triple products. Quadrant decomposition was used to extract the magnitude of the Reynolds shear stress of the turbulent bursting events. The effect of roughness was evident throughout the flow depth. The results show that distributed roughness has the greatest roughness effect followed by the sand bed and the continuous roughness. Compared to the smooth bed, the streamwise turbulence intensity reduces but the vertical turbulence intensity increases at a location very close to the bed due to the introduction of roughness. Although the same sand grain is used to create the three different rough bed conditions, the difference in the turbulence intensity is an indication that the specific geometry of the roughness has an influence on turbulence structure.
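
The quadrant decomposition mentioned above splits the instantaneous product u'w' by the signs of the fluctuations, so that ejections (Q2) and sweeps (Q4) can be separated from outward and inward interactions. A minimal sketch of the idea, using synthetic correlated data rather than the LDA measurements:

```python
import numpy as np

def quadrant_decomposition(u, w):
    """Split the instantaneous Reynolds shear stress contribution u'w' by
    the signs of the fluctuations and return each quadrant's share of the
    mean: Q1 outward interaction, Q2 ejection, Q3 inward interaction,
    Q4 sweep."""
    up = u - u.mean()          # streamwise fluctuation u'
    wp = w - w.mean()          # wall-normal fluctuation w'
    n = len(u)
    masks = {
        "Q1": (up > 0) & (wp > 0),   # outward interaction
        "Q2": (up < 0) & (wp > 0),   # ejection
        "Q3": (up < 0) & (wp < 0),   # inward interaction
        "Q4": (up > 0) & (wp < 0),   # sweep
    }
    return {q: (up[m] * wp[m]).sum() / n for q, m in masks.items()}

# Synthetic signal with negative u'-w' correlation, as in a shear flow
rng = np.random.default_rng(0)
u = rng.normal(size=10_000)
w = -0.4 * u + rng.normal(scale=0.8, size=10_000)
contrib = quadrant_decomposition(u, w)
print({q: round(float(v), 3) for q, v in contrib.items()})
```

In a wall-bounded shear flow the Q2 and Q4 events dominate, so their (negative) contributions outweigh Q1 and Q3, and the quadrant shares sum back to the mean of u'w'.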

Keywords: open channel flow, smooth and rough bed, Reynolds number, turbulence

Procedia PDF Downloads 340
1397 A Fuzzy Nonlinear Regression Model for Interval Type-2 Fuzzy Sets

Authors: O. Poleshchuk, E. Komarov

Abstract:

This paper presents a regression model for interval type-2 fuzzy sets based on the least squares estimation technique. Unknown coefficients are assumed to be triangular fuzzy numbers. The basic idea is to determine aggregation intervals for type-1 fuzzy sets whose membership functions are the lower and upper membership functions of the interval type-2 fuzzy set. These aggregation intervals are called weighted intervals. The lower and upper membership functions of the input and output interval type-2 fuzzy sets in the developed regression models are taken to be piecewise linear functions.
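
As an illustration of the branch-wise idea, the sketch below fits ordinary least squares separately to the lower, modal and upper branches of triangular fuzzy outputs; the synthetic data and the function name are assumptions for illustration, and the paper's weighted-interval construction is more elaborate:

```python
import numpy as np

def fit_triangular_ls(x, y_low, y_mid, y_up):
    """Fit y ~ a + b*x by ordinary least squares separately for the lower,
    modal and upper branches of triangular fuzzy outputs, giving triangular
    fuzzy coefficients (lower, mode, upper) for intercept and slope."""
    X = np.column_stack([np.ones_like(x), x])
    branches = {}
    for name, y in (("low", y_low), ("mid", y_mid), ("up", y_up)):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        branches[name] = beta  # [intercept, slope] for this branch
    return branches

# Synthetic triangular outputs around the line y = 2x + 1
x = np.linspace(0.0, 10.0, 50)
coef = fit_triangular_ls(x, 2 * x + 0.5, 2 * x + 1.0, 2 * x + 1.5)
```

Because the branches here are exact lines, the recovered intercepts (0.5, 1.0, 1.5) form the triangular fuzzy intercept while all three slopes coincide at 2.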

Keywords: interval type-2 fuzzy sets, fuzzy regression, weighted interval

Procedia PDF Downloads 373
1396 Effects of Substrate Roughness on E-Cadherin Junction of Oral Keratinocytes

Authors: Sungpyo Kim, Changseok Oh, Ga-Young Lee, Hyun-Man Kim

Abstract:

The intercellular junction of keratinocytes is crucial for epithelia to build an epithelial barrier. The junctional epithelium (JE) seals the interface between tooth and gingival tissue. Keratinocytes of the JE attach to surfaces roughened by abrasion or erosion with aging. Thus, the behavior of oral keratinocytes on rough substrates may help in understanding the epithelial seal of the JE, whose major intercellular junction is the E-cadherin junction (ECJ). The present study investigated the influence of various substrate roughnesses on the development of the ECJ between normal human gingival epithelial keratinocytes, HOK-16B cells. HOK-16B cells were slower to develop the ECJ on rough substrates than on smooth substrates. Furthermore, oral keratinocytes on substrates of higher roughness were more delayed in the development of the E-cadherin junction than those on substrates of lower roughness. The delayed development of the E-cadherin junction on rough substrates was ascribed to the impaired spreading of the cells and their higher JNK activity. Cells on smooth substrates rapidly spread wide cytoplasmic extensions around the cell body. However, cells on rough substrates slowly extended narrow cytoplasmic extensions, whose number was limited due to the substrate irregularity. As these cytoplasmic extensions formed the ECJ when they met the extensions of neighboring cells, the present study demonstrated that the limited chance of contact between cytoplasmic extensions, due to their limited number and slow development, brought about the delayed development of the ECJ in oral keratinocytes on rougher substrates. Sealing between cells was not complete because only part of the cell membrane contributed to the formation of the intercellular junction between cells on substrates of higher roughness. Interestingly, inhibition of JNK activity promoted the development of the ECJ on rough substrates, the mechanism of which remains to be studied further.

Keywords: substrate roughness, E-cadherin junction, oral keratinocyte, cell spreading, JNK

Procedia PDF Downloads 383
1395 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition, employed to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS), due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
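
The fuzzy-rough dependency degree generalizes the classical rough-set dependency degree gamma(P, D) = |POS_P(D)| / |U|. A minimal sketch of the crisp special case, evaluated on a hypothetical decision table (the fuzzy-rough version replaces indiscernibility classes with fuzzy similarity relations):

```python
from collections import defaultdict

def dependency_degree(rows, attrs, decision):
    """Crisp rough-set dependency degree gamma(P, D) = |POS_P(D)| / |U|:
    the fraction of objects whose indiscernibility class under `attrs`
    is consistent (single-valued) in the decision attribute."""
    classes = defaultdict(list)
    for row in rows:
        classes[tuple(row[a] for a in attrs)].append(row[decision])
    pos = sum(len(ds) for ds in classes.values() if len(set(ds)) == 1)
    return pos / len(rows)

# Hypothetical decision table: rows 1 and 4 agree on {a, b} but disagree on d
table = [
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
    {"a": 0, "b": 0, "d": "yes"},
]
print(dependency_degree(table, ["a", "b"], "d"))  # → 0.5
```

A search strategy such as B-SFLA then looks for the smallest attribute subset whose dependency degree matches that of the full attribute set.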

Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct

Procedia PDF Downloads 225
1394 Effect of Reynolds Number on Wall-normal Turbulence Intensity in a Smooth and Rough Open Channel Using both Outer and Inner Scaling

Authors: Md Abdullah Al Faruque, Ram Balachandar

Abstract:

A sudden change of bed condition is frequent in open channel flow. A change of bed condition affects the turbulence characteristics in both the streamwise and wall-normal directions. Understanding the turbulence intensity in open channel flow is of vital importance to the modeling of sediment transport and resuspension, bed formation, entrainment, and the exchange of energy and momentum. A comprehensive study was carried out to understand the extent of the effect of Reynolds number and bed roughness on different turbulence characteristics in an open channel flow. Four different bed conditions (impervious smooth bed, impervious continuous rough bed, pervious rough sand bed, and impervious distributed roughness) and two different Reynolds numbers were adopted for this purpose. The effect of bed roughness on different turbulence characteristics is seen to be prevalent over most of the flow depth. The effect of Reynolds number on different turbulence characteristics is also evident for flow over the different beds, but its extent varies with bed condition. Although the same sand grain is used to create the different rough bed conditions, the difference in turbulence characteristics is an indication that the specific geometry of the roughness has an influence on them. Roughness increases the contribution of the extreme turbulent events, which produce very large instantaneous Reynolds shear stresses and can potentially influence sediment transport, the resuspension of pollutants from the bed and the nutrient composition, which eventually affects the sustainability of benthic organisms.

Keywords: open channel flow, Reynolds number, roughness, turbulence

Procedia PDF Downloads 400
1393 Thermo-Mechanical Processing of Armor Steel Plates

Authors: Taher El-Bitar, Maha El-Meligy, Eman El-Shenawy, Almosilhy Almosilhy, Nader Dawood

Abstract:

The steel contains 0.3% C and 0.004% B, besides Mn, Cr, Mo, and Ni. The alloy was processed using a 20-ton capacity electric arc furnace (EAF) and then refined in a ladle furnace (LF). The liquid steel was cast as rectangular ingots. A dilatation test showed the critical transformation temperatures Ac1, Ac3, Ms and Mf to be 716, 835, 356, and 218 °C, respectively. The ingots were austenitized and soaked, and then rough rolled to thin slabs of 80 mm thickness. The thin slabs were then reheated and soaked for finish rolling to 6.0 mm thick plates. During rough rolling, the roll force increases as a result of rolling at temperatures below the recrystallization temperature. During finish rolling, however, the steel initially exhibits continuous static recrystallization, after which it shows strain hardening due to the fall in temperature. It was concluded that the steel plates were successfully heat treated by quenching and tempering at 250 ºC for 20 min.

Keywords: armor steel, austenitizing, critical transformation temperatures (CTTs), dilatation curve, martensite, quenching, rough and finish rolling processes, soaking, tempering, thermo-mechanical processing

Procedia PDF Downloads 347
1392 Impact of Surface Roughness on Light Absorption

Authors: V. Gareyan, Zh. Gevorkian

Abstract:

We study oblique incident light absorption in opaque media with rough surfaces. An analytical approach with modified boundary conditions taking into account the surface roughness in metallic or dielectric films has been discussed. Our approach reveals interference-linked terms that modify the absorption dependence on different characteristics. We have discussed the limits of our approach that hold valid from the visible to the microwave region. Polarization and angular dependences of roughness-induced absorption are revealed. The existence of an incident angle or a wavelength for which the absorptance of a rough surface becomes equal to that of a flat surface is predicted. Based on this phenomenon, a method of determining roughness correlation length is suggested.

Keywords: light, absorption, surface, roughness

Procedia PDF Downloads 54
1391 A Note on the Fractal Dimension of Mandelbrot Set and Julia Sets in Misiurewicz Points

Authors: O. Boussoufi, K. Lamrini Uahabi, M. Atounti

Abstract:

The main purpose of this paper is to calculate the fractal dimension of some Julia sets and of the Mandelbrot set at Misiurewicz points. Using Matlab to generate the Julia set images that correspond to the Misiurewicz points, and using fractal analysis software, we were able to obtain different measures that characterize those fractals in terms of texture and other features. We focus on the fractal dimension and on the error reported by the software. To obtain the regression equation (the log-log slope) for each image, a box-counting method is applied to the entire image, with the chosen settings available in the FracLac program. Finally, a comparison is made for each image corresponding to the region (boundary) where the Misiurewicz point is located.
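
The log-log slope referred to above can be illustrated with a minimal box-counting sketch (a simplified stand-in for FracLac's analysis, on a synthetic image rather than a Julia set). A filled square should come out with dimension close to 2:

```python
import numpy as np

def box_count_dimension(img, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal dimension as the slope of log N(s) versus
    log(1/s), where N(s) is the number of s-by-s boxes containing at
    least one set pixel of the boolean image `img`."""
    counts = []
    for s in sizes:
        h, w = img.shape
        H, W = -(-h // s) * s, -(-w // s) * s   # pad to a multiple of s
        padded = np.zeros((H, W), dtype=bool)
        padded[:h, :w] = img
        # collapse each s-by-s tile to a single "occupied?" flag
        boxes = padded.reshape(H // s, s, W // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

square = np.ones((64, 64), dtype=bool)   # a filled square is 2-dimensional
print(round(box_count_dimension(square), 2))  # → 2.0
```

A single pixel gives N(s) = 1 at every scale, i.e. slope (dimension) 0; fractal boundaries such as Julia sets fall strictly between these extremes.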

Keywords: box counting, FracLac, fractal dimension, Julia Sets, Mandelbrot Set, Misiurewicz Points

Procedia PDF Downloads 216
1390 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence

Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno

Abstract:

Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data with missing values, among which imputation is the most commonly used. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we have identified different types of missing values: missing data due to skip patterns (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and applied rough set imputation to only the GMD portion of the missing data. We have used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we have used p-values from the Wald test. To evaluate the accuracy of the prediction, we have considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed than for the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
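
One simple way to realize rough-set-style imputation is to fill a genuinely missing value with the most frequent value among the objects that are indiscernible from it on the remaining attributes. The sketch below illustrates that idea on a hypothetical table; the study's actual procedure on the MESA data may differ:

```python
def rough_impute(rows, target):
    """Fill missing (None) values of `target` with the most common value
    among rows indiscernible on all remaining attributes; leave the value
    missing if no indiscernible row has it observed."""
    attrs = [a for a in rows[0] if a != target]
    out = []
    for row in rows:
        new = dict(row)
        if new[target] is None:
            key = tuple(row[a] for a in attrs)
            seen = [r[target] for r in rows
                    if r[target] is not None
                    and tuple(r[a] for a in attrs) == key]
            if seen:
                new[target] = max(set(seen), key=seen.count)
        out.append(new)
    return out

# Hypothetical survey rows: the third respondent's answer is genuinely missing
rows = [
    {"age": "50s", "parity": "high", "ui": "yes"},
    {"age": "50s", "parity": "high", "ui": "yes"},
    {"age": "50s", "parity": "high", "ui": None},
    {"age": "30s", "parity": "low", "ui": "no"},
]
filled = rough_impute(rows, "ui")
print(filled[2]["ui"])  # → yes
```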

Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index

Procedia PDF Downloads 168
1389 Voting Representation in Social Networks Using Rough Set Techniques

Authors: Yasser F. Hassan

Abstract:

Social networking involves the use of an online platform or website that enables people to communicate, usually for a social purpose, through a variety of services, most of which are web-based and offer opportunities for people to interact over the internet, e.g. via e-mail and ‘instant messaging’. Here we analyze the voting behavior and ratings of judges on popular comments in social networks. While most of the party literature omits the electorate, this paper presents a model where elites and parties are emergent consequences of the behavior and preferences of voters. Research in artificial intelligence and psychology has provided powerful illustrations of the way in which the emergence of intelligent behavior depends on the development of representational structure. As opposed to the classical voting system (one person, one decision, one vote), a new voting system is designed in which agents with opposing preferences are endowed with a given number of votes to distribute freely among a set of issues. The paper uses ideas from machine learning, artificial intelligence and soft computing to provide a model of the development of the voting system response in a simulated agent. The modeled development process involves (simulated) processes of evolution, learning and representation development. The main value of the model is that it provides an illustration of how simple learning processes may lead to the formation of structure. We employ agent-based computer simulation to demonstrate the formation and interaction of coalitions that arise from individual voter preferences. We are interested in coordinating the local behavior of individual agents to provide an appropriate system-level behavior.

Keywords: voting system, rough sets, multi-agent, social networks, emergence, power indices

Procedia PDF Downloads 393
1388 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

Feature selection is one of the global combinatorial optimization problems in machine learning. It is concerned with removing irrelevant, noisy, and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs) which combine the genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy-controlled great deluge algorithm, in order to identify a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments are carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.
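
The record-to-record travel component accepts any move whose cost stays within a fixed deviation of the best cost found so far. A minimal sketch of the crisp (non-fuzzy) acceptance rule, demonstrated on a toy integer minimization rather than an attribute reduction instance:

```python
import random

def rrt_minimize(cost, neighbor, x0, deviation, iters=500, seed=1):
    """Record-to-Record Travel: accept any candidate whose cost is below
    the best cost found so far (the 'record') plus an allowed deviation;
    the deviation lets the search escape shallow local minima."""
    rng = random.Random(seed)
    x, best_x = x0, x0
    record = cost(x0)
    for _ in range(iters):
        cand = neighbor(x, rng)
        c = cost(cand)
        if c < record + deviation:      # near-record moves are accepted
            x = cand
            if c < record:              # strictly better: update the record
                record, best_x = c, cand
    return best_x, record

# Toy run: minimize (x - 3)^2 over the integers with +/-1 moves
best, rec = rrt_minimize(lambda x: (x - 3) ** 2,
                         lambda x, rng: x + rng.choice((-1, 1)),
                         x0=20, deviation=1.0)
```

In the attribute reduction setting the state would be a bit-vector of selected attributes and the cost would combine subset size with the (fuzzy) dependency degree; the fuzzy variant in the paper additionally adapts the deviation with fuzzy rules.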

Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm

Procedia PDF Downloads 454
1387 Diabetes Diagnosis Model Using Rough Set and K- Nearest Neighbor Classifier

Authors: Usiobaifo Agharese Rosemary, Osaseri Roseline Oghogho

Abstract:

Diabetes is a complex group of diseases with a variety of causes; it is a disorder of the body's metabolism in the digestion of carbohydrate foods. The application of machine learning in the field of medical diagnosis has been the focus of many researchers, and the use of recognition and classification models as decision support tools has helped medical experts in the diagnosis of diseases. Considering the large volume of medical data, which requires special techniques, experience, and high diagnostic skill, the application of an artificial intelligence system to assist medical personnel and enhance their efficiency and accuracy in diagnosis will be an invaluable tool. This study proposes a diabetes diagnosis model using rough sets and the K-nearest neighbor classifier algorithm. The system consists of two modules, a feature extraction module and a predictor module: rough sets are used to preprocess the attributes, while the K-nearest neighbor classifier is used to classify the given data. The dataset used for this model was taken from the University of Benin Teaching Hospital (UBTH) database. Half of the data was used for training while the other half was used for testing the system. The proposed model was able to achieve over 80% accuracy.
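
The predictor module's K-nearest neighbor rule classifies a new case by majority vote among its k closest training cases. A minimal sketch on hypothetical two-feature data (not the UBTH dataset):

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbours
    (Euclidean distance). `train` is a list of (feature_tuple, label)."""
    neighbours = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical preprocessed feature vectors with diagnosis labels
train = [((1.0, 1.0), "negative"), ((1.2, 0.8), "negative"),
         ((0.9, 1.1), "negative"), ((6.0, 6.0), "positive"),
         ((5.8, 6.2), "positive"), ((6.1, 5.9), "positive")]
print(knn_predict(train, (5.5, 6.0)))  # → positive
```

In the proposed pipeline the rough-set step would first reduce the attributes, so the feature tuples fed to the classifier contain only the reduct.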

Keywords: classifier algorithm, diabetes, diagnostic model, machine learning

Procedia PDF Downloads 336
1386 Efficient Recommendation System for Frequent and High Utility Itemsets over Incremental Datasets

Authors: J. K. Kavitha, D. Manjula, U. Kanimozhi

Abstract:

Mining frequent and high utility itemsets has gained much significance in recent years. When data arrive sporadically, incremental and interactive rule mining and utility mining approaches can be adopted to handle users' dynamic environmental needs and avoid redundancies by reusing previous data structures and mining results. Dependence on recommendation systems has risen exponentially since the advent of search engines. This paper proposes a model for building a recommendation system that suggests frequent and high utility itemsets over dynamic datasets for a cluster-based location prediction strategy, predicting users' trajectories using the Efficient Incremental Rule Mining (EIRM) algorithm and the Fast Update Utility Pattern Tree (FUUP) algorithm. Comprehensive experimental evaluations show that this scheme delivers excellent performance.
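
As background to the incremental algorithms named above, frequent itemset mining itself proceeds level-wise: count candidate itemsets, keep those meeting the minimum support, and join the survivors into larger candidates. A minimal batch (Apriori-style) sketch on hypothetical transactions, not the EIRM/FUUP algorithms themselves:

```python
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Level-wise (Apriori-style) mining: count candidate itemsets, keep
    those whose relative support meets min_support, then join survivors
    one size up until no candidates remain."""
    n = len(transactions)
    tsets = [frozenset(t) for t in transactions]
    level = list({frozenset([i]) for t in tsets for i in t})
    freq = {}
    while level:
        counts = Counter()
        for t in tsets:
            for cand in level:
                if cand <= t:
                    counts[cand] += 1
        survivors = {c: k / n for c, k in counts.items() if k / n >= min_support}
        freq.update(survivors)
        prev = list(survivors)
        # join surviving k-itemsets into candidate (k+1)-itemsets
        level = list({a | b for a in prev for b in prev
                      if len(a | b) == len(a) + 1})
    return freq

tx = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}]
result = frequent_itemsets(tx, 0.5)
print(len(result))  # → 6 (three singletons and three pairs)
```

Incremental variants such as FUUP avoid rescanning old transactions by updating tree structures built from previous mining passes; high utility mining additionally weights items by utility rather than by frequency alone.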

Keywords: data sets, recommendation system, utility item sets, frequent item sets mining

Procedia PDF Downloads 293
1385 Building 1-Well-Covered Graphs by Corona, Join, and Rooted Product of Graphs

Authors: Vadim E. Levit, Eugen Mandrescu

Abstract:

A graph is well-covered if all its maximal independent sets are of the same size. A well-covered graph is 1-well-covered if the deletion of every vertex of the graph leaves it well-covered. It is known that a graph without isolated vertices is 1-well-covered if and only if every two disjoint independent sets are included in two disjoint maximum independent sets. Well-covered graphs are related to combinatorial commutative algebra (e.g., every Cohen-Macaulay graph is well-covered, while each Gorenstein graph without isolated vertices is 1-well-covered). Our intent is to construct several infinite families of 1-well-covered graphs using the following known graph operations: corona, join, and rooted product of graphs. Adopting some known techniques used to advantage for well-covered graphs, one can prove that: if the graph G has no isolated vertices, then the corona of G and H is 1-well-covered if and only if H is a complete graph of order at least two; the join of the graphs G and H is 1-well-covered if and only if G and H have the same independence number and both are 1-well-covered; if H satisfies the property that every three pairwise disjoint independent sets are included in three pairwise disjoint maximum independent sets, then the rooted product of G and H is 1-well-covered, for every graph G. These findings show not only how to generate some more families of 1-well-covered graphs, but also that, to this aim, sometimes, one may use graphs that are not necessarily 1-well-covered.
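
The definitions above can be checked directly on small graphs by enumerating maximal independent sets. A brute-force sketch (practical only for small graphs), which confirms, for example, that the 4-cycle is well-covered but not 1-well-covered:

```python
from itertools import combinations

def maximal_independent_sets(n, edges):
    """Enumerate all maximal independent sets of a small graph, scanning
    subsets from largest to smallest so maximality is easy to check."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    def independent(s):
        return all(v not in adj[u] for u in s for v in s)
    found = []
    for r in range(n, 0, -1):
        for comb in combinations(range(n), r):
            s = set(comb)
            # maximal iff independent and not inside a larger maximal set
            if independent(s) and not any(s < t for t in found):
                found.append(s)
    return found

def is_well_covered(n, edges):
    return len({len(s) for s in maximal_independent_sets(n, edges)}) == 1

def is_1_well_covered(n, edges):
    """Well-covered, and still well-covered after deleting any vertex."""
    if not is_well_covered(n, edges):
        return False
    for v in range(n):
        relabel = {u: i for i, u in enumerate(u for u in range(n) if u != v)}
        sub = [(relabel[a], relabel[b]) for a, b in edges if v not in (a, b)]
        if not is_well_covered(n - 1, sub):
            return False
    return True

c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(is_well_covered(4, c4), is_1_well_covered(4, c4))  # → True False
```

Deleting a vertex of the 4-cycle leaves a path whose maximal independent sets have sizes 1 and 2, which is why the 1-well-covered test fails; the complete graph of order two, by contrast, passes it, consistent with the corona result stated above.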

Keywords: maximum independent set, corona, concatenation, join, well-covered graph

Procedia PDF Downloads 208
1384 LEGO Bricks and Creativity: A Comparison between Classic and Single Sets

Authors: Maheen Zia

Abstract:

Around the start of the twenty-first century, LEGO decided to diversify its product range, with the result that more specific, single-outcome sets now occupy store shelves than classic kits of fairly all-purpose bricks. Earlier, LEGO sets came with more bricks and fewer instructions. Today, more single kits are produced and sold, and these come with a strictly defined set of guidelines. If one set is used to make a car, the same bricks cannot be put together to produce any other article. Earlier, multiple bricks gave children a chance to be imaginative, think of new items and construct them (by just putting the same pieces together differently). The new products are less open-ended and offer players limited possibilities both in designing and in realizing those designs. The article reviews (in the light of existing research) how classic LEGO sets could enhance a child’s creativity in comparison with single sets, which allow a player to interact (not experiment) with the bricks.

Keywords: constructive play, creativity, LEGO, play-based learning

Procedia PDF Downloads 188
1383 Adhesive Connections in Timber: A Comparison between Rough and Smooth Wood Bonding Surfaces

Authors: Valentina Di Maria, Anton Ianakiev

Abstract:

The use of adhesive anchors in wooden constructions is an efficient technology for connecting and designing timber members in new timber structures and for rehabilitating damaged structural members of historical buildings. Due to the lack of standard regulation in this specific area of structural design, designers’ choices are still supported by test analyses that enable knowledge, and prediction, of the structural behavior of glued-in rod joints. The paper outlines an experimental research activity aimed at identifying the tensile resistance capacity of several new adhesive joint prototypes made of epoxy resin, steel bar and timber of the Oak and Douglas Fir species. The development of new adhesive connectors was carried out by using epoxy to glue stainless steel bars into pre-drilled holes, characterized by smooth or rough internal surfaces, in the timber samples. The creation of a threaded contact surface using a specific drill bit led to an improved bond between wood and epoxy. The applied changes have also reduced the cost of producing the joints. The paper presents the results of this parametric analysis and of a finite element analysis that enables identification and study of the internal stress distribution in the proposed adhesive anchors.

Keywords: glued in rod joints, adhesive anchors, timber, epoxy, rough contact surface, threaded hole shape

Procedia PDF Downloads 551
1382 The Contact Behaviors of Seals Under Combined Normal and Tangential Loading: A Multiscale Finite Element Contact Analysis

Authors: Runliang Wang, Jianhua Liu, Duo Jia, Xiaoyu Ding

Abstract:

The contact between sealing surfaces plays a vital role in guaranteeing the sealing performance of various seals. To date, analyses of sealing structures have rarely considered both the structural parameters (macroscale) and the surface roughness information (microscale) of sealing surfaces, due to the complex modeling process. Meanwhile, most of the contact analyses applied to seals have been conducted only under normal loading, which is still some distance from the real loading conditions in engineering. In this paper, a multiscale rough contact model for the cone-cone seal, which takes both the macrostructural parameters of the seal and the surface roughness information of the sealing surfaces into consideration, is established. Using the finite element method (FEM), combined normal and tangential loading is applied to the model to simulate the assembly process of the cone-cone seal. The evolution of the contact behaviors during the assembly process, such as the real contact area (RCA), the distribution of contact pressure, and the contact status, is studied in detail. The results show a non-linear relationship between the RCA and the load, which differs from the normal loading cases. In addition, the evolution of the real contact area of cone-cone seals with isotropic and anisotropic rough surfaces is compared quantitatively.

Keywords: contact mechanics, FEM, randomly rough surface, real contact area, sealing

Procedia PDF Downloads 183
1381 Diversity and Ecological Analysis of Vascular Epiphytes in Gera Wild Coffee Forest, Jimma Zone of Oromia Regional State, Ethiopia

Authors: Bedilu Tafesse

Abstract:

The diversity and ecology of vascular epiphytes were studied in Gera Forest in southwestern Ethiopia at altitudes between 1600 and 2400 m a.s.l. A total area of 4.5 ha was surveyed in coffee and non-coffee forest vegetation. Fifty sampling plots, each 30 m x 30 m (900 m2), were used for data collection. A total of 59 species of vascular epiphytes were recorded, of which 34 (59%) were holoepiphytes, two (4%) were hemiepiphytes and 22 (37%) were accidental vascular epiphytes. To study the altitudinal distribution of vascular epiphytes, altitudes were classified into higher (>2000), middle (1800-2000) and lower (1600-1800 m a.s.l.) classes. According to the Shannon-Wiener index (H′ = 3.411) of alpha diversity, the epiphyte community in the study area is of medium diversity. There was a statistically significant relationship between host bark type and epiphyte richness as determined by one-way ANOVA (p = 0.001 < 0.05). The post-hoc test shows a significant difference in vascular epiphyte richness between smooth bark and rough, flaky and corky bark (p = 0.001 < 0.05), as well as between rough and corky bark (p = 0.043 < 0.05). However, between rough and flaky bark (p = 0.753 > 0.05) and between flaky and corky bark (p = 0.854 > 0.05), no significant difference in epiphyte abundance was observed. Rough bark carried 38% of vascular epiphyte abundance, corky 26%, flaky 25%, and smooth bark only 11%. A regression test (R2 = 0.773, p = 0.0001 < 0.05) showed that the number of vascular epiphyte species and host DBH size are positively correlated. A regression test (R2 = 0.28, p = 0.0001 < 0.05) showed that the number of species and host tree height are positively correlated. A host tree preference was recorded only for Vittaria volkensii, hosted on Syzygium guineense trees. The similarity analysis indicated that Gera Forest shows the highest vascular epiphytic similarity (0.35) with Yayu Forest and the least (0.295) with Harenna Forest. It was concluded that horizontal stems and branches and large trees with rough, flaky or corky bark are more suitable for the attachment and growth of vascular epiphyte seedlings. Conservation and protection of these phorophytes are important for the survival of vascular epiphytes and increase their ecological importance.
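
The Shannon-Wiener index reported above is H' = -Σ p_i ln(p_i) over the species' relative abundances. A minimal sketch with hypothetical counts (not the Gera Forest data):

```python
import math

def shannon_wiener(counts):
    """Shannon-Wiener diversity index H' = -sum(p_i * ln(p_i)), where p_i
    is the relative abundance of species i; zero counts are skipped."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# k equally abundant species give the maximum possible value, H' = ln(k)
print(round(shannon_wiener([10, 10, 10, 10, 10]), 3))  # → 1.609
```

A community dominated by a single species scores near 0, so an H' of 3.411 over 59 species (maximum ln 59 ≈ 4.08) corresponds to the "medium" diversity reported.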

Keywords: accidental epiphytes, hemiepiphyte, holoepiphyte, phorophyte

Procedia PDF Downloads 332
1380 The Impact of the Length of Time Spent on the Street on Adjustment to Homelessness

Authors: Jakub Marek, Marie Vagnerova, Ladislav Csemy

Abstract:

Background: The length of time spent on the street influences the degree of adjustment to homelessness. Over the years spent sleeping rough, homeless people gradually lose the ability to control their lives and their return to mainstream society becomes less and less likely. Goals: The aim of the study was to discover whether and how men who have been sleeping rough for more than ten years differ from those who have been homeless for four years or less. Methods: The research was based on a narrative analysis of in-depth interviews focused on the respondent’s entire life story, i.e. their childhood, adolescence, and the period of adulthood preceding homelessness. It also asked the respondents about how they envisaged the future. The group under examination comprised 51 homeless men aged 37 – 54. The first subgroup contained 29 men who have been sleeping rough for 10 – 21 years, the second group contained 22 men who have been homeless for four years or less. Results: Men who have been sleeping rough for more than ten years had problems adapting as children. They grew up in a problematic family or in an institution and acquired only a rudimentary education. From the start they had problems at work, found it difficult to apply themselves, and found it difficult to hold down a job. They tend to have high-risk personality traits and often a personality disorder. Early in life they had problems with alcohol or drugs and their relationships were unsuccessful. If they have children, they do not look after them. They are reckless even in respect of the law and often commit crime. They usually ended up on the street in their thirties. Most of this subgroup of homeless people lack motivation and the will to make any fundamental change to their lives. They identify with the homeless community and have no other contacts. Men who have been sleeping rough for four years or less form two subgroups. There are those who had a normal childhood, attended school and found work. 
They started a family but began to drink, and as a consequence lost their family and their job. Such men end up on the street between the ages of 35 and 40. And then there are men who become homeless after the age of 40 because of an inability to cope with a difficult situation, e.g. divorce or indebtedness. They are not substance abusers and do not have a criminal record. The social services can offer such people effective assistance in returning to mainstream society, because they have not yet fully identified with the homeless community and most of them have retained the necessary abilities and skills. Conclusion: The length of time a person has been homeless is an important factor in respect of social prevention. It is clear that the longer a person is homeless, the worse their chances of being reintegrated into mainstream society.

Keywords: risk factors, homelessness, chronicity, narrative analysis

Procedia PDF Downloads 172
1379 Effect of Robot Configuration Parameters, Masses and Friction on Painlevé Paradox for a Sliding Two-Link (P-R) Robot

Authors: Hassan Mohammad Alkomy, Hesham Elkaranshawy, Ahmed Ibrahim Ashour, Khaled Tawfik Mohamed

Abstract:

For a rigid body sliding on a rough surface, a range of uncertainty or non-uniqueness of the solution can arise; this is termed the Painlevé paradox. The Painlevé paradox is the cause of the wide range of bouncing motions observed when robotic manipulators slide on rough surfaces. In this research work, the existence of the paradox zone during the sliding motion of a two-link (P-R) robotic manipulator with a unilateral constraint is investigated. A parametric study is performed to investigate the effect of friction, link-length ratio, total height, and link-mass ratio on the paradox zone.

Keywords: dynamical system, friction, multibody system, Painlevé paradox, robotic systems, sliding robots, unilateral constraint

Procedia PDF Downloads 454
1378 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a major challenge for bioinformaticians owing to the complexity of the statistical and machine learning techniques involved. The challenge is compounded when the data sets contain missing values, which happens regularly, because most of these techniques cannot handle missing data. One of the most important analysis steps on a microarray data set is feature selection, which identifies the genes most strongly associated with a given disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
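The impute-then-select pipeline the abstract outlines can be sketched generically. The paper's concrete imputation technique is not specified here, so this minimal sketch assumes mean imputation and a correlation-based gene ranking; the function and parameter names are illustrative, not from the paper:

```python
import numpy as np

def impute_and_select(X, y, k=10):
    """Mean-impute missing values (NaN) per gene (column), then rank genes
    by absolute Pearson correlation with the class label.
    Illustrative sketch of the impute-then-select pipeline."""
    X = X.astype(float).copy()
    # Impute each gene with the mean of its observed values
    col_means = np.nanmean(X, axis=0)
    nan_rows, nan_cols = np.where(np.isnan(X))
    X[nan_rows, nan_cols] = col_means[nan_cols]
    # Score each gene by |correlation| with the label
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    scores = np.abs(Xc.T @ yc) / np.where(denom == 0, 1.0, denom)
    top = np.argsort(scores)[::-1][:k]  # indices of the k best genes
    return X, top, scores
```

Constant (uninformative) genes get a score of zero, and imputation happens before scoring so that rows with missing entries still contribute to the ranking.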

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 574
1377 The Analysis of Split Graphs in Social Networks Based on the k-Cardinality Assignment Problem

Authors: Ivan Belik

Abstract:

In terms of social networks, split graphs correspond to a variety of interpersonal and intergroup relations. In this paper we analyse the interaction between cliques (socially strong and trusting groups) and independent sets (fragmented and non-connected groups of people) as the basic components of any split graph. Based on the semi-Lagrangean relaxation of the k-cardinality assignment problem, we show how to minimize the socially risky interactions between the cliques and the independent sets within the social network.
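As background to the clique/independent-set decomposition, recognizing a split graph from its degree sequence can be sketched using the standard Hammer-Simeone criterion; this is textbook graph theory, not the paper's semi-Lagrangean relaxation method:

```python
def is_split_graph(degrees):
    """Hammer-Simeone criterion: with degrees sorted non-increasingly,
    d1 >= ... >= dn, let m = max{i : d_i >= i - 1}. The graph is split
    (vertex set partitions into a clique and an independent set) iff
    sum_{i<=m} d_i == m*(m-1) + sum_{i>m} d_i."""
    d = sorted(degrees, reverse=True)
    n = len(d)
    m = max(i for i in range(1, n + 1) if d[i - 1] >= i - 1)
    return sum(d[:m]) == m * (m - 1) + sum(d[m:])
```

For example, the path a-b-c (degrees 1, 2, 1) is split: clique {a, b}, independent set {c}. The 4-cycle (degrees 2, 2, 2, 2) is not.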

Keywords: cliques, independent sets, k-cardinality assignment, social networks, split graphs

Procedia PDF Downloads 320
1376 A Method for Quantitative Assessment of the Dependencies between Input Signals and Output Indicators in Production Systems

Authors: Maciej Zaręba, Sławomir Lasota

Abstract:

Knowing the degree of dependency between sets of input signals and selected sets of indicators that measure a production system's effectiveness is of great importance in industry. This paper introduces the SELM method, which enables the selection of the sets of input signals that most strongly affect a selected subset of indicators measuring the effectiveness of a production system. For a defined set of output indicators, the method quantifies the impact of the input signals gathered by the production system's continuous monitoring.

Keywords: manufacturing operation management, signal relationship, continuous monitoring, production systems

Procedia PDF Downloads 119
1375 Perception of Tactile Stimuli in Children with Autism Spectrum Disorder

Authors: Kseniya Gladun

Abstract:

Tactile stimulation of the dorsal side of the wrist can have a strong impact on our attitude toward physical objects, both pleasant and unpleasant. This study explored different aspects of tactile perception to investigate atypical touch sensitivity in children with autism spectrum disorder (ASD). The study included 40 children with ASD and 40 healthy children aged 5 to 9 years. We recorded rsEEG (sampling rate of 250 Hz) for 20 min using an "Encephalan" EEG amplifier (Medicom MTD, Taganrog, Russian Federation) with 19 AgCl electrodes placed according to the International 10–20 System. The electrodes placed on the left and right mastoids served as joint references under unipolar montage. EEG was recorded from 19 sites: frontal (Fp1-Fp2; F3-F4), anterior temporal (T3-T4), posterior temporal (T5-T6), parietal (P3-P4), and occipital (O1-O2). Subjects were passively touched with 4 types of tactile stimuli on the left wrist, presented at a velocity of about 3–5 cm per second. The stimulus materials and procedure were chosen to be the most "pleasant," "rough," "prickly," and "recognizable": a soft cosmetic brush ("pleasant"), a rough shoe brush ("rough"), a Wartenberg pinwheel roller ("prickly"), and, for the cognitive tactile stimulation, letters traced by finger (usually the patient's name) ("recognizable"). To designate stimulus onset and offset, we marked the moments when the touch began and ended; the stimulation was manual, so synchronization was not precise enough for event-related measures. EEG epochs were cleaned of eye movements with an ICA-based algorithm in the EEGLAB plugin for MatLab 7.11.0 (MathWorks Inc.). Muscle artifacts were removed by manual data inspection. The response to tactile stimuli differed significantly between children with ASD and healthy children, and also depended on the type of tactile stimulus and the severity of ASD. 
The amplitude of the alpha rhythm in the parietal region increased in response to the pleasant stimulus only; no amplitude differences were observed for the other stimulus types ("rough," "prickly," "recognizable"). The correlation dimension D2 was higher in healthy children than in children with ASD (main effect, ANOVA). In the ASD group, D2 was lower for pleasant and unpleasant stimuli compared to the background in the right parietal area. Hilbert transform analysis revealed changes in the frequency of the theta rhythm only for the rough tactile stimulation, compared with healthy participants, and only in the right parietal area. Children with autism spectrum disorder and healthy children thus responded to tactile stimulation differently, with specific frequency distributions of the alpha and theta bands in the right parietal area. Our data support the hypothesis that rsEEG may serve as a sensitive index of the altered neural activity caused by ASD. Children with autism have difficulty distinguishing the emotional stimuli ("pleasant," "rough," "prickly," and "recognizable").
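The Hilbert-transform step named in the abstract, estimating instantaneous frequency from the analytic signal, can be sketched with NumPy alone. This is a generic sketch, not the authors' pipeline: in practice a theta-band (4-8 Hz) band-pass filter would precede this step, and the sampling rate below simply matches the abstract's 250 Hz:

```python
import numpy as np

def instantaneous_frequency(signal, fs):
    """Estimate instantaneous frequency (Hz) from the analytic signal,
    built by zeroing negative frequencies in the FFT (Hilbert transform)."""
    n = len(signal)
    spectrum = np.fft.fft(signal)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spectrum * h)        # analytic signal
    phase = np.unwrap(np.angle(analytic))       # instantaneous phase
    return np.diff(phase) * fs / (2.0 * np.pi)  # phase slope -> Hz
```

A pure 6 Hz sine sampled at 250 Hz yields an instantaneous frequency of about 6 Hz throughout the record.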

Keywords: autism, tactile stimulation, Hilbert transform, pediatric electroencephalography

Procedia PDF Downloads 250
1374 Path-Tracking Controller for Tracked Mobile Robot on Rough Terrain

Authors: Toshifumi Hiramatsu, Satoshi Morita, Manuel Pencelli, Marta Niccolini, Matteo Ragaglia, Alfredo Argiolas

Abstract:

Automation technologies are needed in agricultural fields to save labor. One of the most relevant problems in automated agriculture is controlling the robot along a predetermined path in the presence of rough or inclined terrain. Unfortunately, disturbances originating from interaction with the ground, such as slipping, make it quite difficult to achieve the required accuracy: in general, the robot must stay within 5-10 cm of the predetermined path. Moreover, the lateral velocity induced by gravity on an inclined field also contributes to slipping. In this paper, a path-tracking controller for tracked mobile robots moving on rough, inclined terrain such as a vineyard is presented. The controller is composed of a disturbance observer and an adaptive controller based on the kinematic model of the robot. The disturbance observer measures the difference between the measured and reference yaw rate and linear velocity in order to estimate slip. The adaptive controller then adapts the "virtual" parameters of the kinematic model, the Instantaneous Centers of Rotation (ICRs), and the target angular velocity reference is computed from the adapted parameters. This solution allows the effects of slip to be estimated without making the model too complex. Finally, the effectiveness of the proposed solution is tested in a simulation environment.
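The ICR-parameterized kinematic model underlying such a controller can be sketched for a tracked (skid-steer) vehicle. The sign conventions and parameter names below follow the common formulation and are assumptions, not the paper's exact controller:

```python
def track_kinematics(v_l, v_r, y_l, y_r):
    """Skid-steer kinematics with 'virtual' ICR parameters: y_l and y_r
    are the lateral ICR offsets of the left/right tracks. Without slip,
    y_l = +b/2 and y_r = -b/2 for track separation b; slip makes the
    spread |y_l - y_r| grow beyond b, which is what the adaptive
    controller tracks. Returns (forward velocity, yaw rate)."""
    omega = (v_r - v_l) / (y_l - y_r)            # yaw rate [rad/s]
    v_x = (v_r * y_l - v_l * y_r) / (y_l - y_r)  # forward velocity [m/s]
    return v_x, omega
```

For a no-slip robot with b = 0.5 m and track speeds 0.9 and 1.1 m/s, this gives the familiar differential-drive result: 1.0 m/s forward at 0.4 rad/s. A disturbance observer would compare this model prediction against measured yaw rate and velocity to estimate slip.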

Keywords: agricultural robot, autonomous control, path-tracking control, tracked mobile robot

Procedia PDF Downloads 172
1373 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. The project was the first major leap in the field of medical research, especially in genomics, and it won accolades by using a concept called Bigdata, which had earlier been used extensively to create business value. Bigdata deals with data sets, generally files of terabytes, petabytes, or exabytes in size, that were traditionally managed using spreadsheets and RDBMSs. The sheer volume of data made this process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the genetic sciences to make data processing faster and more efficient. This paper focuses on using SPARK, which is gaining momentum with the advancement of Bigdata technologies. Cloud storage is an effective medium for storing the large data sets generated by genetic research and the result sets produced by SPARK analysis.

Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop

Procedia PDF Downloads 259
1372 The Future of Reduced Instruction Set Computing and Complex Instruction Set Computing and Suggestions for Reduced Instruction Set Computing-V Development

Authors: Can Xiao, Ouanhong Jiang

Abstract:

Based on the two instruction set philosophies, complex instruction set computing (CISC) and reduced instruction set computing (RISC), processors have developed in their respective fields of "expertise". This paper summarizes research on the differences in performance and energy efficiency between CISC and RISC, striving to eliminate the influence of peripheral configuration factors. We discuss whether processor performance is determined by the instruction set or by its implementation. In addition, the rapidly developing RISC-V poses a challenge to existing models. We analyze published results, assess the impact of the instruction sets themselves, and finally make suggestions for the development of RISC-V.

Keywords: ISA, RISC-V, ARM, X86, power, energy efficiency

Procedia PDF Downloads 89
1371 REDUCER: An Architectural Design Pattern for Reducing Large and Noisy Data Sets

Authors: Apkar Salatian

Abstract:

To relieve the burden of reasoning on a point-to-point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored to particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies, or noise; and Compression, which takes the filtered data and derives trends in it. In this seminal article, we also show how REDUCER has been successfully applied to three different case studies.
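The two REDUCER stages can be sketched as a minimal pipeline. The concrete choices below, a moving-median filter and a sign-of-slope trend encoding, are illustrative assumptions; the pattern itself is agnostic to which filter and compressor are plugged in:

```python
def moving_median(data, window=3):
    """Filter stage: a moving median suppresses spikes (outliers/noise)."""
    half = window // 2
    out = []
    for i in range(len(data)):
        win = sorted(data[max(0, i - half):i + half + 1])
        out.append(win[len(win) // 2])
    return out

def compress(data, eps=0.5):
    """Compression stage: run-length encode the qualitative trend
    (increasing / steady / decreasing) of successive differences."""
    trends = []
    for a, b in zip(data, data[1:]):
        if abs(b - a) <= eps:
            label = "steady"
        else:
            label = "increasing" if b > a else "decreasing"
        if trends and trends[-1][0] == label:
            trends[-1] = (label, trends[-1][1] + 1)
        else:
            trends.append((label, 1))
    return trends
```

For example, a series with a single spike, `[1, 1, 9, 1, 2, 3, 4, 4, 4]`, filters to `[1, 1, 1, 2, 2, 3, 4, 4, 4]`, which compresses to a handful of trend segments instead of nine points.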

Keywords: design pattern, filtering, compression, architectural design

Procedia PDF Downloads 212
1370 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging because the data sets involved are very large. Only two solutions exist to tackle this issue: either the computation that requires the huge data sets is transferred to the data site, or the required data sets are transferred to the computation site. The former is impossible when the servers are storage/data servers with little or no computational capability; hence, the second scenario is explored further. During scheduling, transferring huge data sets from one site to another requires considerable network bandwidth. To mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities, and current research mainly aims to incorporate it into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and applied to data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at the various grid sites. The intelligent agents in the CE help analyze requests and build the knowledge base. Depending on the link capacity, a decision is taken whether to transfer the data sets or to partition them. The agents also predict the next request, so that the requesting site can be served with data sets in advance, which reduces data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist the decision-making process.
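The transfer-versus-partition decision driven by link capacity can be sketched as a toy rule; the deadline, chunk count, and Gb/s units below are assumptions for illustration, not the paper's policy:

```python
def schedule_transfer(size_gb, bandwidth_gbps, deadline_s, chunks=4):
    """Toy link-capacity decision: ship the whole data set if a single
    transfer fits within the deadline, otherwise partition it into
    chunks for staged/parallel transfer over the available links."""
    transfer_s = size_gb * 8.0 / bandwidth_gbps  # GB -> Gb, then seconds
    if transfer_s <= deadline_s:
        return ("transfer", 1)
    return ("partition", chunks)
```

With a 1 Gb/s link and a 120 s budget, a 10 GB set (80 s) is shipped whole, while a 100 GB set (800 s) is partitioned.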

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 394