Search results for: fuzzy aggregation operator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1389

279 Movement Optimization of Robotic Arm Movement Using Soft Computing

Authors: V. K. Banga

Abstract:

Robots are now playing a very promising role in industries. They are commonly used in repetitive operations or where operation by humans is either risky or not feasible. In most industrial applications, robotic arm manipulators are widely used. Robotic arm manipulators with two-link or three-link structures are commonly used due to their low degrees-of-freedom (DOF) movement. As the DOF of a robotic arm increases, complexity increases. Instrumentation plays a very important role in enabling the robot to interact with the outer environment. In this work, optimal control of the movement of the various DOFs of a robotic arm using various soft computing techniques is presented. We discuss different robotic structures with various DOF arm movements. Further emphasis is placed on the kinematics of the arm structures, i.e., forward kinematics and inverse kinematics. Trajectory planning of robotic arms using soft computing techniques demonstrates the flexibility of this approach. The performance is optimized for all possible input values, resulting in optimized movement as the output. In conclusion, soft computing plays a very important role in achieving optimized movement of a robotic arm, and it requires only limited knowledge of the system to implement.
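As a minimal illustration of the kinematics that such trajectory planners build on (a sketch only, not taken from the paper; the link lengths and target point are arbitrary assumptions), the forward and closed-form inverse kinematics of a two-link planar arm can be written as:

```python
import numpy as np

def forward_kinematics(theta1, theta2, l1=1.0, l2=0.8):
    """End-effector (x, y) of a two-link planar arm (illustrative link lengths)."""
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y, l1=1.0, l2=0.8):
    """Closed-form (elbow-down) inverse kinematics for the same arm."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)   # guard against numerical overshoot at the workspace boundary
    theta2 = np.arccos(c2)
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2), l1 + l2 * np.cos(theta2))
    return theta1, theta2

# Round-trip check on an arbitrary target inside the workspace
t1, t2 = inverse_kinematics(1.2, 0.6)
print(forward_kinematics(t1, t2))   # approximately (1.2, 0.6)
```

Soft computing components (neural networks, fuzzy rules) are typically trained or tuned against such kinematic models to produce the optimized joint trajectories described above.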

Keywords: artificial intelligence, kinematics, robotic arm, neural networks, fuzzy logic

Procedia PDF Downloads 272
278 The Multiple Sclerosis condition and the Role of Varicella-zoster virus in its Progression

Authors: Sina Mahdavi, Mahdi Asghari Ozma

Abstract:

Multiple sclerosis (MS) is the most common inflammatory autoimmune disease of the central nervous system (CNS), affecting the myelination process. Complex interactions of various "environmental or infectious" factors may act as triggers in autoimmunity and disease progression. The association between viral infections, especially human Varicella-zoster virus (VZV), and MS is one potential cause that is not well understood. This study aims to summarize the available data on VZV infection in MS disease progression. For this study, the keywords "multiple sclerosis", "human Varicella-zoster virus", and "central nervous system" were searched in the PubMed, Google Scholar, SID, and MagIran databases between 2016 and 2022, and 14 articles were chosen, studied, and analyzed. Analysis of the amino acid sequences of HNRNPA1 with VZV proteins has shown a 62% amino acid sequence similarity between VZV gE and the PrLD/M9 epitope region (TNPO1 binding domain) of mutant HNRNPA1. The heterogeneous nuclear ribonucleoprotein (hnRNP) produced from HNRNPA1 is involved in the processing and transfer of mRNA and pre-mRNA. Mutant HNRNPA1 mimics gE of VZV as an antigen, which leads to autoantibody production. Mutant HNRNPA1 translocates to the cytoplasm and, after aggregation, is presented by MHC class I molecules to CD8+ cells. Antibodies and immune cells against the gE epitopes of VZV persist due to the memory immune response, causing neurodegeneration and the development of MS in genetically predisposed individuals. VZV expression during the course of MS is present in genetically predisposed individuals with the HNRNPA1 mutation, suggesting a link between VZV and MS and that this virus may play a role in the development of MS by inducing an inflammatory state. Therefore, measures to modulate VZV expression may be effective in reducing inflammatory processes in demyelinated areas of genetically predisposed MS patients.

Keywords: multiple sclerosis, varicella-zoster virus, central nervous system, autoimmunity

Procedia PDF Downloads 58
277 Review and Analysis of Parkinson's Tremor Genesis Using Mathematical Model

Authors: Pawan Kumar Gupta, Sumana Ghosh

Abstract:

Parkinson's disease (PD) is a long-term neurodegenerative movement disorder of the central nervous system with a wide range of symptoms related to the motor system. The common symptoms of PD are tremor, rigidity, bradykinesia/akinesia, and postural instability, but the clinical picture includes other motor and non-motor issues. The motor symptoms of the disease are a consequence of the death of neurons in a region of the midbrain known as the substantia nigra pars compacta, leading to a decreased level of the neurotransmitter dopamine. The cause of this neuronal death is not clearly known but involves the formation of Lewy bodies, an abnormal aggregation or clumping of the protein alpha-synuclein in the neurons. Unfortunately, there is no cure for PD, and the management of this disease is challenging. Therefore, it is critical for a patient to be diagnosed at early stages. A limited choice of drugs is available to improve the symptoms, but these become less and less effective over time. Apart from that, with rapid growth in the field of science and technology, other methods such as multi-area brain stimulation are used to treat patients. In order to develop advanced techniques and to support drug development for treating PD patients, an accurate mathematical model is needed to explain the underlying relationship between dopamine secretion in the brain and hand tremors. There has been a lot of effort in the past few decades on modeling PD tremors and treatment effects from a computational point of view. These models can effectively save time as well as the cost of drug development for the pharmaceutical industry and be helpful for selecting appropriate treatment mechanisms among all possible options. In this review paper, an effort is made to investigate studies on PD modeling and analysis and to highlight some of the key advances in the field over the past decades, with a discussion of the current challenges.

Keywords: Parkinson's disease, deep brain stimulation, tremor, modeling

Procedia PDF Downloads 123
276 DGA Data Interpretation Using Extension Theory for Power Transformer Diagnostics

Authors: O. P. Rahi, Manoj Kumar

Abstract:

Power transformers are essential and expensive equipment in electrical power systems. Dissolved gas analysis (DGA) is one of the most useful techniques to detect incipient faults in power transformers. However, the identification of the fault by conventional methods is not always an easy task due to the variability of gas data and operational variables. In this paper, an extension theory based power transformer fault diagnosis method is presented. Extension theory tries to solve contradiction and incompatibility problems. This paper first briefly introduces the basic concept of matter-element theory, establishes the matter-element models for the three-ratio method, and then briefly discusses extension set theory. A detailed analysis is carried out on the extended relation function (ERF) adopted in this paper for transformer fault diagnosis, and the detailed diagnosis steps are given. Simulation shows that the proposed method can overcome drawbacks of the conventional three-ratio method, such as cases with no matching code and failure to diagnose multiple faults, and it enhances diagnostic accuracy.
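As a hedged sketch of the extension-theory step (one common form of the dependent function over matter-element domains; the exact ERF and the ratio domains used in the paper may differ, and the values below are hypothetical):

```python
import numpy as np

def rho(x, a, b):
    """Extension distance of point x from the interval <a, b>."""
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def dependent_degree(x, classical, neighborhood):
    """One common form of the dependent (relation) function K(x);
    the ERF variant adopted in the paper may differ."""
    a0, b0 = classical
    a, b = neighborhood
    r0, r = rho(x, a0, b0), rho(x, a, b)
    if r == r0:                       # degenerate case: equal distance to both domains
        return -r0
    return -r0 / (b0 - a0) if r0 <= 0 else r0 / (r - r0)

# Hypothetical matter-element model: classical domains of the three gas ratios
# (CH4/H2, C2H2/C2H4, C2H4/C2H6) for two illustrative fault classes.
faults = {
    "thermal":   [(1.0, 3.0), (0.0, 0.1), (1.0, 3.0)],
    "discharge": [(0.1, 1.0), (0.1, 3.0), (0.0, 1.0)],
}
neighborhood = [(0.0, 10.0), (0.0, 10.0), (0.0, 10.0)]   # assumed joint domain
ratios = [2.1, 0.05, 1.8]                                # measured sample (illustrative)

scores = {f: np.mean([dependent_degree(x, c, n)
                      for x, c, n in zip(ratios, dom, neighborhood)])
          for f, dom in faults.items()}
print(max(scores, key=scores.get), scores)               # highest degree -> diagnosed fault
```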

Keywords: DGA, extension theory, ERF, power transformers, fault diagnosis, fuzzy logic

Procedia PDF Downloads 385
275 Maximizing Profit Using Optimal Control by Exploiting the Flexibility in Thermal Power Plants

Authors: Daud Mustafa Minhas, Raja Rehan Khalid, Georg Frey

Abstract:

Next-generation power systems are equipped with abundantly available renewable energy resources (RES). During their low-cost operation, the price of electricity drops significantly and sometimes becomes negative. It therefore seems advisable not to operate traditional power plants (e.g., coal power plants) during such periods in order to reduce losses. In fact, this is not a cost-effective solution, because these power plants incur shutdown and startup costs. Moreover, they require a certain time to shut down and also need a sufficient pause before starting up again, increasing inefficiency in the whole power network. Hence, there is always a trade-off between avoiding negative electricity prices and the startup costs of power plants. To exploit this trade-off and to increase the profit of a power plant, two main contributions are made: 1) introducing retrofit technology for a state-of-the-art coal power plant; 2) proposing an optimal control strategy for the power plant that exploits different flexibility features. These flexibility features include improving the ramp rate of the power plant, reducing the startup time, and lowering the minimum load. The control strategy is formulated as a mixed-integer linear program (MILP), ensuring an optimal solution to the profit maximization problem. Extensive comparisons are made between pre- and post-retrofit coal power plants with the same efficiencies under different electricity price scenarios. The study concludes that, if the power plant must remain in the market (providing services), greater flexibility translates into a direct economic advantage for the plant operator.
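A minimal sketch of this kind of commitment/dispatch MILP, assuming the PuLP package and using illustrative prices, costs, and flexibility parameters that are not taken from the paper:

```python
import pulp

# Illustrative data: 6 hourly prices [EUR/MWh], some negative.
prices = [40, 10, -5, -8, 15, 55]
p_min, p_max = 150.0, 400.0      # MW, assumed minimum/maximum load
fuel_cost = 20.0                 # EUR/MWh, assumed marginal cost
startup_cost = 5000.0            # EUR per start, assumed
ramp = 120.0                     # MW/h, assumed ramp limit

m = pulp.LpProblem("flexible_plant_profit", pulp.LpMaximize)
u = [pulp.LpVariable(f"on_{t}", cat="Binary") for t in range(len(prices))]
s = [pulp.LpVariable(f"start_{t}", cat="Binary") for t in range(len(prices))]
p = [pulp.LpVariable(f"p_{t}", lowBound=0) for t in range(len(prices))]

# Objective: market revenue minus fuel and start-up costs
m += pulp.lpSum((prices[t] - fuel_cost) * p[t] - startup_cost * s[t]
                for t in range(len(prices)))

for t in range(len(prices)):
    m += p[t] <= p_max * u[t]                               # output only when committed
    m += p[t] >= p_min * u[t]                               # minimum load when committed
    if t > 0:
        m += p[t] - p[t - 1] <= ramp + p_max * s[t]         # ramp-up (relaxed at start-up)
        m += p[t - 1] - p[t] <= ramp + p_max * (1 - u[t])   # ramp-down (relaxed at shutdown)
        m += s[t] >= u[t] - u[t - 1]                        # start-up indicator
    else:
        m += s[t] >= u[t]                                   # assume the plant is initially off

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([v.value() for v in p], pulp.value(m.objective))
```

Lowering `p_min`, reducing `startup_cost`, or raising `ramp` in such a model mimics the retrofit flexibility features and directly changes the optimal commitment and profit.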

Keywords: discrete optimization, power plant flexibility, profit maximization, unit commitment model

Procedia PDF Downloads 123
274 A Deep Learning Approach to Real Time and Robust Vehicular Traffic Prediction

Authors: Bikis Muhammed, Sehra Sedigh Sarvestani, Ali R. Hurson, Lasanthi Gamage

Abstract:

Vehicular traffic events have highly complex spatial correlations and temporal interdependencies and are also influenced by environmental events such as weather conditions. To capture these spatial and temporal interdependencies and make more realistic vehicular traffic predictions, graph neural network (GNN) based traffic prediction models have been extensively utilized due to their capability of capturing non-Euclidean spatial correlation very effectively. However, most existing GNN-based traffic prediction models have limitations in learning complex and dynamic spatial and temporal patterns due to the following missing factors. First, most GNN-based traffic prediction models have used static distance or sometimes haversine distance between spatially separated traffic observations to estimate spatial correlation. Secondly, most GNN-based traffic prediction models have not incorporated environmental events that have a major impact on normal traffic states. Finally, most GNN-based models do not use an attention mechanism to focus only on important traffic observations. The objective of this paper is to study and make real-time vehicular traffic predictions while incorporating the effect of weather conditions. To fill the previously mentioned gaps, our prediction model uses the real-time driving distance between sensors to build a distance matrix, or spatial adjacency matrix, and capture spatial correlation. In addition, our prediction model considers the effect of six types of weather conditions and has an attention mechanism in both spatial and temporal data aggregation. Our prediction model efficiently captures the spatial and temporal correlation between traffic events; it relies on graph attention network (GAT) and bidirectional long short-term memory (Bi-LSTM) layers plus attention and is called GAT-BILSTMA.
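Since the architecture is described only at a high level, the following is a hedged sketch (assuming PyTorch; the layer sizes, feature counts, and simplified single-head GAT are illustrative assumptions, not the authors' GAT-BILSTMA implementation) of combining graph attention, a Bi-LSTM, and temporal attention:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGATLayer(nn.Module):
    """Single-head graph attention over a dense adjacency mask (simplified GAT)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):                 # x: (N, in_dim), adj: (N, N) 0/1 mask
        h = self.W(x)
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs).squeeze(-1))        # attention logits (N, N)
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)                   # attention over neighbours
        return torch.relu(alpha @ h)

class GATBiLSTM(nn.Module):
    """Sketch of a GAT + Bi-LSTM + temporal-attention traffic predictor."""
    def __init__(self, n_feats, hidden=32):
        super().__init__()
        self.gat = TinyGATLayer(n_feats, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, x_seq, adj):             # x_seq: (T, N, n_feats)
        spatial = torch.stack([self.gat(x_t, adj) for x_t in x_seq])   # (T, N, hidden)
        h, _ = self.lstm(spatial.permute(1, 0, 2))                     # (N, T, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)                         # temporal attention
        context = (w * h).sum(dim=1)
        return self.out(context).squeeze(-1)                           # one prediction per sensor

# Illustrative shapes: 12 past steps, 5 sensors, speed + 6 weather indicators = 7 features
model = GATBiLSTM(n_feats=7)
x = torch.randn(12, 5, 7)
adj = (torch.rand(5, 5) > 0.5).float().fill_diagonal_(1.0)   # driving-distance-based mask
print(model(x, adj).shape)                                   # torch.Size([5])
```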

Keywords: deep learning, real time prediction, GAT, Bi-LSTM, attention

Procedia PDF Downloads 54
273 Geospatial Modeling of Dry Snow Avalanches Distribution Using Geographic Information Systems and Remote Sensing: A Case Study of the Šar Mountains (Balkan Peninsula)

Authors: Uroš Durlević, Ivan Novković, Nina Čegar, Stefanija Stojković

Abstract:

Snow avalanches represent one of the most dangerous natural phenomena in mountain regions worldwide. Material damage and human casualties caused by snow avalanches can be very significant. In this study, using geographic information systems and remote sensing, the natural conditions of the Šar Mountains were analyzed for geospatial modeling of dry slab avalanches. For this purpose, the Fuzzy Analytic Hierarchy Process (FAHP) multi-criteria analysis method was used, within which fifteen environmental criteria were analyzed and evaluated. Based on the analyses and results, it was determined that a significant area of the Šar Mountains is very highly susceptible to the occurrence of dry slab avalanches. The obtained data can be of significant use to local governments, emergency services, and other institutions that deal with natural disasters at the local level. To the best of our knowledge, this is one of the first studies in the Republic of Serbia to use the FAHP method for geospatial modeling of dry slab avalanches.
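A minimal sketch of the FAHP weighting step using Buckley's geometric-mean method with triangular fuzzy numbers; the three criteria and the pairwise judgments below are placeholders, not the fifteen criteria evaluated in the study:

```python
import numpy as np

# Triangular fuzzy pairwise comparison matrix for 3 illustrative criteria
# (e.g., slope, terrain roughness, land cover); judgments are assumptions.
tfn = np.array([
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
])   # shape (3, 3, 3): criteria x criteria x (l, m, u)

# Buckley's method: fuzzy geometric mean per row, normalize, then defuzzify
geo = np.prod(tfn, axis=1) ** (1.0 / tfn.shape[0])   # (3, 3) fuzzy geometric means
total = geo.sum(axis=0)                              # fuzzy sum over criteria
fuzzy_w = geo / total[::-1]                          # divide (l, m, u) by reversed (u, m, l)
crisp_w = fuzzy_w.mean(axis=1)                       # centre-of-area defuzzification
weights = crisp_w / crisp_w.sum()
print(weights)                                       # criterion weights summing to 1
```

The resulting weights are then applied to the reclassified criterion layers in the GIS overlay to produce the susceptibility map.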

Keywords: GIS, FAHP, Šar Mountains, snow avalanches, environmental protection

Procedia PDF Downloads 71
272 Robotic Assisted vs Traditional Laparoscopic Partial Nephrectomy Peri-Operative Outcomes: A Comparative Single Surgeon Study

Authors: Gerard Bray, Derek Mao, Arya Bahadori, Sachinka Ranasinghe

Abstract:

The EAU currently recommends partial nephrectomy as the preferred management for localised cT1 renal tumours, irrespective of surgical approach. With the advent of robotic assisted partial nephrectomy, there is growing evidence that warm ischaemia time may be reduced compared to the traditional laparoscopic approach. There are still no clear differences between the two approaches with regard to other peri-operative and oncological outcomes. A current limitation in the field is the lack of single-surgeon series comparing the two approaches, as other studies often include multiple operators of different experience levels. To the best of our knowledge, this study is the first single-surgeon series comparing peri-operative outcomes of robotic assisted and laparoscopic partial nephrectomy. The current study aims to reduce intra-operator bias while maintaining an adequate sample size to assess the differences in outcomes between the two approaches. We retrospectively compared patient demographics, peri-operative outcomes, and renal function derangements of all partial nephrectomies undertaken by a single surgeon with experience in both laparoscopic and robotic surgery. Warm ischaemia time, length of stay, and acute renal function deterioration were all significantly reduced with robotic partial nephrectomy compared to laparoscopic partial nephrectomy. This study highlights the benefits of robotic partial nephrectomy. Further prospective studies with larger sample sizes would be valuable additions to the current literature.

Keywords: partial nephrectomy, robotic assisted partial nephrectomy, warm ischaemia time, peri-operative outcomes

Procedia PDF Downloads 122
271 Autohydrolysis Treatment of Olive Cake to Extract Fructose and Sucrose

Authors: G. Blázquez, A. Gálvez-Pérez, M. Calero, I. Iáñez-Rodríguez, M. A. Martín-Lara, A. Pérez

Abstract:

The production of olive oil is considered one of the most important agri-food industries. However, some of the by-products generated in the process are potential pollutants and cause environmental problems. Consequently, the management of these by-products is currently considered a challenge for the olive oil industry. In this context, several technologies have been developed and tested, and in this sense, the autohydrolysis of these by-products could be considered a promising technique. Therefore, this study focused on autohydrolysis treatments of a solid residue from the olive oil industry denominated olive cake, which comes from olive pomace extraction with hexane. Firstly, a water washing was carried out to eliminate the water-soluble compounds. Then, an experimental design was developed for the autohydrolysis experiments carried out in a hydrothermal pressure reactor. The studied variables were temperature (30, 60 and 90 ºC) and time (30, 60 and 90 min). Aliquots of the obtained liquid fractions were analysed by HPLC to determine their fructose and sucrose contents. Finally, the obtained sugar contents and the yields of the different experiments were fitted to a neuro-fuzzy model and to a polynomial model.
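As a hedged illustration of the polynomial-model step only (the neuro-fuzzy fit is not shown), a second-order response surface can be fitted to the temperature-time design by least squares; the yield values below are placeholders, not the study's data:

```python
import numpy as np

# Illustrative 3x3 design: temperature (C) and time (min) with hypothetical
# fructose yields (g/100 g) as placeholders.
T, t = np.meshgrid([30, 60, 90], [30, 60, 90])
T, t = T.ravel(), t.ravel()
y = np.array([1.1, 1.6, 1.9, 1.4, 2.2, 2.6, 1.5, 2.4, 3.0])

# Second-order polynomial model: y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)              # fitted response-surface coefficients
print(X @ coef - y)      # residuals of the fit
```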

Keywords: ANFIS, olive cake, polyols, saccharides

Procedia PDF Downloads 127
270 Radiation Protection Assessment of the Emission of a d-t Neutron Generator: Simulations with MCNP Code and Experimental Measurements in Different Operating Conditions

Authors: G. M. Contessa, L. Lepore, G. Gandolfo, C. Poggi, N. Cherubini, R. Remetti, S. Sandri

Abstract:

Practical guidelines are provided in this work for the safe use of a portable d-t Thermo Scientific MP-320 neutron generator producing pulsed 14.1 MeV neutron beams. The neutron generator's emission was tested experimentally and reproduced by the MCNPX Monte Carlo code. Simulations were particularly accurate; even the generator's internal components were reproduced on the basis of ad hoc collected X-ray radiographic images. Measurement campaigns were conducted under different standard experimental conditions using an LB 6411 neutron detector properly calibrated at three different energies, and simulated and experimental data were compared. In order to estimate the dose to the operator as a function of the operating conditions and the energy spectrum, the most appropriate value of the conversion factor between neutron fluence and ambient dose equivalent has been identified, taking into account both direct and scattered components. The results of the simulations show that, in real situations, when there is no information about the neutron spectrum at the point where the dose has to be evaluated, it is possible - and in any case conservative - to convert the measured value of the count rate by means of the conversion factor corresponding to 14 MeV energy. This outcome has general validity when using this type of generator, enabling a more accurate design of experimental activities in different setups. The increasingly widespread use of this type of device for industrial and medical applications makes the results of this work of interest in different situations, especially as a support for the definition of appropriate radiation protection procedures and, in general, for risk analysis.
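A minimal sketch of the count-rate-to-dose conversion described here; all numerical values are placeholders, and in practice the detector response and the 14 MeV coefficient must come from the calibration certificate and ICRP fluence-to-dose tables:

```python
def ambient_dose_rate(count_rate_cps, response_cps_per_flux, h_star_per_fluence_pSv_cm2):
    """Return the H*(10) rate in uSv/h from a detector count rate in counts per second."""
    fluence_rate = count_rate_cps / response_cps_per_flux        # n / (cm^2 s)
    dose_rate_pSv_s = fluence_rate * h_star_per_fluence_pSv_cm2  # pSv / s
    return dose_rate_pSv_s * 3600 * 1e-6                         # uSv / h

# Example call with placeholder numbers only:
print(ambient_dose_rate(count_rate_cps=120.0,
                        response_cps_per_flux=3.0,          # counts per unit fluence rate (placeholder)
                        h_star_per_fluence_pSv_cm2=500.0))  # conservative 14 MeV coefficient (placeholder)
```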

Keywords: instrumentation and monitoring, management of radiological safety, measurement of individual dose, radiation protection of workers

Procedia PDF Downloads 113
269 Microencapsulation of Tuna Oil and Mentha Piperita Oil Mixture using Different Combinations of Wall Materials with Whey Protein Isolate

Authors: Amr Mohamed Bakry Ibrahim, Yingzhou Ni, Hao Cheng, Li Liang

Abstract:

Tuna oil (an omega-3 oil) has become increasingly popular in the last ten years, because it is considered one of the treasures of food, with many beneficial health effects for humans. Nevertheless, the susceptibility of omega-3 oils to oxidative deterioration, resulting in the formation of oxidation products, in addition to organoleptic problems including "fishy" flavors, has presented obstacles to the more widespread use of tuna oils in the food industry. This study sought to evaluate the potential impact of Mentha piperita oil on the physicochemical characteristics and oxidative stability of tuna oil microcapsules formed by spray drying, using partial substitution of whey protein isolate with carboxymethyl cellulose and pullulan. The emulsions before the drying process were characterized in terms of size, ζ-potential, viscosity, and surface tension. Confocal laser scanning microscopy showed that all emulsions had spherical droplets and a homogeneous distribution without any visible particle aggregation. The microcapsules obtained after spray drying were characterized in terms of microencapsulation efficiency, water activity, color, bulk density, flowability, surface morphology, and oxidative stability. The microcapsules were spherical in shape and had low water activity (0.11-0.23 aw). The microcapsules containing both tuna oil and Mentha piperita oil were smaller than the others, and the addition of pullulan into the wall materials improved the morphology of the microcapsules. The microencapsulation efficiency of the powdered oil ranged from 90% to 94%. Using Mentha piperita oil in the microencapsulation of tuna oil enhanced the oxidative stability with whey protein isolate alone or with carboxymethyl cellulose or pullulan as wall materials, resulting in improved storage stability and masking of the fishy odor. Therefore, tuna-Mentha piperita oil mixture microcapsules are foreseen for use in food industry applications.

Keywords: Mentha piperita oil, microcapsule, tuna oil, whey protein isolate

Procedia PDF Downloads 331
268 Examining the Effects of College Education on Democratic Attitudes in China: A Regression Discontinuity Analysis

Authors: Gang Wang

Abstract:

Education is widely believed to be a prerequisite for democracy and civil society, but the causal link between education and outcome variables is usually hard to identify. This study applies a fuzzy regression discontinuity design to examine the effects of college education on democratic attitudes in the Chinese context. In the analysis, treatment assignment is determined by students' college entry years and is thus naturally determined by subjects' ages. Using a sample of Chinese college students collected in Beijing in 2009, this study finds that college education actually reduces undergraduates' motivation for political development in China but promotes political loyalty to the authoritarian government. Further hypothesis tests explain these interesting findings from two perspectives. The first is related to the complexity of politics: as college students progress over time, they increasingly realize the complexity of political reform in China's authoritarian regime and prefer to stay away from politics. The second is related to students' career opportunities: as students approach graduation, they are immersed in job hunting and have a reduced interest in political freedom.
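A minimal sketch of a fuzzy regression discontinuity estimate via two-stage least squares on simulated data (the running variable, take-up rates, and effect size below are invented for illustration and are not the survey data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: running variable = cohort relative to the college-entry cutoff,
# fuzzy treatment = actual college attendance, instrument = being above the cutoff.
n = 2000
running = rng.uniform(-5, 5, n)
above = (running >= 0).astype(float)
treat = (rng.uniform(size=n) < 0.2 + 0.5 * above).astype(float)          # imperfect take-up
y = 1.0 + 0.3 * running - 0.8 * treat + rng.normal(scale=1.0, size=n)    # true effect = -0.8

def two_sls(y, treat, instrument, controls):
    """Manual 2SLS: instrument the treatment, then regress the outcome on fitted values."""
    Z = np.column_stack([np.ones_like(instrument), instrument, controls])
    X = np.column_stack([np.ones_like(treat), treat, controls])
    fitted = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]     # first stage (projection on Z)
    beta = np.linalg.lstsq(fitted, y, rcond=None)[0]      # second stage
    return beta

beta = two_sls(y, treat, above, controls=running)
print(beta[1])    # estimated effect of college attendance at the cutoff, close to -0.8
```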

Keywords: China, college education, democratic attitudes, regression discontinuity

Procedia PDF Downloads 335
267 Organizational Climate being Knowledge Sharing Oriented: A Fuzzy-Set Analysis

Authors: Paulo Lopes Henriques, Carla Curado

Abstract:

According to the literature, knowledge sharing behaviors are influenced by organizational values and structures, namely organizational climate. The manuscript examines the antecedents of a knowledge sharing oriented organizational climate. In line with theoretical expectations, the study adopts the following explanatory conditions: knowledge sharing costs, knowledge sharing incentives, perceptions of knowledge sharing contributing to performance, and tenure. The study compares results for two groups of firms: non-digital (firms without an intranet) vs. digital (firms with an intranet). The paper applies the fsQCA technique to analyze the data, using the fsQCA 2.5 software (www.fsqca.com) and testing several conditional arguments to explain the outcome variable. The main results strengthen claims on the relevance of the contribution of knowledge sharing to performance. Secondly, the evidence brings tenure - an explanatory condition associated with organizational memory - to the spotlight. The study provides an original contribution not previously addressed in the literature, since it identifies the sufficient condition sets for a knowledge sharing oriented organizational climate using fsQCA, which is, to our knowledge, a novel application of the technique.
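A minimal sketch of the core fsQCA sufficiency test (set-theoretic consistency and coverage); the calibrated membership scores below are invented placeholders, not the study's data:

```python
import numpy as np

def consistency(x, y):
    """Consistency of 'X is sufficient for Y' with fuzzy-set memberships in [0, 1]."""
    return np.minimum(x, y).sum() / x.sum()

def coverage(x, y):
    """Coverage of outcome Y by condition X."""
    return np.minimum(x, y).sum() / y.sum()

# Illustrative calibrated memberships: a conjunction of 'perceived contribution to
# performance' AND 'tenure' against the knowledge-sharing-oriented climate outcome.
contribution = np.array([0.9, 0.8, 0.2, 0.6, 0.95, 0.1])
tenure       = np.array([0.7, 0.9, 0.3, 0.4, 0.80, 0.2])
climate      = np.array([0.8, 0.9, 0.1, 0.5, 0.90, 0.2])

condition = np.minimum(contribution, tenure)   # fuzzy AND of the two conditions
print(consistency(condition, climate), coverage(condition, climate))
```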

Keywords: fsQCA, knowledge sharing oriented organizational climate, knowledge sharing costs, knowledge sharing incentives

Procedia PDF Downloads 305
266 A Review on Comparative Analysis of Path Planning and Collision Avoidance Algorithms

Authors: Divya Agarwal, Pushpendra S. Bharti

Abstract:

Autonomous mobile robots (AMRs) are expected to serve as smart tools for operations in every automation industry. Path planning and obstacle avoidance are the backbone of AMRs, as robots have to reach their goal location while avoiding obstacles and traversing an optimized path defined according to criteria such as distance, time, or energy. Path planning can be classified into global and local path planning, where environmental information is known and unknown/partially known, respectively. A number of sensors are used for data collection. A number of algorithms, such as artificial potential field (APF), rapidly exploring random trees (RRT), bidirectional RRT, fuzzy approaches, Pure Pursuit, the A* algorithm, vector field histogram (VFH) and modified local path planning algorithms, have been used over the last three decades for path planning and obstacle avoidance for AMRs. This paper attempts to review some of the path planning and obstacle avoidance algorithms used in the field of AMRs. The review includes a comparative analysis of simulations and mathematical computations of path planning and obstacle avoidance algorithms using MATLAB 2018a. From the review, it can be concluded that different algorithms may complete the same task (i.e., with a different set of instructions) in more or less time, space, effort, etc.
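As one concrete example from this family of algorithms, a minimal A* planner on an occupancy grid is sketched below in Python (the review itself uses MATLAB 2018a); the grid and start/goal cells are arbitrary:

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle); Manhattan heuristic."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    counter = itertools.count()                      # tie-breaker for the priority queue
    open_set = [(h(start), 0, next(counter), start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, _, node, parent = heapq.heappop(open_set)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:                             # reconstruct the path back to start
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, next(counter), (nr, nc), node))
    return None                                      # no path found

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```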

Keywords: path planning, obstacle avoidance, autonomous mobile robots, algorithms

Procedia PDF Downloads 209
265 Feature Weighting Comparison Based on Clustering Centers in the Detection of Diabetic Retinopathy

Authors: Kemal Polat

Abstract:

In this paper, three feature weighting methods have been used to improve the classification performance for diabetic retinopathy (DR). To classify diabetic retinopathy, features extracted from the output of several retinal image processing algorithms, such as image-level, lesion-specific and anatomical components, have been used and fed into the classifier algorithms. The dataset used in this study has been taken from the University of California, Irvine (UCI) machine learning repository. Feature weighting methods, including fuzzy c-means clustering based feature weighting, subtractive clustering based feature weighting, and Gaussian mixture clustering based feature weighting, have been used and compared with each other in the classification of DR. After feature weighting, five different classifier algorithms comprising multi-layer perceptron (MLP), k-nearest neighbor (k-NN), decision tree, support vector machine (SVM), and Naïve Bayes have been used. The hybrid method based on the combination of subtractive clustering based feature weighting and the decision tree classifier obtained a classification accuracy of 100% in the screening of DR. These results demonstrate that the proposed hybrid scheme is very promising for medical data set classification.
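A hedged sketch of one plausible clustering-centre-based feature weighting variant (assuming scikit-learn is available; the exact fuzzy c-means, subtractive, and Gaussian-mixture schemes used in the paper may differ, and the toy data are invented):

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_center_weights(X, n_clusters=2):
    """One plausible clustering-centre-based weighting: scale each feature by the
    ratio of its overall mean to the mean of the cluster centres for that feature."""
    centers = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X).cluster_centers_
    return X.mean(axis=0) / centers.mean(axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) + rng.integers(0, 2, size=(100, 1)) * 2.0   # toy two-class data
w = cluster_center_weights(X)
X_weighted = X * w        # weighted features then fed to MLP, k-NN, SVM, decision tree, etc.
print(w)
```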

Keywords: machine learning, data weighting, classification, data mining

Procedia PDF Downloads 310
264 Toxicological Effects of Heavy Metals; Copper, Lead and Chromium on Brain and Liver Tissue of Grass Carp (Ctenopharyngodon idella)

Authors: Ahsan Khan, Nazish Shah, Muhammad Salman

Abstract:

The present study deals with the toxicological effects of copper, lead and chromium on the brain and liver tissues of grass carp (Ctenopharyngodon idella). The experimental fish had an average length of 8.5 ± 5.5 cm and an average weight of 9.5 ± 6.5 g. Grass carp were exposed to the lethal concentration (LC₁₅) of copper, lead and chromium for 24, 48, 72 and 96 hours, respectively. The LC₁₅ for copper was 1.5, 1.4, 1.2 and 1 mgL⁻¹; similarly, the LC₁₅ of lead was 250, 235, 225 and 216 mgL⁻¹, while the LC₁₅ for chromium was 25.5, 22.5, 20 and 18 mgL⁻¹, respectively. During exposure to the various doses of heavy metals, the grass carp showed some behavioral changes. In the initial stages of the experiment, rapid movements and gulping of air were observed, and several times the fish tried to jump to escape from the toxic medium. In addition, the accumulation of heavy metals in different tissues of grass carp, particularly in the liver and brain tissues, was observed. Lead was highly accumulated in brain tissue after exposure for 24 and 48 hours, and in liver tissue after exposure for 72 and 96 hours. Chromium was highly accumulated in liver tissue after exposure for 24 hours, while its accumulation was high in brain tissue after exposure for 48, 72 and 96 hours. Similarly, copper accumulation was high in brain tissue after exposure for 48 and 96 hours, and high in liver tissue after exposure for 24 and 72 hours. Comparatively, the maximum accumulation in the brain and liver tissues of grass carp was found for lead, followed by chromium and copper. Furthermore, the accumulation of these metals caused many abnormalities, such as gliosis, cell destruction, changes in cell shape and shrinkage of cells in brain tissue, while in liver tissue aggregation of hepatocytes, widened spaces between cells and cell destruction were observed. These experiments and observations can be useful for monitoring aquatic pollution and the quality of the aquatic environment.

Keywords: brain, grass carp, liver, lethal concentration, toxicity

Procedia PDF Downloads 136
263 A Computational Framework for Decoding Hierarchical Interlocking Structures with SL Blocks

Authors: Yuxi Liu, Boris Belousov, Mehrzad Esmaeili Charkhab, Oliver Tessmann

Abstract:

This paper presents a computational solution for designing reconfigurable interlocking structures that are fully assembled with SL blocks. Formed by S-shaped and L-shaped tetracubes, the SL block is a specific type of interlocking puzzle. Analogous to molecular self-assembly, the aggregation of SL blocks builds a reversible, hierarchical and discrete system in which a single module can be replicated numerous times to compose semi-interlocking components that further align, wrap, and braid around each other to form complex high-order aggregations. These aggregations can be disassembled and reassembled, responding dynamically to design inputs and changes with a unique capacity for reconfiguration. To use these aggregations as architectural structures, we developed computational tools that automate the configuration of SL blocks based on architectural design objectives. There are three critical phases in our work. First, we revisit the hierarchy of the SL block system and devise a top-down design strategy. From this, we propose two key questions: 1) How to translate 3D polyominoes into an SL block assembly? 2) How to decompose the desired voxelized shapes into a set of 3D polyominoes with interlocking joints? These two questions can be considered as the Hamiltonian path problem and the 3D polyomino tiling problem, respectively. We then derive our solution to each of them based on two methods. The first method is to construct the optimal closed path from an undirected graph built from the voxelized shape and translate the node sequence of the resulting path into the assembly sequence of SL blocks. The second approach describes the interlocking relationships of 3D polyominoes as a joint connection graph. Lastly, we formulate the desired shapes and leverage our methods to achieve their reconfiguration at different levels. We show that our computational strategy facilitates the efficient design of hierarchical interlocking structures with a self-replicating geometric module.
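A brute-force sketch of the Hamiltonian-path subproblem on the voxel adjacency graph (the paper constructs an optimal closed path; this illustrative version simply searches for any open path on a tiny assumed shape):

```python
from itertools import product

def voxel_graph(voxels):
    """Adjacency between unit voxels that share a face (6-connectivity)."""
    vox = set(voxels)
    nbrs = {v: [] for v in vox}
    for (x, y, z) in vox:
        for dx, dy, dz in [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
            w = (x + dx, y + dy, z + dz)
            if w in vox:
                nbrs[(x, y, z)].append(w)
    return nbrs

def hamiltonian_path(nbrs, path=None):
    """Backtracking search for a path visiting every voxel exactly once
    (exponential in the worst case; fine for small aggregations)."""
    if path is None:
        for start in nbrs:                       # try every start voxel
            found = hamiltonian_path(nbrs, [start])
            if found:
                return found
        return None
    if len(path) == len(nbrs):
        return path
    for nxt in nbrs[path[-1]]:
        if nxt not in path:
            found = hamiltonian_path(nbrs, path + [nxt])
            if found:
                return found
    return None

# A 2x2x2 voxel cube as the target shape (illustrative)
shape = list(product(range(2), repeat=3))
order = hamiltonian_path(voxel_graph(shape))
print(order)       # visiting order -> candidate assembly sequence of SL blocks
```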

Keywords: computational design, SL-blocks, 3D polyomino puzzle, combinatorial problem

Procedia PDF Downloads 113
262 Functional Neural Network for Decision Processing: A Racing Network of Programmable Neurons Where the Operating Model Is the Network Itself

Authors: Frederic Jumelle, Kelvin So, Didan Deng

Abstract:

In this paper, we introduce a model of artificial general intelligence (AGI), the functional neural network (FNN), for modeling human decision-making processes. The FNN is composed of multiple artificial mirror neurons (AMN) racing in the network. Each AMN has a similar structure, programmed independently by the users, and is composed of an intention wheel, a motor core, and a sensory core racing at a specific velocity. The mathematics of the node's formulation and the racing mechanism of multiple nodes in the network are discussed, and the group decision process with fuzzy logic, together with the transformation of these conceptual methods into practical methods for simulation and operations, is developed. Finally, we describe some possible future research directions in the fields of finance, education, and medicine, including the opportunity to design an intelligent learning agent with applications in AGI. We believe that the FNN has promising potential to transform the way we compute decision-making and to lead to a new generation of AI chips for seamless human-machine interactions (HMI).

Keywords: neural computing, human machine interaction, artificial general intelligence, decision processing

Procedia PDF Downloads 104
261 Numerical and Sensitivity Analysis of Modeling the Newcastle Disease Dynamics

Authors: Nurudeen Oluwasola Lasisi

Abstract:

Newcastle disease is a highly contagious disease of birds caused by a paramyxovirus. In this paper, we present novel quarantine-adjusted incidence and linear incidence Newcastle disease model equations and consider the dynamics of transmission and control of Newcastle disease. The existence and uniqueness of the solutions were obtained. The existence of the disease-free equilibrium was shown, and the model threshold parameter was examined using the next-generation operator method. A sensitivity analysis was carried out in order to identify the most sensitive parameters of disease transmission. This revealed that as the parameters β, ω, and Λ increase while other parameters are kept constant, the effective reproduction number R_ev increases, implying that these parameters increase the endemicity of the infection. Moreover, when the parameters μ, ε, γ, δ_1, and α increase while other parameters are kept constant, the effective reproduction number R_ev decreases, implying that these parameters decrease the endemicity of the infection, as they have negative sensitivity indices. The analytical results were numerically verified by the Differential Transformation Method (DTM), and quantitative views of the model equations were showcased. We established that as the contact rate (β) increases, the effective reproduction number R_ev increases; as the effectiveness of drug usage increases, R_ev decreases; and as the number of quarantined individuals decreases, R_ev decreases. The results of the simulations showed that the number of infected individuals increases as the number of susceptible individuals approaches zero, and the number of vaccinated individuals increases when the number of infected individuals decreases, with a simultaneous increase in the number of recovered individuals.
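A hedged numerical sketch of a quarantine-adjusted compartmental model of this kind, assuming SciPy; the compartments, equations, and parameter values below are an illustrative stand-in, not the paper's exact model:

```python
import numpy as np
from scipy.integrate import odeint

# Parameter names mirror the abstract; values are illustrative only.
Lambda, beta, omega = 0.6, 0.35, 0.05     # recruitment, contact rate, waning vaccination
mu, eps, gamma = 0.02, 0.3, 0.15          # natural death, quarantine rate, recovery
delta1, alpha = 0.1, 0.4                  # disease-induced death, vaccination rate

def rhs(y, t):
    S, V, I, Q, R = y
    dS = Lambda - beta * S * I - (mu + alpha) * S + omega * V
    dV = alpha * S - (mu + omega) * V
    dI = beta * S * I - (mu + delta1 + eps + gamma) * I
    dQ = eps * I - (mu + delta1 + gamma) * Q
    dR = gamma * (I + Q) - mu * R
    return [dS, dV, dI, dQ, dR]

t = np.linspace(0, 200, 2001)
sol = odeint(rhs, [0.9, 0.0, 0.1, 0.0, 0.0], t)

# Effective reproduction number for this sketch (next-generation style):
R_ev = beta * sol[:, 0] / (mu + delta1 + eps + gamma)
print(sol[-1], R_ev[0], R_ev[-1])
```

Increasing `beta` in such a sketch raises R_ev, while increasing `eps` (quarantine) or `gamma` (recovery/treatment) lowers it, matching the qualitative sensitivity statements above.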

Keywords: disease-free equilibrium, effective reproduction number, endemicity, Newcastle disease model, numerical analysis, sensitivity analysis

Procedia PDF Downloads 27
260 Information Literacy among Faculty and Students of Medical Colleges of Haryana, Punjab and Chandigarh

Authors: Sanjeev Sharma, Suman Lata

Abstract:

With the availability of diverse printed and electronic literature and websites on medical and health-related information, it is difficult for a medical professional to get the information he or she seeks in the shortest possible time. Information literacy is the solution to these problems, and it is therefore recognized as an important aspect of medical education. In the present study, an attempt has been made to assess the information literacy skills of the faculty and students at medical colleges of Haryana, Punjab and Chandigarh. The scope of the study was confined to 12 selected medical colleges of the three states (Haryana, Punjab, and Chandigarh). The findings of the study were based on the data collected through 1018 questionnaires filled in by respondents from the medical colleges. It was found that online medical websites (such as WebMD, eMedicine and Mayo Clinic) were frequently used by 63.43% of the respondents of Chandigarh, which is slightly more than Haryana (61%) and Punjab (55.65%). Likewise, 30.86% of the respondents of Chandigarh, 27.41% of Haryana and 27.05% of Punjab were familiar with controlled vocabulary tools; 25.14% of the respondents of Chandigarh, 23.80% of Punjab and 23.17% of Haryana were familiar with Boolean operators; 33.05% of the respondents of Punjab, 28.19% of Haryana and 25.14% of Chandigarh were familiar with the use and importance of keywords while searching an electronic database; and 51.43% of the respondents of Chandigarh, 44.52% of Punjab and 36.29% of Haryana were able to make effective use of the retrieved information. For accessing information in electronic format, 47.74% of the respondents rated their skills as high, while the majority of respondents (76.13%) were unfamiliar with the basic search technique, i.e., Boolean operators, used for searching information in an online database. On the basis of the findings, it was suggested that a comprehensive training program based on medical professionals' information needs should be organized frequently. Furthermore, it was also suggested that information literacy may be included as a subject in the health science curriculum so as to make medical professionals information literate and independent lifelong learners.

Keywords: information, information literacy, medical professionals, medical colleges

Procedia PDF Downloads 130
259 Existence of Minimal and Maximal Mild Solutions for Non-Local in Time Subdiffusion Equations of Neutral Type

Authors: Jorge Gonzalez-Camus

Abstract:

In this work, the existence of at least one minimal and one maximal mild solution is proved for the Cauchy problem for a fractional evolution equation of neutral type involving a general kernel. The equation involves an operator A generating a resolvent family and an integral resolvent family on a Banach space X, and a kernel belonging to a large class, which covers many relevant cases from physics applications, in particular the important case of time-fractional evolution equations of neutral type. The main tools used in this work were the Kuratowski measure of noncompactness, fixed point theorems (specifically of Darbo type), and an iterative method of lower and upper solutions based on an order in X induced by a normal cone P. Initially, the equation is posed as a Cauchy problem involving a fractional derivative in the Caputo sense. Then the equivalent integral version is formulated, and defining a convenient functional using the theory of resolvent families and verifying the hypotheses of the Darbo-type fixed point theorem gives the existence of a mild solution for the initial problem. Furthermore, the existence of minimal and maximal mild solutions was proved through an iterative method of lower and upper solutions, using the Arzelà-Ascoli theorem and Gronwall's inequality. Finally, the case of the derivative in the Caputo sense is recovered.
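For orientation, one generic form of such a neutral-type Caputo problem with a general kernel is sketched below; this is an illustrative formulation consistent with the stated setting, not necessarily the exact equation studied in the paper:

```latex
\begin{aligned}
{}^{C}D_t^{\alpha}\!\bigl[u(t)-g\bigl(t,u(t)\bigr)\bigr]
  &= A\,u(t) + \int_0^{t} k(t-s)\,A\,u(s)\,\mathrm{d}s + f\bigl(t,u(t)\bigr),
  \qquad t\in(0,T],\ 0<\alpha<1,\\
u(0) &= u_0 \in X,
\end{aligned}
```

where A generates the resolvent and integral resolvent families on X, k is the general kernel, and g and f are the neutral and forcing terms ordered by the normal cone P.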

Keywords: fractional evolution equations, Volterra integral equations, minimal and maximal mild solutions, neutral type equations, non-local in time equations

Procedia PDF Downloads 149
258 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data provide insights into the molecular differences between tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and would benefit from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow the benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest does not have a gold standard but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods and allow for the consolidation of conclusions.

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 263
257 Effect of Humic Acids on Agricultural Soil Structure and Stability and Its Implication on Soil Quality

Authors: Omkar Gaonkar, Indumathi Nambi, Suresh G. Kumar

Abstract:

The functional and morphological aspects of soil structure determine soil quality. The dispersion of colloidal soil particles, especially the clay fraction, and the rupture of soil aggregates, both of which play an important role in soil structure development, lead to degradation of soil quality. The main objective of this work was to determine the effect of the behaviour of soil colloids on agricultural soil structure and quality. The effect of a commercial humic acid and of the soil's natural organic matter on the electrical and structural properties of the soil colloids was also studied. An agricultural soil belonging to the sandy loam texture class from the northern part of India was considered in this study. In order to understand the changes in soil quality in the presence and absence of humic acids, the soil fabric and structure were analyzed by X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy and scanning electron microscopy (SEM). The electrical properties of natural soil colloids in aqueous suspensions were assessed by zeta potential measurements at varying pH values, with and without humic acids. The influence of natural organic matter was analyzed by oxidizing the natural soil organic matter with hydrogen peroxide. The zeta potential of the soil colloids was found to be negative over the pH range studied. The results indicated that the hydrogen peroxide treatment leads to deflocculation of colloidal soil particles. In addition, the humic acids undergo effective adsorption onto the soil surface, imparting a more negative zeta potential to the colloidal soil particles. The soil hydrophilicity decreased in the presence of humic acids, which was confirmed by surface free energy determination. Thus, it can be concluded that the presence of humic acids altered the soil fabric and structure, thereby affecting soil quality. This study assumes significance in understanding soil aggregation and the interactions at the soil solid-liquid interface.

Keywords: humic acids, natural organic matter, zeta potential, soil quality

Procedia PDF Downloads 220
256 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis

Authors: Alexander A. Tokmakov

Abstract:

Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the protein synthesis systems that are currently used, and the factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases, which makes comprehensive bioinformatics analysis possible and allows the identification of multiple features associated with successful cell-free expression. Here, we describe an approach aimed at identifying multiple physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and highlight the major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data, calculation and prediction of multiple properties of the expressed amino acid sequences, correlation of the individual properties with the expression scores, and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, presence of signal sequences, etc., are mostly related to protein solubility, whereas others, such as protein length, number of disulfide bonds, content of secondary structure, etc., affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide a plethora of important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
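A minimal sketch of the "compute sequence properties, then correlate with expression outcomes" step, assuming Biopython and SciPy are available; the sequences and binary expression scores below are invented placeholders:

```python
import numpy as np
from scipy.stats import spearmanr
from Bio.SeqUtils.ProtParam import ProteinAnalysis   # Biopython

# Illustrative sequences with hypothetical expression scores (1 = expressed, 0 = failed);
# a real analysis would use thousands of entries from cell-free expression databases.
records = [
    ("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", 1),
    ("MCCHHCCWWRRCCPPLLKKCC", 0),
    ("MSTNPKPQRKTKRNTNRRPQDVKFPGG", 1),
    ("MWWWFFFYYYLLLIIIVVVAAA", 0),
]

props, scores = [], []
for seq, score in records:
    pa = ProteinAnalysis(seq)
    props.append([len(seq), pa.isoelectric_point(), pa.gravy(), pa.instability_index()])
    scores.append(score)

props = np.array(props)
for name, column in zip(["length", "pI", "GRAVY", "instability"], props.T):
    rho, p = spearmanr(column, scores)
    print(f"{name}: rho={rho:+.2f} (p={p:.2f})")   # sign and significance of each correlation
```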

Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins

Procedia PDF Downloads 391
255 Brine Waste from Seawater Desalination in Malaysia

Authors: Cynthia Mahadi, Norhafezah Kasmuri

Abstract:

Water scarcity is a growing issue. As a result, seawater is being considered as a virtually limitless source of fresh water through the desalination process, which is likely to help address the worldwide water crisis, including in Malaysia. This study aims to propose best management practices for controlling brine discharge in Malaysia by comparing environmental regulations on brine waste management in other countries. A survey was then distributed to the public to acquire further information about their level of awareness of the harmful effects of brine waste and to find out their perspective on the proposed solutions, so as to ensure the effectiveness of the measures. As a result, it was revealed that Malaysia still lacks regulations regarding the disposal of brine waste. Thus, this study puts forth recommendations based on practices in other nations. It suggests that the government and Malaysia's environmental regulatory body should regulate brine waste disposal under the Environmental Quality Act 1974 and should add the construction of desalination plants to Schedule 1 of prescribed activities. Because desalination plants can harm the environment during both construction and operation, every proposal for the construction of a desalination plant should involve the submission of an environmental impact assessment (EIA).

Keywords: seawater desalination, brine waste, environmental impact assessment, fuzzy Delphi method

Procedia PDF Downloads 55
254 Iris Recognition Based on the Low Order Norms of Gradient Components

Authors: Iman A. Saad, Loay E. George

Abstract:

The iris pattern is an important biological feature of the human body; it has become a very active topic in both research and practical applications. In this paper, an algorithm is proposed for iris recognition, and a simple, efficient and fast method is introduced to extract a set of discriminatory features using a first-order gradient operator applied to grayscale images. The gradient-based features are robust, to a certain extent, against the variations that may occur in the contrast or brightness of iris image samples; such variations mostly occur due to lighting differences and camera changes. First, the iris region is located; after that, it is remapped to a rectangular area of size 360x60 pixels. Also, a new method is proposed for detecting eyelash and eyelid points; it depends on a statistical analysis of the image to mark the eyelash and eyelid pixels as noise points. In order to cover feature localization (variation), the rectangular iris image is partitioned into N overlapped sub-images (blocks); then, from each block, a set of average directional gradient density values is calculated to be used as a texture feature vector. The applied gradient operators are taken along the horizontal, vertical and diagonal directions, and the low-order norms of the gradient components are used to establish the feature vector. A Euclidean distance based classifier was used as the matching metric for determining the degree of similarity between the feature vector extracted from the tested iris image and the template feature vectors stored in the database. Experimental tests were performed using 2639 iris images from the CASIA V4-Interval database; the attained recognition accuracy reached 99.92%.
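A hedged sketch of the block-wise directional-gradient feature idea and the Euclidean matching step (the block layout, overlap, and L1-style averaging are illustrative assumptions, not the paper's exact parameters):

```python
import numpy as np

def gradient_block_features(iris_rect, blocks=(6, 6), overlap=0.5):
    """Average directional-gradient densities per overlapped block of the
    normalized 60x360 iris strip (a sketch of the feature-extraction idea)."""
    H, W = iris_rect.shape
    gy, gx = np.gradient(iris_rect.astype(float))
    d1, d2 = (gx + gy) / np.sqrt(2), (gx - gy) / np.sqrt(2)   # diagonal components
    bh, bw = H // blocks[0], W // blocks[1]
    step_h, step_w = max(1, int(bh * (1 - overlap))), max(1, int(bw * (1 - overlap)))
    feats = []
    for r in range(0, H - bh + 1, step_h):
        for c in range(0, W - bw + 1, step_w):
            for g in (gx, gy, d1, d2):
                block = np.abs(g[r:r + bh, c:c + bw])
                feats.append(block.mean())        # low-order (L1-style) norm per direction
    return np.array(feats)

def match(feat, templates):
    """Nearest template under the Euclidean metric."""
    d = [np.linalg.norm(feat - t) for t in templates]
    return int(np.argmin(d)), min(d)

rng = np.random.default_rng(0)
probe = gradient_block_features(rng.integers(0, 256, size=(60, 360)))
gallery = [gradient_block_features(rng.integers(0, 256, size=(60, 360))) for _ in range(3)]
print(match(probe, gallery))
```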

Keywords: iris recognition, contrast stretching, gradient features, texture features, Euclidean metric

Procedia PDF Downloads 311
253 Kýklos Dimensional Geometry: Entity Specific Core Measurement System

Authors: Steven D. P Moore

Abstract:

A novel method referred to as Kýklos (Ky) dimensional geometry is proposed as an entity-specific core geometric dimensional measurement system. Ky geometric measures can construct scaled multi-dimensional models using regular and irregular sets in IRn. This entity-specific derived geometric measurement system shares similar fractal methods, in which a 'fractal transformation operator' is applied to a set S to produce a union of N copies. The Kýklos inputs use 1D geometry as a core measure. One-dimensional inputs include the radius interval of a circle/sphere or the semiminor/semimajor axis intervals of an ellipse or spheroid. These geometric inputs have finite values that can be measured in SI distance units. The outputs for each interval are divided and subdivided 1D subcomponents whose union equals the interval geometry/length. Setting a limit on the subdivision iterations creates a finite value for each 1D subcomponent. The uniqueness of this method is captured by allowing the simplest 1D inputs to define entity-specific subclass geometric core measurements that can also be used to derive length measures. Current methodologies for celestial-based measurement of time, as defined within SI units, fit within this methodology, thus combining spatial and temporal features into geometric core measures. The novel Ky method discussed here offers geometric measures to construct scaled multi-dimensional structures, even models. Ky classes proposed for consideration include celestial and even subatomic classes. The application of this offers incredible possibilities, for example, geometric architecture that can represent scaled celestial models incorporating planets (spheroids) and celestial motion (elliptical orbits).

Keywords: Kyklos, geometry, measurement, celestial, dimension

Procedia PDF Downloads 151
252 Investigating the Effect of the Pedagogical Agent on Visual Attention in Attention Deficit Hyperactivity Disorder Students

Authors: Nasrin Mohammadhasani, Rosa Angela Fabio

Abstract:

Attention to relevant information is the key element for learning. However, students with Attention Deficit Hyperactivity Disorder (ADHD) have a fuzzy visual pattern that prevents them from attending to and remembering the learning subject. The present study aimed to test the hypothesis that the presence of a pedagogical agent can effectively support an ADHD learner's attention and learning outcomes in a multimedia learning environment. The learning environment was integrated with a pedagogical agent, named Koosha, acting as a social peer. This study employed a pretest-posttest experimental design with a control group. The statistical population consisted of 30 boys, aged 10-11, with ADHD, randomly assigned to learn with or without the agent in a well-designed environment for mathematics. The results suggested that the experimental and control groups showed a significant difference in participation time and in mathematics achievement. According to this research, using a pedagogical agent can enhance the learning of ADHD students by gaining and guiding their attention to the relevant information on the display, so it can be considered a social cue that provides them with cognitive support.

Keywords: attention, computer assisted instruction, multimedia learning environment, pedagogical agent

Procedia PDF Downloads 287
251 Coupling Static Multiple Light Scattering Technique With the Hansen Approach to Optimize Dispersibility and Stability of Particle Dispersions

Authors: Guillaume Lemahieu, Matthias Sentis, Giovanni Brambilla, Gérard Meunier

Abstract:

Static Multiple Light Scattering (SMLS) has been shown to be a straightforward technique for the characterization of colloidal dispersions without dilution, as multiply scattered light in backscattered and transmitted modes is directly related to the concentration and size of the scatterers present in the sample. In this respect, the use of SMLS for stability measurement of various dispersion types has already been widely described in the literature. With a view to further investigating the dispersibility of colloidal suspensions, an experimental set-up for "at-line" SMLS experiments has been developed to understand the impact of the formulation parameters on particle size and dispersibility. The SMLS experiment is performed at a high acquisition rate (up to 10 measurements per second), without dilution, and under direct agitation. Using such an experimental device, SMLS detection can be combined with the Hansen approach to optimize the dispersing and stabilizing properties of TiO₂ particles. It appears that the dispersibility and stability spheres generated are clearly separated, arguing that lower stability is not necessarily a consequence of poor dispersibility. Beyond this clarification, the combined SMLS-Hansen approach is a major step toward the optimization of the dispersibility and stability of colloidal formulations by finding solvents having the best compromise between dispersing and stabilizing properties. Such studies can be used to find better dispersion media, greener and cheaper solvents to optimize particle suspensions, reduce the content of costly stabilizing additives, or satisfy evolving product regulatory requirements in various industrial fields that use suspensions (paints & inks, coatings, cosmetics, energy).
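A minimal sketch of the Hansen side of the approach: the Hansen distance Ra and the relative energy difference (RED) against a sphere whose centre and radius would, in this combined approach, be fitted from SMLS dispersibility/stability scores (the values below are placeholders, not measured data):

```python
import numpy as np

def hansen_distance(solvent, solute):
    """Hansen distance Ra between two sets of (dD, dP, dH) parameters [MPa^0.5]."""
    dD1, dP1, dH1 = solvent
    dD2, dP2, dH2 = solute
    return np.sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

# Hypothetical centre and radius of a 'good dispersibility' sphere.
centre, R0 = (17.0, 8.0, 9.0), 6.0

candidates = {
    "solvent_A": (18.0, 6.0, 10.0),
    "solvent_B": (15.5, 10.5, 20.0),
}
for name, hsp in candidates.items():
    red = hansen_distance(centre, hsp) / R0      # RED < 1 -> inside the sphere
    print(name, round(red, 2), "inside" if red < 1 else "outside")
```

Solvents falling inside both the dispersibility and the stability spheres are the candidates offering the best compromise between dispersing and stabilizing properties.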

Keywords: dispersibility, stability, Hansen parameters, particles, solvents

Procedia PDF Downloads 80
250 GSM Based Smart Patient Monitoring System

Authors: Ayman M. Mansour

Abstract:

In this paper, we propose an intelligent system for monitoring the health conditions of patients. Monitoring the health condition of patients is a complex problem that involves different medical units and requires continuous monitoring, especially in rural areas, because of the inadequate number of available specialized physicians. The proposed system will improve patient care and drive costs down compared to the existing system in Jordan. It will be the starting point for faster and improved communication between different units in the health system in Jordan. Connecting patients and their physicians beyond hospital doors, regardless of their geographical area, is an important issue in developing the health system in Jordan. The proposed system will generate an initial diagnosis of the patient's case, which will assist and advise clinicians at the point of care. The decision is based on the patient's demographic data and laboratory test results. Using such a system, with its ability to support medical decisions, the quality of medical care in Jordan, and specifically in Tafila, is expected to improve. This will provide more accurate, effective, and reliable diagnoses and treatments, especially when physicians have insufficient knowledge.

Keywords: GSM, SMS, patient, monitoring system, fuzzy logic, multi-agent system

Procedia PDF Downloads 547