Search results for: computational form finding
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9800


9710 Graph-Based Semantical Extractive Text Analysis

Authors: Mina Samizadeh

Abstract:

In the past few decades, there has been an explosion in the amount of data produced from various sources on different topics. The availability of this enormous amount of data makes it necessary to adopt effective computational tools to explore it, which has led to intense and growing interest in the research community in developing computational methods for processing text data. One line of study focuses on condensing text so that a higher level of understanding can be reached in a shorter time. The two important tasks here are keyword extraction and text summarization. In keyword extraction, we are interested in finding the key words of a text, which gives a sense of its general topic. In text summarization, we are interested in producing a short text that captures the important information in the document. The TextRank algorithm, an unsupervised learning method that extends PageRank (the algorithm at the base of Google's search engine for ranking pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. It can automatically extract the important parts of a text (keywords or sentences) and return them as a result. However, the algorithm neglects the semantic similarity between different parts of the text. In this work, we improve the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used on its own or as part of summary generation to overcome coverage problems.
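A minimal sketch of the graph-based ranking idea described above: sentences become graph nodes, edge weights come from a similarity measure, and a PageRank-style power iteration scores them. The tokenization, the cosine-over-word-counts similarity (a purely lexical stand-in for the semantic similarity the abstract proposes), and the parameter values are illustrative assumptions, not the authors' formulation.

```python
import re
from collections import Counter
import numpy as np

def sentence_similarity(a, b):
    """Cosine similarity over word counts (a lexical stand-in for semantic similarity)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    common = set(ca) & set(cb)
    num = sum(ca[w] * cb[w] for w in common)
    den = np.sqrt(sum(v * v for v in ca.values())) * np.sqrt(sum(v * v for v in cb.values()))
    return num / den if den else 0.0

def textrank_sentences(text, damping=0.85, iters=50):
    """Score sentences by a PageRank-style iteration on the similarity graph."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    n = len(sentences)
    W = np.array([[sentence_similarity(si, sj) if i != j else 0.0
                   for j, sj in enumerate(sentences)] for i, si in enumerate(sentences)])
    row_sums = W.sum(axis=1, keepdims=True)
    P = np.divide(W, row_sums, out=np.full_like(W, 1.0 / n), where=row_sums > 0)
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        scores = (1 - damping) / n + damping * P.T @ scores
    return sorted(zip(scores, sentences), reverse=True)

if __name__ == "__main__":
    doc = ("Keyword extraction finds the key words of a text. "
           "Text summarization produces a short text with the important information. "
           "Graph-based ranking scores sentences by their similarity to other sentences.")
    for score, sent in textrank_sentences(doc)[:2]:
        print(f"{score:.3f}  {sent}")
```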

Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis

Procedia PDF Downloads 60
9709 Exergy Model for a Solar Water Heater with Flat Plate Collector

Authors: P. Sathyakala, G. Sai Sundara Krishnan

Abstract:

The objective of this paper is to derive an exergy model for a solar water heater with a honeycomb structure in order to identify the element with the largest irreversibility in the system. This helps locate where work potential is wasted, so that the overall efficiency of the system can be improved by reducing those losses.
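For reference, the generic steady-flow exergy relations on which such an analysis typically rests (standard textbook forms, not the authors' specific honeycomb-collector model), with T_0 the dead-state (ambient) temperature and T_s the temperature at which heat is received:

```latex
% Exergy accompanying heat transfer \dot{Q} received at temperature T_s:
\dot{X}_{\text{heat}} = \dot{Q}\left(1 - \frac{T_0}{T_s}\right)
% Exergy destruction in a component follows from its entropy generation
% (Gouy--Stodola theorem); the component with the largest \dot{S}_{\text{gen}}
% is the one with the largest irreversibility:
\dot{X}_{\text{dest}} = T_0\,\dot{S}_{\text{gen}}
% Second-law (exergy) efficiency of the heater, when all unrecovered
% exergy is destroyed rather than carried away by waste streams:
\eta_{II} = \frac{\dot{X}_{\text{recovered}}}{\dot{X}_{\text{supplied}}}
          = 1 - \frac{\dot{X}_{\text{dest}}}{\dot{X}_{\text{supplied}}}
```

Evaluating the entropy generation term component by component is what singles out the element with the largest irreversibility.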

Keywords: exergy, energy balance, entropy balance, work potential, degradation, honeycomb, flat plate collector

Procedia PDF Downloads 469
9708 Rethinking Peace Journalism in Pakistan: A Critical Analysis of News Discourse on the Afghan Refugee Repatriation Conflict

Authors: Ayesha Hasan

Abstract:

This study offers unique perspectives on and analyses of peace and conflict journalism through interpretative repertoires, media frames, and critical discourse analysis. Two major English-language publications in Pakistan, representing both long- and short-form journalism, are investigated to uncover how the Afghan refugee repatriation from Pakistan in 2016-17 was framed in the Pakistani English-language media. Peace journalism focuses on concepts such as peace initiatives and peace building, finding common ground, and preventing further conflict. The study applies Jake Lynch's coding criteria to guide the critical discourse analysis and Lee and Maslog's Peace Journalism Quotient to examine the extent of peace journalism in each text. It finds that peace journalism is largely missing from the Pakistani English-language press but is represented, to an extent, in long-form print and online coverage. Two new alternative frames are also proposed. The study gives an in-depth understanding of whether and how journalists in Pakistan are covering conflicts and framing stories in ways that can be identified as peace journalism. It makes a significant contribution to the remarkably limited scholarship on peace and conflict journalism in Pakistan and extends Shabbir Hussain's work on critical pragmatic perspectives on peace journalism in Pakistan.

Keywords: Afghan refugee repatriation, critical discourse analysis, media framing, peace and conflict journalism

Procedia PDF Downloads 198
9707 Robot Spatial Reasoning via 3D Models

Authors: John Allard, Alex Rich, Iris Aguilar, Zachary Dodds

Abstract:

With this paper we present several experiences deploying novel, low-cost resources for computing with 3D spatial models. Certainly, computing with 3D models undergirds some of our field’s most important contributions to the human experience. Most often, those are contrived artifacts. This work extends that tradition by focusing on novel resources that deliver uncontrived models of a system’s current surroundings. Atop this new capability, we present several projects investigating the student-accessibility of the computational tools for reasoning about the 3D space around us. We conclude that, with current scaffolding, real-world 3D models are now an accessible and viable foundation for creative computational work.

Keywords: 3D vision, Matterport model, real-world 3D models, mathematical and computational methods

Procedia PDF Downloads 526
9706 The Development of a New Block Method for Solving Stiff ODEs

Authors: Khairil I. Othman, Mahfuzah Mahayaddin, Zarina Bibi Ibrahim

Abstract:

We develop and demonstrate a computationally efficient numerical technique to solve first-order stiff differential equations. The technique is based on a block method whereby three approximate points are calculated per step. The method's formulae for varied step sizes are presented in divided difference form. Stability regions of the formulae are briefly discussed in this paper. Numerical results show that this block method performs very well compared to existing methods.
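A rough sketch of the block-method structure, in which several new solution points are solved for simultaneously in each step. The two-point implicit pair below is built from standard quadrature formulae and solved with a generic root finder; it is not the authors' three-point, variable-step, divided-difference formulation, and its stability is weaker than purpose-built stiff solvers.

```python
import numpy as np
from scipy.optimize import fsolve

def block_step(f, t, y0, h):
    """One 2-point implicit block step: solve for y1 ~ y(t+h) and y2 ~ y(t+2h)
    simultaneously from the quadrature-based pair
        y1 = y0 + h/12 * (5 f0 + 8 f1 - f2)
        y2 = y0 + h/3  * (f0 + 4 f1 + f2)   (Simpson's rule)."""
    f0 = f(t, y0)
    def residual(u):
        y1, y2 = u
        f1, f2 = f(t + h, y1), f(t + 2 * h, y2)
        return [y1 - (y0 + h / 12 * (5 * f0 + 8 * f1 - f2)),
                y2 - (y0 + h / 3 * (f0 + 4 * f1 + f2))]
    return fsolve(residual, [y0, y0])

def solve(f, t0, y0, h, n_blocks):
    ts, ys = [t0], [y0]
    t, y = t0, y0
    for _ in range(n_blocks):
        y1, y2 = block_step(f, t, y, h)
        ts += [t + h, t + 2 * h]
        ys += [y1, y2]
        t, y = t + 2 * h, y2
    return np.array(ts), np.array(ys)

if __name__ == "__main__":
    # Mildly stiff test problem: y' = -50 (y - cos t), y(0) = 0.
    f = lambda t, y: -50.0 * (y - np.cos(t))
    ts, ys = solve(f, 0.0, 0.0, h=0.02, n_blocks=100)
    print(ts[-1], ys[-1])   # solution tracks cos(t) up to a small phase lag
```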

Keywords: block method, divided difference, stiff, computational

Procedia PDF Downloads 415
9705 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption

Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses

Abstract:

This paper addresses the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby meets the aspiration for a computational-encryption algorithmic model that can enhance the security of big data with respect to privacy, confidentiality, and availability for users. The cryptographic model applied to computation on the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, together with detailed theoretical mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of a cryptographic security algorithm for big data analytics.
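As a self-contained toy illustration of computing on ciphertexts, the snippet below uses the multiplicative homomorphism of textbook RSA; this is not fully homomorphic encryption (no additions, no bootstrapping) and is insecure as written, with tiny parameters chosen purely for demonstration.

```python
# Toy demonstration that textbook RSA multiplies "through" the encryption:
# Enc(a) * Enc(b) mod n == Enc(a * b mod n).  Real FHE schemes additionally
# support addition and need bootstrapping to keep noise under control.

def egcd(a, b):
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    g, x, _ = egcd(a, m)
    assert g == 1
    return x % m

# Tiny, insecure parameters purely for illustration.
p, q = 61, 53
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17
d = modinv(e, phi)

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 12, 9
product_of_ciphertexts = (enc(a) * enc(b)) % n     # computed "in the cloud"
print(dec(product_of_ciphertexts), (a * b) % n)    # both print 108
```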

Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme

Procedia PDF Downloads 370
9704 Explanation Conceptual Model of the Architectural Form Effect on Structures in Building Aesthetics

Authors: Fatemeh Nejati, Farah Habib, Sayeh Goudarzi

Abstract:

Architecture and structure have always been closely interrelated and ideally form a unified, coherent, and beautiful whole, yet in the contemporary era structure and architecture often proceed separately. The purpose of architecture is the art of creating form, space, and order in the service of people, while the goal of the structural engineer is the transfer of loads through the structure. This research examines the relationship between architectural form and structure from its inception to the present day in order to identify and organize that relationship. By identifying the main components of structural design as they interact with architectural form, it takes an effective step toward professional training and offers solutions to practitioners. Accordingly, after reviewing the evolution of structural and architectural coordination in various historical periods, as well as how structural form has been arrived at in different times and places, the identified components are tested and a final theory is presented. The research indicates that architectural and structural form share an aesthetic link influenced by a number of identifiable components, and that this link follows a recognizable order throughout history. The research methodology is analytic and comparative, using analytical and matrix diagrams together with library research and interviews.

Keywords: architecture, structural form, structural and architectural coordination, effective components, aesthetics

Procedia PDF Downloads 206
9703 Linking Business Owners’ Choice of Organizational Form to Appraisers’ Determination of Value: An Agency Theory Perspective

Authors: Majdi Anwar Quttainah, William Paczkowski, Ali Muhammad

Abstract:

Determining the value of privately held firms confounds those in academia as well as practitioners in the fields of appraisal, forensic accounting, and law. Divergent parties to the transfer look to apply valuation techniques that serve their own best interests. This paper explores how agency theory induces owners to choose the form of their businesses at inception and how this choice affects appraisers' valuation of the firm at the transfer of ownership.

Keywords: organizational form, agency theory, value

Procedia PDF Downloads 425
9702 Unstructured Learning: Development of Free Form Construction in Waldorf and Normative Preschools

Authors: Salam Kodsi

Abstract:

In this research, we focus on constructive play and examine its components in the context of two different educational approaches: Waldorf and normative preschools. When children are free to choose, construction is one of the forms of play they favor most, and its short-term and long-term cognitive contributions are apparent in various areas of development. The lack of empirical studies about play in Waldorf schools that address the possibility of this incidental learning inspired the need to enrich the existing body of knowledge. Ninety children (4-6 years old) from four preschools (two normative, two Waldorf) in a small, homogeneous city participated. Naturalistic observations documented the time frame, physical space, and construction materials related to free-form building; the construction processes of focal representative children; and their products. The study's main finding with respect to construction output points to a connection between educational approach and level of construction sophistication: higher levels of sophistication were found at the Waldorf preschools than at the mainstream preschools. This finding stems from differences in the level of sophistication among the older children in the two types of preschools, while practically no differences emerged among the younger children. The discussion of the findings considers the differences between the play environments in terms of time, physical space, and construction materials. The construction processes were characterized according to the stages of the design process model, the construction output was characterized according to the dimensions of the sophistication scale, and the connections between approach, age and gender, and sophistication level were examined.

Keywords: constructive play, preschool, design process model, complexity

Procedia PDF Downloads 112
9701 Flexible Design Solutions for Complex Free form Geometries Aimed to Optimize Performances and Resources Consumption

Authors: Vlad Andrei Raducanu, Mariana Lucia Angelescu, Ion Cinca, Vasile Danut Cojocaru, Doina Raducanu

Abstract:

By using smart digital tools such as generative design (GD) and digital fabrication (DF), highly topical problems of resource optimization (materials, energy, time) can be solved and free-form applications or products can be created. In these new digital technologies, materials are active, designed in response to a set of performance requirements, which imposes a total rethinking of old material practices. The article presents the key steps of the design procedure for a free-form architectural object, a column-type element with connections that produce an adaptive 3D surface, using the parametric design methodology and exploiting the properties of conventional metallic materials. In parametric design, the form of the created object or space is shaped by varying parameter values, and the relationships between forms are described by mathematical equations. Digital parametric design is based on specific procedures, such as shape grammars, Lindenmayer systems, cellular automata, genetic algorithms, or swarm intelligence, each of which has limitations that make it applicable only in certain cases. The paper presents the stages of the design process and the shape-grammar-type algorithm. The generative design process relies on two basic principles: the modeling principle and the generative principle. The generative method is based on a form-finding process that creates many 3D spatial forms, using an algorithm conceived to apply its generating logic to different input geometries. Once the algorithm is realized, it can be applied repeatedly to generate geometry for a number of different input surfaces. The generated configurations are then analyzed through a technical or aesthetic selection criterion, and finally the optimal solution is selected. The endless generative capacity of the codes and algorithms used in digital design offers various conceptual possibilities and optimal solutions for the increasing technical and environmental demands of the building industry and architecture. Constructions or spaces generated by parametric design can be specifically tuned to meet certain technical or aesthetic requirements. The proposed approach has direct applicability in sustainable architecture, offering important potential economic advantages, a flexible design (which can be changed until the end of the design process), and unique, high-performance geometric models.
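A minimal sketch of the generate-evaluate-select loop described above: a parametric rule produces many candidate forms, a selection criterion scores them, and the best candidate is kept. The column profile, the scoring criterion, and the parameter ranges are illustrative assumptions rather than the authors' shape-grammar algorithm.

```python
import numpy as np

def generate_column(radius, twist, flare, n_levels=20, n_sides=8):
    """Produce a candidate free-form column as a stack of polygonal rings
    whose radius flares with height and whose orientation twists."""
    levels = []
    for k in range(n_levels):
        z = k / (n_levels - 1)
        r = radius * (1 + flare * z ** 2)
        theta = np.linspace(0, 2 * np.pi, n_sides, endpoint=False) + twist * z
        ring = np.column_stack([r * np.cos(theta), r * np.sin(theta), np.full(n_sides, z)])
        levels.append(ring)
    return np.vstack(levels)

def score(points):
    """Toy selection criterion: reward surface variation (std of radius) while
    penalizing a large envelope (mean radius) -- stand-ins for the technical
    and aesthetic criteria mentioned in the abstract."""
    radii = np.hypot(points[:, 0], points[:, 1])
    return radii.std() - radii.mean()

rng = np.random.default_rng(0)
candidates = [dict(radius=rng.uniform(0.5, 1.5),
                   twist=rng.uniform(0, np.pi),
                   flare=rng.uniform(0, 1.0)) for _ in range(200)]
best = max(candidates, key=lambda p: score(generate_column(**p)))
print("selected parameters:", best)
```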

Keywords: parametric design, algorithmic procedures, free-form architectural object, sustainable architecture

Procedia PDF Downloads 362
9700 Quantitative Method of Measurement for the Rights and Obligations of Contracting Parties in Standard Forms of Contract in Malaysia: A Case Study

Authors: Sim Nee Ting, Lan Eng Ng

Abstract:

Standard forms of contract in Malaysia are pre-written, printed contractual documents drafted by recognised authoritative bodies in order to describe the rights and obligations of the contracting parties in all construction projects in Malaysia. Studies and form revisions are usually conducted in a relatively unsystematic and qualitative manner, yet the search for an ideal contractual document continues, and it is not clear how such qualitative findings can help in improving and re-drafting contractual documents. This study aims to quantitatively and systematically analyse and evaluate the rights and obligations of the contracting parties as stated in standard forms of contract. The Institution of Engineers Malaysia (IEM) published a new standard form of contract in 2012 with a total of 63 clauses, but the improvements and changes in the newly revised form are yet to be analysed. The IEM form is used as the case study. Every clause in the form was interpreted and analysed with respect to the parties involved, including the contractor, engineer, and employer. Adapting the matrix method and a Likert scale, the analysis was conducted on a scale from 0 to 1 with five ratings, namely "Very Unbalance", "Unbalance", "Balance", "Good Balance" and "Very Good Balance". It is hoped that this quantitative method of form study can be used for future form revisions and for drafting any new forms, so as to reduce subjectivity in studies of standard forms of contract.
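A small sketch of how clause-level balance scores on the 0-to-1 scale might be aggregated and mapped to the five ratings; the band boundaries and the example scores are assumptions, since the abstract does not state the actual cut-offs.

```python
# Hypothetical band edges for the five ratings on the 0-1 scale; the paper
# does not publish its cut-offs, so these are assumptions for illustration.
BANDS = [(0.2, "Very Unbalance"), (0.4, "Unbalance"),
         (0.6, "Balance"), (0.8, "Good Balance"), (1.0, "Very Good Balance")]

def rating(score: float) -> str:
    for upper, label in BANDS:
        if score <= upper:
            return label
    return BANDS[-1][1]

# Example: per-clause balance scores for one party (illustrative values only).
clause_scores = {"clause 1": 0.35, "clause 2": 0.72, "clause 3": 0.55}
average = sum(clause_scores.values()) / len(clause_scores)
print(f"average balance = {average:.2f} -> {rating(average)}")
```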

Keywords: contracting parties, Malaysia, obligations, quantitative measurement, rights, standard form of contract

Procedia PDF Downloads 255
9699 A Four-Step Ortho-Rectification Procedure for Geo-Referencing Video Streams from a Low-Cost UAV

Authors: B. O. Olawale, C. R. Chatwin, R. C. D. Young, P. M. Birch, F. O. Faithpraise, A. O. Olukiran

Abstract:

Ortho-rectification is the process of geometrically correcting an aerial image so that its scale is uniform. The ortho-image produced by the process is corrected for lens distortion, topographic relief, and camera tilt, and can be used to measure true distances because it is an accurate representation of the Earth's surface. Ortho-rectification and geo-referencing are essential for pinpointing the exact location of targets in video imagery acquired from the UAV platform. This can only be achieved by comparing such video imagery with an existing digital map, and such a comparison is possible only when the image is ortho-rectified into the same coordinate system as the existing map. The video image sequences from the UAV platform must therefore be geo-registered, that is, each video frame must carry the necessary camera information before the ortho-rectification process is performed. Each rectified image frame can then be mosaicked together to form a seamless image map covering the selected area, which can be compared with an existing map for geo-referencing. In this paper, we present a four-step ortho-rectification procedure for real-time geo-referencing of video data from a low-cost UAV equipped with a multi-sensor system. The basic steps of the real-time ortho-rectification are: (1) decomposition of the video stream into individual frames; (2) determination of the interior camera orientation parameters; (3) determination of the relative exterior orientation parameters of the video frames with respect to each other; (4) determination of the absolute exterior orientation parameters, using a self-calibration adjustment with the aid of a mathematical model. Each ortho-rectified video frame is then mosaicked together to produce a 2-D planimetric map, which can be compared with a well-referenced existing digital map for the purpose of geo-referencing and aerial surveillance. A test field located in Abuja, Nigeria was used to evaluate our method. Fifteen minutes of video and telemetry data were collected using the UAV, and the data were processed using the four-step ortho-rectification procedure. The results demonstrated that geometric measurements of the control field from ortho-images are more reliable than those from the original perspective photographs when used to pinpoint the exact location of targets on the video imagery acquired by the UAV. The 2-D planimetric accuracy, when compared with the six control points measured by a GPS receiver, is between 3 and 5 meters.
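A much-simplified sketch of the geo-registration step using OpenCV: features matched between a video frame and a referenced base map yield a planar homography used to warp the frame. Full ortho-rectification also uses the interior and exterior orientation parameters and terrain relief, which this planar approximation ignores; file names and parameter values are placeholders.

```python
import cv2
import numpy as np

def register_frame_to_map(frame_path, basemap_path, min_matches=10):
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    basemap = cv2.imread(basemap_path, cv2.IMREAD_GRAYSCALE)

    # Detect and describe local features in both images.
    orb = cv2.ORB_create(4000)
    kf, df = orb.detectAndCompute(frame, None)
    km, dm = orb.detectAndCompute(basemap, None)

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(df, dm), key=lambda m: m.distance)[:200]
    if len(matches) < min_matches:
        raise RuntimeError("not enough matches to register the frame")

    src = np.float32([kf[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([km[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly estimate a planar homography (ignores relief displacement).
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return cv2.warpPerspective(frame, H, basemap.shape[::-1])

# rectified = register_frame_to_map("frame_0001.png", "basemap.png")  # placeholder paths
```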

Keywords: geo-referencing, ortho-rectification, video frame, self-calibration

Procedia PDF Downloads 471
9698 GPU Accelerated Fractal Image Compression for Medical Imaging in Parallel Computing Platform

Authors: Md. Enamul Haque, Abdullah Al Kaisan, Mahmudur R. Saniat, Aminur Rahman

Abstract:

In this paper, we have implemented both sequential and parallel versions of fractal image compression algorithms using the CUDA (Compute Unified Device Architecture) programming model, parallelizing the program on the graphics processing unit for medical images, which tend to be highly self-similar. There are several improvements in the implementation of the algorithm as well. Fractal image compression is based on the self-similarity of an image, meaning that the majority of its regions resemble one another. We take this opportunity to implement the compression algorithm and study its behaviour in both parallel and sequential implementations. Fractal compression offers a high compression ratio and a resolution-independent (dimensionless) scheme. The scheme has two parts, encoding and decoding: encoding is computationally very expensive, whereas decoding is comparatively cheap. Applying fractal compression to medical images would allow much higher compression ratios to be obtained, while fractal magnification, an inseparable feature of fractal compression, would be very useful for presenting the reconstructed image in a highly readable form. However, like all irreversible methods, fractal compression involves information loss, which is especially troublesome in medical imaging, and its very time-consuming encoding process, which can last several hours, is another significant drawback.
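A compact serial sketch of the expensive encoding step: for every range block, search the downsampled domain blocks for the best contractive affine (contrast, brightness) match. The paper's contribution is parallelizing this search with CUDA; the block sizes and least-squares fit below are standard choices, not taken from the paper, and block isometries are omitted.

```python
import numpy as np

def encode_fractal(img, range_size=4, domain_size=8):
    """Brute-force fractal encoder: for each range block R, find the domain
    block D and affine map (s, o) minimizing ||s * D + o - R||^2."""
    H, W = img.shape
    # Downsample every non-overlapping domain block to range-block resolution.
    domains = []
    for y in range(0, H - domain_size + 1, domain_size):
        for x in range(0, W - domain_size + 1, domain_size):
            d = img[y:y + domain_size, x:x + domain_size].astype(float)
            d = d.reshape(range_size, 2, range_size, 2).mean(axis=(1, 3))
            domains.append(((y, x), d))

    code = []
    for y in range(0, H, range_size):
        for x in range(0, W, range_size):
            r = img[y:y + range_size, x:x + range_size].astype(float)
            best = None
            for (dy, dx), d in domains:
                # Least-squares contrast s and brightness o for this pairing.
                dv, rv = d.ravel(), r.ravel()
                var = dv.var()
                s = 0.0 if var == 0 else np.cov(dv, rv, bias=True)[0, 1] / var
                o = rv.mean() - s * dv.mean()
                err = np.sum((s * dv + o - rv) ** 2)
                if best is None or err < best[0]:
                    best = (err, dy, dx, s, o)
            code.append(((y, x), best[1:]))
    return code

# example: code = encode_fractal(np.random.randint(0, 256, (32, 32)))
```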

Keywords: accelerated GPU, CUDA, parallel computing, fractal image compression

Procedia PDF Downloads 322
9697 Shuffled Structure for 4.225 GHz Antireflective Plates: A Proposal Proven by Numerical Simulation

Authors: Shin-Ku Lee, Ming-Tsu Ho

Abstract:

A newly proposed antireflective selector with a shuffled structure is reported in this paper. The proposed idea is built from two different quarter-wavelength (QW) slabs and is supported numerically, through one-dimensional simulation results obtained with the method of characteristics (MOC), to function as an antireflective selector. The two QW slabs are characterized by dielectric constants εᵣA and εᵣB and are uniformly divided into N and N+1 pieces, respectively, which are then shuffled to form an antireflective plate with a B(AB)ᴺ structure, such that there is always one εᵣA piece between two εᵣB pieces. The other is an A(BA)ᴺ structure, where every εᵣB piece is sandwiched between two εᵣA pieces. Both proposed structures are numerically shown to function as QW plates. In order to allow maximum transmission through the proposed structures, the two dielectric constants are chosen to satisfy (εᵣA)² = εᵣB > 1. The advantages of the proposed structures over traditional anti-reflection coating techniques are that only two materials with two thicknesses are needed, and that they can be shuffled to form new QW structures. The design wavelength used to validate the proposed idea is 71 mm, corresponding to a frequency of about 4.225 GHz. The computational results are shown in both the time and frequency domains, revealing that the proposed structures produce minimum reflections around the frequency of interest.
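A quick way to sanity-check such layered designs is a normal-incidence transfer-matrix calculation, used here as a frequency-domain stand-in for the paper's time-domain method of characteristics. The sketch stacks the B(AB)ᴺ arrangement with εᵣB = (εᵣA)² and reports reflectance over a sweep around the design frequency; the choice εᵣA = 2 and N = 3 is illustrative.

```python
import numpy as np

C0 = 299_792_458.0          # speed of light, m/s
F0 = 4.225e9                # design frequency, Hz
LAMBDA0 = C0 / F0           # ~71 mm design wavelength

def layer_matrix(n, d, f):
    """Characteristic matrix of one homogeneous layer at normal incidence."""
    delta = 2 * np.pi * n * d * f / C0
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def reflectance(layers, f, n_in=1.0, n_out=1.0):
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, f)
    B = M[0, 0] + M[0, 1] * n_out
    Cc = M[1, 0] + M[1, 1] * n_out
    r = (n_in * B - Cc) / (n_in * B + Cc)
    return abs(r) ** 2

# Shuffled B(AB)^N arrangement with eps_rB = eps_rA^2 (here eps_rA = 2, N = 3).
eps_a, eps_b, N = 2.0, 4.0, 3
n_a, n_b = np.sqrt(eps_a), np.sqrt(eps_b)
dA = LAMBDA0 / (4 * n_a) / N          # quarter-wave slab A cut into N pieces
dB = LAMBDA0 / (4 * n_b) / (N + 1)    # quarter-wave slab B cut into N+1 pieces
stack = [(n_b, dB)] + [(n_a, dA), (n_b, dB)] * N

for f in np.linspace(3.5e9, 5.0e9, 7):
    print(f"{f/1e9:5.3f} GHz  R = {reflectance(stack, f):.4f}")
```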

Keywords: method of characteristics, quarter wavelength, anti-reflective plate, propagation of electromagnetic fields

Procedia PDF Downloads 143
9696 Analysis on Urban Form and Evolution Mechanism of High-Density City: Case Study of Hong Kong

Authors: Yuan Zhang

Abstract:

With its large population and great demand for urban development, Hong Kong serves as a typical high-density city, with multiple levels, an advanced three-dimensional traffic system, and rich urban open space. This paper analyzes its complex urban form and evolution mechanism from three perspectives: time, space, and buildings. Taking both the horizontal and vertical dimensions into consideration, the paper explores the fascinating process of growth and space folding in the urban form of a high-density city, and serves as a research reference for related high-density urban design.

Keywords: evolution mechanism, high-density city, Hong Kong, urban form

Procedia PDF Downloads 394
9695 Computational Fluid Dynamics Analysis for Radon Dispersion Study and Mitigation

Authors: A. K. Visnuprasad, P. J. Jojo, Reshma Bhaskaran

Abstract:

Computational fluid dynamics (CFD) is used to simulate the distribution of indoor radon concentration in a living room with elevated radon levels, which vary from 22 Bq m⁻³ to 1533 Bq m⁻³ over 24 hours. The finite volume method (FVM) was used for the simulation. The simulation results were experimentally validated at 16 points in two horizontal planes (y = 1.4 m and y = 2.0 m) using pin-hole dosimeters and at 3 points using a scintillation radon monitor (SRM). Passive measurements using pin-hole dosimeters were performed in all seasons. A further simulation was carried out to find a suitable position for a passive ventilation system for effective mitigation of radon.
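A drastically reduced, one-dimensional finite-volume sketch of the transport equation such CFD models solve for radon: diffusion through the room height, a constant radon flux entering at the floor, and first-order removal by decay and ventilation. All coefficient values are assumptions for illustration; the actual simulation is three-dimensional with a turbulence model.

```python
import numpy as np

# 1-D finite-volume model: dC/dt = D d2C/dz2 - (decay + ventilation) * C,
# with a constant radon flux entering through the floor (illustrative values).
H = 2.5            # room height, m
N = 50             # number of control volumes
dz = H / N
D = 1.2e-5         # effective diffusion coefficient, m^2/s (assumed)
flux = 0.02        # radon flux from the floor, Bq m^-2 s^-1 (assumed)
decay = 2.1e-6     # Rn-222 decay constant, s^-1
vent = 0.5 / 3600  # ventilation rate: 0.5 air changes per hour -> s^-1 (assumed)

C = np.zeros(N)                       # Bq m^-3 in each cell
dt = 0.4 * dz**2 / D                  # explicit stability limit
for _ in range(int(24 * 3600 / dt)):  # simulate one day
    # Diffusive fluxes (positive upward) across faces; no flux at the ceiling.
    face_flux = np.zeros(N + 1)
    face_flux[1:-1] = -D * (C[1:] - C[:-1]) / dz
    face_flux[0] = flux               # radon entering through the floor
    C += dt * (-(face_flux[1:] - face_flux[:-1]) / dz - (decay + vent) * C)

print(f"near-floor ~ {C[0]:.0f} Bq/m^3, near-ceiling ~ {C[-1]:.0f} Bq/m^3")
```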

Keywords: indoor radon, computational fluid dynamics, radon flux, ventilation rate, pin-hole dosimeter

Procedia PDF Downloads 402
9694 Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments

Authors: Sarantos Psycharis

Abstract:

Physical computing, as an instructional model, is applied within the framework of engineering pedagogy to teach transversal/cross-cutting ideas in a STEM content approach. LabVIEW and Arduino were used to connect the physical world with real data in the framework of the so-called computational experiment. Tertiary prospective engineering educators were engaged during their course, and computational thinking (CT) concepts were registered before and after the intervention across didactic activities, using validated questionnaires on the relationship between self-efficacy, computer programming, and CT concepts when STEM content epistemology is implemented in alignment with the computational pedagogy model. Results show a significant change in students' responses for self-efficacy for CT before and after the instruction. Results also indicate a significant relation between the responses for the different CT concepts/practices. According to the findings, STEM content epistemology combined with physical computing is a good candidate for a learning and teaching approach in university settings that enhances students' engagement in CT concepts/practices.

Keywords: Arduino, computational thinking, computer programming, LabVIEW, self-efficacy, STEM

Procedia PDF Downloads 108
9693 Preparation and In vitro Characterization of Nanoparticle Hydrogel for Wound Healing

Authors: Rajni Kant Panik

Abstract:

The aim of the present study was to develop and evaluate a mupirocin-loaded nanoparticle formulation incorporated into a hydrogel as an infected-wound healer. Nanoparticles incorporated in the hydrogel provide a barrier that effectively prevents contamination of the wound and further progression of the infection to deeper tissues, while the hydrogel creates a moist healing environment over the wound with good fluid absorbance. Nanoparticles were prepared by the double-emulsion solvent evaporation method using different ratios of PLGA polymer, and the hydrogels were developed using sodium alginate and gelatin. The prepared nanoparticles were then incorporated into the hydrogels. The formulations were characterized by FT-IR and DSC for drug-polymer compatibility, and the surface morphology was studied by TEM. The nanoparticle hydrogels were evaluated for size, shape, encapsulation efficiency, and in vitro performance. FT-IR and DSC confirmed the absence of any drug-polymer interaction. The average nanoparticle size was in the range of 208.21-412.33 nm, and the shape was spherical. The maximum encapsulation efficiency was 69.03%. The in vitro release profile of the nanoparticle-incorporated hydrogel formulation showed sustained drug release. Antimicrobial activity testing confirmed that the encapsulated drug preserves its effectiveness, and the stability study confirmed that the prepared formulations were stable. The present study supports the conclusion that mupirocin-loaded nanoparticles incorporated into a hydrogel have the potential to be an effective and safe novel option for sustained release of mupirocin, which may be a better choice for wound management; these findings also support antibiotic delivery via a hydrogel system as a novel topical dosage form for wound management.

Keywords: hydrogel, nanoparticle, PLGA, wound healing

Procedia PDF Downloads 304
9692 Unconventional Calculus Spreadsheet Functions

Authors: Chahid K. Ghaddar

Abstract:

The spreadsheet engine is exploited via a non-conventional mechanism to enable novel worksheet solver functions for computational calculus. The solver functions bypass inherent restrictions on built-in math and user-defined functions by taking variable formulas as a new type of argument while retaining purity and recursion properties. The enabling mechanism permits the integration of numerical algorithms into worksheet functions for solving virtually any computational problem that can be modelled by formulas and variables. Several examples are presented for computing integrals, derivatives, and systems of differential-algebraic equations. Incorporating the worksheet solver functions into the ubiquitous spreadsheet extends the utility of the latter as a powerful tool for computational mathematics.
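The worksheet mechanism itself is specific to the spreadsheet engine, but the idea of solver functions that take a formula and a variable as arguments maps directly onto higher-order functions. A rough Python/SciPy analogue (not the authors' spreadsheet API; the differential-algebraic case is omitted):

```python
import numpy as np
from scipy.integrate import quad, solve_ivp
from scipy.optimize import brentq

# "Formulas" are passed as callables, the analogue of handing a worksheet
# solver function a cell formula plus the cell that plays the variable.

# Definite integral of x * exp(-x) on [0, 2].
integral, _ = quad(lambda x: x * np.exp(-x), 0.0, 2.0)

# Numerical derivative of sin at x = 1 via a central difference.
h = 1e-6
derivative = (np.sin(1.0 + h) - np.sin(1.0 - h)) / (2 * h)

# Root of cos(x) - x = 0 on [0, 1].
root = brentq(lambda x: np.cos(x) - x, 0.0, 1.0)

# A small ODE y' = -2y + t, y(0) = 1, solved on [0, 3].
sol = solve_ivp(lambda t, y: -2 * y + t, (0.0, 3.0), [1.0])

print(f"integral={integral:.6f} derivative={derivative:.6f} "
      f"root={root:.6f} y(3)={sol.y[0, -1]:.6f}")
```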

Keywords: calculus, differential algebraic equations, solvers, spreadsheet

Procedia PDF Downloads 348
9691 A Polynomial Time Clustering Algorithm for Solving the Assignment Problem in the Vehicle Routing Problem

Authors: Lydia Wahid, Mona F. Ahmed, Nevin Darwish

Abstract:

The vehicle routing problem (VRP) consists of a group of customers that need to be served, each with a certain demand for goods. A central depot with a fleet of vehicles is responsible for supplying the customers with their demands. The problem is composed of two subproblems. The first is an assignment problem, in which the number of vehicles to be used and the customers assigned to each vehicle are determined. The second is the routing problem, in which, for each vehicle and its assigned customers, the order of visits is determined. The optimal number of vehicles, as well as the optimal total distance, should be achieved. In this paper, an approach for solving the first subproblem (the assignment problem) is presented. In the approach, a clustering algorithm is proposed for finding the optimal number of vehicles by grouping the customers into clusters, where each cluster is visited by one vehicle. Finding the optimal number of clusters is NP-hard. This work presents a polynomial time clustering algorithm for finding the optimal number of clusters and solving the assignment problem.
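A small sketch of the assignment subproblem treated as capacity-constrained clustering: customers are greedily grouped into clusters as long as a vehicle's capacity permits, which fixes the number of vehicles. This is a generic greedy scheme for illustration, not the authors' polynomial-time algorithm; the coordinates, demands, and capacity are made-up values.

```python
import numpy as np

def assign_customers(coords, demands, capacity):
    """Greedy capacity-constrained clustering: seed a cluster with the farthest
    unassigned customer from the depot (origin), then absorb the nearest
    unassigned customers while the vehicle capacity permits."""
    coords = np.asarray(coords, dtype=float)
    unassigned = set(range(len(coords)))
    clusters = []
    while unassigned:
        idx = list(unassigned)
        seed = idx[int(np.argmax(np.linalg.norm(coords[idx], axis=1)))]
        cluster, load = [seed], demands[seed]
        unassigned.remove(seed)
        while True:
            idx = [i for i in unassigned if load + demands[i] <= capacity]
            if not idx:
                break
            centroid = coords[cluster].mean(axis=0)
            nxt = idx[int(np.argmin(np.linalg.norm(coords[idx] - centroid, axis=1)))]
            cluster.append(nxt)
            load += demands[nxt]
            unassigned.remove(nxt)
        clusters.append(cluster)
    return clusters   # one vehicle per cluster

coords = [(2, 1), (3, 2), (-4, 0), (-3, -1), (0, 5), (1, 4)]
demands = [4, 3, 5, 4, 2, 6]
print(assign_customers(coords, demands, capacity=10))
```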

Keywords: vehicle routing problems, clustering algorithms, Clarke and Wright Saving Method, agglomerative hierarchical clustering

Procedia PDF Downloads 386
9690 Simulations of NACA 65-415 and NACA 64-206 Airfoils Using Computational Fluid Dynamics

Authors: David Nagy

Abstract:

This paper exemplifies the influence of the purpose of an aircraft on the aerodynamic properties of its airfoil. In particular, the research takes into consideration two types of aircraft, namely cargo aircraft and military high-speed aircraft and compares their airfoil characteristics using their NACA airfoils as well as computational fluid dynamics. The results show that airfoils of aircraft designed for cargo have a heavier focus on maintaining a large lift force whereas speed-oriented airplanes focus on minimizing the drag force.

Keywords: aerodynamic simulation, aircraft, airfoil, computational fluid dynamics, lift to drag ratio, NACA 64-206, NACA 65-415

Procedia PDF Downloads 354
9689 Computational Approach to the Interaction of Neurotoxins and Kv1.3 Channel

Authors: Janneth González, George Barreto, Ludis Morales, Angélica Sabogal

Abstract:

Sea anemone neurotoxins are peptides that interact with Na⁺ and K⁺ channels, resulting in specific alterations of their functions. Some of these neurotoxins (1ROO, 1BGK, 2K9E, 1BEI) are important for the treatment of nearly eighty autoimmune disorders due to their specificity for the Kv1.3 channel. The aim of this study was to identify the residues common to these neurotoxins by computational methods and to establish whether there is a pattern useful for the future development of treatments for autoimmune diseases. Our results revealed eight new key residues common to the studied neurotoxins, interacting with a histidine ring and the selectivity filter of the receptor, thus showing a possible pattern of interaction. This knowledge may serve as an input for the design of more promising drugs for autoimmune treatments.

Keywords: neurotoxins, potassium channel, Kv1.3, computational methods, autoimmune diseases

Procedia PDF Downloads 367
9688 Form of Social Quality Moving Process of Suburb Communities in a Changing World

Authors: Supannee Chaiumporn

Abstract:

This article introduces the meaning and form of the social quality moving process as indicated by members of two suburban communities with different social and cultural contexts. The form of the social quality moving process is very significant for community and social development, because it enables people to live together with sustainable happiness. This is a qualitative study involving 30 key informants from two suburban communities. Data were collected through key-informant interviews and analyzed using logical content description and descriptive statistics. The research found that, with respect to the social quality component, people in both communities stressed the procedure for creating social quality, which includes generosity, sharing, and mutual assistance among people in the communities; these practices helped people live together with sustainable happiness. Living as a family, or appearing to be one, is the major social characteristic of these two communities. The research also found that the form of the social quality moving process in both communities stresses the relationship between humans and nature, a "nature overpowers humans" paradigm, and the influence of religious doctrine that emphasizes relations among humans. Both make the moving process simple, adaptive to nature, and attentive to opinion sharing and mutual understanding before action. The form of the social quality moving process is composed of four steps: (1) awareness building, (2) motivation to change, (3) participation of every party concerned, and (4) self-reliance.

Keywords: social quality, form of social quality moving process, happiness, different social and cultural context

Procedia PDF Downloads 367
9687 Detecting the Edge of Multiple Images in Parallel

Authors: Prakash K. Aithal, U. Dinesh Acharya, Rajesh Gopakumar

Abstract:

An edge is a variation of brightness in an image. Edge detection is useful in many application areas, such as finding forests and rivers in satellite images or detecting a broken bone in a medical image. This paper discusses finding the edges of multiple aerial images in parallel. The proposed work was tested on 38 images: 37 color images and one monochrome image. The time taken to process N images in parallel is equivalent to the time taken to process one image sequentially. The proposed method achieves pixel-level parallelism as well as image-level parallelism.
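A minimal illustration of the image-level parallelism described above: a Sobel edge detector mapped over a batch of images with a process pool. The random test images are placeholders, and the pixel-level GPU/OpenCL/MPI parallelism of the paper is not shown.

```python
import numpy as np
from multiprocessing import Pool

def sobel_edges(img):
    """Gradient-magnitude edge map via Sobel filtering (pure NumPy)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            window = padded[dy:dy + h, dx:dx + w]
            gx += kx[dy, dx] * window
            gy += ky[dy, dx] * window
    return np.hypot(gx, gy)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    images = [rng.integers(0, 256, (256, 256)) for _ in range(8)]
    with Pool() as pool:                     # image-level parallelism
        edge_maps = pool.map(sobel_edges, images)
    print([em.max().round(1) for em in edge_maps])
```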

Keywords: edge detection, multicore, GPU, OpenCL, MPI

Procedia PDF Downloads 469
9686 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services that operate on encrypted information in the cloud without decrypting the encrypted data; it thereby meets the aspiration for a computational-encryption algorithmic model that can enhance the security of big data with respect to privacy, confidentiality, availability, and integrity of the data and the user's security. The cryptographic model applied to computation on the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory derivable from abstract algebra, that can easily be integrated and leveraged in the cloud computing interface, together with detailed theoretical mathematical concepts underlying fully homomorphic encryption models. This contribution enhances the full implementation of big data analytics based on a cryptographic security algorithm.

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 470
9685 Computational Neurosciences: An Inspiration from Biological Neurosciences

Authors: Harsh Sadawarti, Kamal Malik

Abstract:

Humans are unique and the most powerful creatures on this planet because of the high level of intelligence gifted by nature. Computational intelligence is highly influenced by natural intelligence, neuroscience, and mathematics. To study computational intelligence in depth and utilize it in real-life applications, it is important to understand how it parallels the human brain. In this paper, three important parts of the human brain, the frontal lobe, the occipital lobe, and the parietal lobe, are compared with the artificial neural network (ANN), the convolutional neural network (CNN), and the recurrent neural network (RNN), respectively. Intelligent computational systems are created by combining deductive reasoning, logical concepts, and high-level algorithms with the simulation and study of the human brain. The human brain is a combination of physiology, psychology, emotions, calculation, and many other parameters of utmost importance that determine overall intelligence. To create intelligent algorithms and smart machines, and to simulate the human brain effectively, it is important to have insight into the human brain and the basic concepts of biological neuroscience.

Keywords: computational intelligence, neurosciences, convolutional neural network, recurrent neural network, artificial neural network, frontal lobe, occipital lobe, parietal lobe

Procedia PDF Downloads 103
9684 Building Semantic-Relatedness Thai Word Ontology for Semantic Analysis

Authors: Gridaphat Sriharee

Abstract:

A semantic-relatedness Thai word ontology can be built by considering word forms and word meanings. This research proposes a methodology for building such an ontology, which can then be used for semantic analysis. There are four categories of words: similar form and the same meaning, similar form and similar meaning, different form and opposite/same meaning, and different form and similar meaning; these serve as the initial words for building the proposed ontology. The ontology can be extended by considering dictionary definitions that convey the meaning of each word. Exploiting WordNet to construct the proposed ontology was investigated and discussed. The proposed ontology was evaluated for its quality. With the proposed methodology, the constructed ontology is promising as a well-defined ontology.

Keywords: Thai, NLP, semantics, ontology

Procedia PDF Downloads 83
9683 A Fast Convergence Subband BSS Structure

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin

Abstract:

A blind source separation method is proposed in which we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing computational complexity and increasing the convergence speed of adaptive algorithms for correlated input signals. In this work, the reduction in computational complexity is achieved by using adaptive filters of lower order than the full-band adaptive filters, operating at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each sub-band than the full-bandwidth input signal, which promotes better convergence rates.
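A toy sketch of the subband idea: split a two-channel mixture into a few frequency bands with a simple (uniform, non-decimated) filter bank and run a standard ICA separator in each band. A practical scheme uses a non-uniform, decimated bank, adaptive separation filters, and subband permutation alignment, none of which is shown; the test signals and band edges are assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter, square, sawtooth
from sklearn.decomposition import FastICA

# Two harmonic-rich sources mixed instantaneously, then separated per subband.
fs = 8000
t = np.arange(4 * fs) / fs
sources = np.vstack([square(2 * np.pi * 180 * t),     # source 1
                     sawtooth(2 * np.pi * 245 * t)])  # source 2
A = np.array([[1.0, 0.6], [0.4, 1.0]])                # mixing matrix
mixture = A @ sources

for lo, hi in [(60, 900), (900, 1800), (1800, 3500)]:
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    sub = lfilter(b, a, mixture, axis=1)              # analysis "filter bank"
    est = FastICA(n_components=2, random_state=0).fit_transform(sub.T).T
    # Compare each estimated component with the band-limited true sources.
    ref = lfilter(b, a, sources, axis=1)
    corr = np.abs(np.corrcoef(np.vstack([est, ref]))[:2, 2:])
    print(f"{lo}-{hi} Hz: best source correlations {corr.max(axis=1).round(2)}")
```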

Keywords: blind source separation, computational complexity, subband, convergence speed, mixture

Procedia PDF Downloads 541
9682 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research considers noise to be any data that does not form part of the main web page and proposes noise web data reduction tools that mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data forming part of the main web page is of interest to a given user, and that not all noise data is actually noise to that user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction in the noisiness of a web user profile but also a decrease in the loss of useful information, hence improving the quality of the profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate its performance, an experimental design is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 256
9681 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computation and is not well suited to theoretical or practical applications. Here, the same narrowband model is re-examined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified through comparisons with other readily available approximations and with oceanic data.
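For context, the second-order narrowband (Stokes-type) representation of the surface elevation that underlies both the exact distribution and its approximations is commonly written as below; this is the standard model, not the paper's new closed-form approximation.

```latex
% Second-order narrowband model: slowly varying envelope a and phase \varphi,
% with \bar{k} the mean wavenumber of the underlying linear (first-order) waves.
\eta = a\cos\varphi + \tfrac{1}{2}\,\bar{k}\,a^{2}\cos 2\varphi
```

The nonlinearity enters through a characteristic steepness parameter, often taken as μ = k̄σ with σ the standard deviation of the linear surface elevation, which also controls the skewness of the resulting distribution.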

Keywords: ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness

Procedia PDF Downloads 426