Search results for: generalized inverse matrix approach
4152 Long-Term Economic-Ecological Assessment of Optimal Local Heat-Generating Technologies for the German Unrefurbished Residential Building Stock on the Quarter Level
Authors: M. A. Spielmann, L. Schebek
Abstract:
In order to reach the long-term national climate goals of the German government for the building sector, substantial energetic measures have to be executed. Historically, those measures were primarily energy-efficiency measures at the buildings' shells. Advanced technologies for the on-site generation of heat (or other types of energy) are often not feasible at the small spatial scale of a single building. Therefore, the present approach uses the spatially larger dimension of a quarter. The main focus of the present paper is the long-term economic-ecological assessment of available decentralized heat-generating technologies (CHP plants and electrical heat pumps) at the quarter level for German unrefurbished residential buildings. Three distinct terms have to be described methodologically: i) quarter approach, ii) economic assessment, iii) ecological assessment. The quarter approach is used to enable synergies and scaling effects beyond a single building. For the present study, generic quarters that are differentiated according to significant parameters concerning their heat demand are used. The core differentiation of those quarters is made by the construction time period of the buildings. The economic assessment, as the second crucial element, is executed with the following structure: full costs are quantified for each technology combination and quarter. The investment costs are analyzed on an annual basis and are modeled with the acquisition of debt; annuity loans are assumed. Consequently, for each generic quarter, an optimal technology combination for decentralized heat generation is provided in each year within the temporal boundaries (2016-2050). The ecological assessment comprises a Life Cycle Assessment (LCA) for each technology combination and each quarter. The impact category measured is GWP 100. The technology combinations for heat production can therefore be compared against each other concerning their long-term climatic impacts. Core results of the approach can be differentiated into an economic and an ecological dimension. With an annual resolution, the investment and running costs of the different energetic technology combinations are quantified. For each quarter, an optimal technology combination for local heat supply and/or energetic refurbishment of the buildings within the quarter is provided. Consistent with the economic assessment, the climatic impacts of the technology combinations are quantified and compared against each other.
Keywords: Building sector, heat, LCA, quarter level, systemic approach.
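The annualized investment costs mentioned above follow from the standard annuity-loan relation; a minimal statement of that textbook formula (symbols chosen here for illustration, not taken from the paper):

```latex
% Annual payment A of an annuity loan that repays the investment I_0
% over n years at interest rate i.
A = I_0 \cdot \frac{i\,(1+i)^{n}}{(1+i)^{n}-1}
```

For example, an investment of 100,000 EUR financed over 20 years at 3% interest gives an annual payment of roughly 6,700 EUR.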
4151 A Hybrid Approach for Color Image Quantization Using K-means and Firefly Algorithms
Authors: Parisut Jitpakdee, Pakinee Aimmanee, Bunyarit Uyyanonvara
Abstract:
Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization; all colors in an image are grouped into small clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local optima convergence problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
Keywords: Clustering, color quantization, firefly algorithm, K-means.
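As a rough illustration of how such a hybrid can be organized, the sketch below lets each firefly encode a candidate palette whose brightness is the (negative) quantization distortion, with a few K-means iterations refining every move; it is a hypothetical reading of the idea, not the authors' implementation, and all parameter values are placeholders.

```python
import numpy as np

def kmeans_refine(pixels, palette, iters=2):
    """A couple of Lloyd iterations to locally refine a candidate palette."""
    for _ in range(iters):
        # assign every pixel to its nearest palette colour
        d = np.linalg.norm(pixels[:, None, :] - palette[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(palette)):
            members = pixels[labels == k]
            if len(members):
                palette[k] = members.mean(axis=0)
    distortion = d[np.arange(len(pixels)), labels].mean()
    return palette, distortion

def hybrid_fa_kmeans(pixels, n_colors=16, n_fireflies=8, n_gens=10,
                     beta0=1.0, gamma=0.01, alpha=5.0, seed=0):
    """Hypothetical FA + K-means hybrid: fireflies are palettes, brightness = -distortion."""
    rng = np.random.default_rng(seed)
    swarm = rng.uniform(0, 255, size=(n_fireflies, n_colors, 3))
    cost = np.array([kmeans_refine(pixels, p)[1] for p in swarm])
    for _ in range(n_gens):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:                    # j is brighter, so i moves toward j
                    r2 = np.sum((swarm[i] - swarm[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    swarm[i] += beta * (swarm[j] - swarm[i]) \
                                + alpha * rng.normal(size=swarm[i].shape)
            swarm[i], cost[i] = kmeans_refine(pixels, np.clip(swarm[i], 0, 255))
    return swarm[cost.argmin()]
```

A call such as `hybrid_fa_kmeans(img.reshape(-1, 3).astype(float), n_colors=16)` would return a 16-colour palette; mapping each pixel to its nearest palette entry yields the quantized image.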
4150 Fault-Tolerant Optimal Broadcast Algorithm for the Hypercube Topology
Authors: Lokendra Singh Umrao, Ravi Shankar Singh
Abstract:
This paper presents an optimal broadcast algorithm for hypercube networks. The main focus of the paper is the effectiveness of the algorithm in the presence of many node faults. For the optimal solution, our algorithm builds a spanning tree connecting all nodes of the network, through which messages are propagated from the source node to the remaining nodes. At any given time, a maximum of n − 1 nodes may fail due to crashing. We show that the hypercube networks are strongly fault-tolerant. Simulation results are analyzed to characterize the behavior of the algorithm under many node faults. We have compared the simulation results of our proposed method with Fu's method. Fu's approach cannot tolerate n − 1 faulty nodes in the worst case, but our approach can.
Keywords: Fault tolerance, hypercube, broadcasting, link/node faults, routing.
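For reference, the fault-free broadcast that such algorithms extend is the classical binomial spanning tree of the hypercube; a minimal sketch is given below (the paper's fault-tolerant rerouting around up to n − 1 crashed nodes is not reproduced here).

```python
def binomial_broadcast_tree(n, source=0):
    """Classical binomial spanning tree of an n-dimensional hypercube.

    Returns a dict child -> parent describing the order in which a message
    starting at `source` reaches all 2**n nodes: in step d, every informed
    node forwards the message across dimension d.
    """
    parent = {source: None}
    informed = [source]
    for d in range(n):                       # one dimension per time step
        for node in list(informed):
            neighbour = node ^ (1 << d)      # flip bit d
            if neighbour not in parent:
                parent[neighbour] = node
                informed.append(neighbour)
    return parent

# Example: broadcast tree of a 3-cube rooted at node 0
tree = binomial_broadcast_tree(3)
assert len(tree) == 8                        # all 2**3 nodes reached in 3 steps
```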
4149 Project Objective Structure Model: An Integrated, Systematic and Balanced Approach in Order to Achieve Project Objectives
Authors: Mohammad Reza Oftadeh
Abstract:
The purpose of the article is to describe the project objective structure (POS) concept, which was developed from research activities and experience in project management, the Balanced Scorecard (BSC), and the European Foundation for Quality Management Excellence Model (EFQM Excellence Model). Furthermore, this paper tries to define a balanced, systematic, and integrated measurement approach to meet project objectives and project strategic goals based on a process-oriented model. In this paper, POS is suggested as a means to measure project performance across the project life cycle. By using the POS model, the project manager can ensure that the project objectives stated in the project charter are achieved. This concept can help project managers to implement integrated and balanced monitoring and control of project work.
Keywords: Project objectives, project performance management, PMBOK, key performance indicators, integration management.
4148 Communicative and Artistic Machines: A Survey of Models and Experiments on Artificial Agents
Authors: Artur Matuck, Guilherme F. Nobre
Abstract:
Machines can be tools, media, or social agents. Advances in technology have been delivering machines capable of autonomous expression, both through communication and art. This paper deals with models (theoretical approach) and experiments (applied approach) related to artificial agents. On one hand, it traces how social sciences scholars have worked with topics such as text automatization, man-machine writing cooperation, and communication. On the other hand, it covers how computer sciences scholars have built communicative and artistic machines, including the programming of creativity. The aim is to present a brief survey of artificially intelligent communicators and artificially creative writers, and to provide a basis for understanding meta-authorship and for new and further man-machine co-authorship.
Keywords: Artificial communication, artificial creativity, artificial writers, meta-authorship, robotic art.
4147 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation
Authors: Lae-Jeong Park
Abstract:
The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for false positive reduction in head-shoulder detection. Conventional detectors that rely on local features such as HOG, chosen for real-time operation, suffer from false positives. The color cue in an input image provides salient information on a global characteristic, which is necessary to alleviate the false positives of local feature based detectors. An effective approach that uses figure-ground color segmentation has previously been presented in an effort to reduce false positives in object detection. In this paper, an extended version of the approach is presented that adopts separate multi-part foregrounds instead of a single prior foreground and performs the figure-ground color segmentation with each of the foregrounds. The multi-part foregrounds include the parts of the head-shoulder shape and additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed with a feature that consists of the set of resulting segmentations. Experimental results show that the presented method can reject more false positives than the single prior shape-based classifier as well as detectors based on local features. The improvement is possible because the presented approach can reduce the false positives that have the same colors in the head and shoulder foregrounds.
Keywords: Pedestrian detection, color segmentation, false positives, feature extraction.
4146 An Interactive Ontology Visualization Approach for the Networked Home Environment
Authors: Ilkka Niskanen, Jarmo Kalaoja, Julia Kantorovitch, Toni Piirainen
Abstract:
Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments as they carry the meaning. However, ontologies and the OWL language are complex. Several ontology visualization approaches have been developed to enhance the understanding of ontologies. The domain of networked home environments sets some special requirements for the ontology visualization approach. The visualization tool presented here visualizes ontologies in a domain-specific way. It effectively represents the physical structures and spatial relationships of networked home environments. In addition, it provides extensive interaction possibilities for editing and manipulating the visualization. The tool narrows the gap from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations and by making OWL ontologies more interesting, more concrete, and above all easier to comprehend.
Keywords: Ontologies, visualization, interaction.
4145 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings
Authors: G. Candel, D. Naccache
Abstract:
t-SNE is an embedding method that the data science community has widely adopted. It supports two main tasks: displaying results by coloring items according to their class or feature value, and forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, in which all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the area of a cluster is proportional to its size in number, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is computed but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets through their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and such a model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with each newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same per-embedding complexity as t-SNE, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing the birth, evolution, and death of clusters to be observed. The proposed approach facilitates identifying significant trends and changes, which supports monitoring the dynamics of high-dimensional datasets.
Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.
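A minimal sketch of the anchoring idea, assuming scikit-learn's TSNE and a hypothetical helper reembed_with_support: new points are initialized at the embedded position of their nearest neighbour in the support set, so cluster positions roughly carry over. The paper's second cost term, which explicitly matches the new embedding to the support embedding during optimization, is not reproduced.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

def reembed_with_support(X_new, X_support, Y_support, perplexity=30, seed=0):
    """Embed a new batch so that clusters stay roughly where the previous
    (support) embedding Y_support placed them, by initialising every new
    point at the 2D position of its nearest support point."""
    nn = NearestNeighbors(n_neighbors=1).fit(X_support)
    _, idx = nn.kneighbors(X_new)
    rng = np.random.default_rng(seed)
    init = Y_support[idx[:, 0]] + 0.01 * rng.normal(size=(len(X_new), 2))
    return TSNE(n_components=2, init=init, perplexity=perplexity,
                random_state=seed).fit_transform(X_new)
```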
4144 Structural and Optical Properties of Pr3+ Doped ZnO and PVA:Zn98Pr2O Nanocomposite Free Standing Film
Authors: Pandiyarajan Thangaraj, Mangalaraja Ramalinga Viswanathan, Karthikeyan Balasubramanian, Héctor D. Mansilla, José Ruiz, David Contreras
Abstract:
In this work, we report a systematic study of the structural and optical properties of Pr-doped ZnO nanostructures and PVA:Zn98Pr2O polymer matrix nanocomposite free standing films. These particles are synthesized through a simple wet chemical route and a solution casting technique at room temperature, respectively. Structural studies carried out by the X-ray diffraction method confirm that the prepared pure ZnO and Pr-doped ZnO nanostructures have the hexagonal wurtzite structure and that the microstrain increases upon doping. TEM analysis reveals that the prepared materials have a sheet-like nature. Absorption spectra show a free excitonic absorption band at 370 nm and a red shift for the Pr-doped ZnO nanostructures. The PVA:Zn98Pr2O composite film exhibits both free excitonic and PVA absorption bands at 282 nm. Fourier transform infrared spectral studies confirm the presence of the A1 (TO) and E1 (TO) modes of Zn-O bond vibration and the formation of the polymer composite materials.
Keywords: Pr doped ZnO, polymer nanocomposites, optical properties.
4143 Laser-Ultrasonic Method for Measuring the Local Elastic Moduli of Porous Isotropic Composite Materials
Authors: Alexander A. Karabutov, Natalia B. Podymova, Elena B. Cherepetskaya, Vladimir A. Makarov, Yulia G. Sokolovskaya
Abstract:
The laser-ultrasonic method is applied to quantify the influence of porosity on the local Young's modulus of isotropic composite materials. The method is based on laser thermo-optical generation of ultrasound combined with measurement of the phase velocities of longitudinal and shear acoustic waves in the samples. The main advantage of this method compared with traditional ultrasonic research methods is the efficient generation of the short and powerful probing acoustic pulses required for reliable testing of ultrasound-absorbing and -scattering heterogeneous materials. Using samples of a metal matrix composite reinforced with silicon carbide microparticles in various concentrations as an example, it is shown that, to provide an effective increase in Young's modulus with increasing concentration of microparticles, the porosity of the final sample should not exceed 2%.
Keywords: Laser ultrasonics, longitudinal and shear ultrasonic waves, porosity, composite, local elastic moduli.
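For an isotropic sample, the local elastic moduli follow from the measured phase velocities and the density through the standard elasticity relations (textbook identities, not results specific to this paper):

```latex
% Young's modulus E and Poisson's ratio \nu of an isotropic solid from the
% longitudinal (v_l) and shear (v_t) wave velocities and the density \rho.
E = \rho\, v_t^{2}\,\frac{3 v_l^{2} - 4 v_t^{2}}{v_l^{2} - v_t^{2}},
\qquad
\nu = \frac{v_l^{2} - 2 v_t^{2}}{2\,(v_l^{2} - v_t^{2})}
```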
4142 Data Mining Classification Methods Applied in Drug Design
Authors: Mária Stachová, Lukáš Sobíšek
Abstract:
Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people to understand the patterns in a certain chunk of information, so it is obvious that data mining tools have a wide area of applications. For example, in theoretical chemistry, data mining tools can be used to predict molecule properties or to improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of the contribution is to create a classification model that is able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression, and random forest models were built using the R software. A Bayesian logistic regression model was created in the Latent GOLD software as well. These classification methods belong to supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. The models were applied to predict the biological activity of molecules, potential new drug candidates.
Keywords: Data mining, classification, drug design, QSAR.
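A compact sketch of the workflow described above, using scikit-learn in place of the R and Latent GOLD tools reported by the authors, with placeholder data standing in for the molecular descriptors: factor analysis reduces the dimension before a logistic regression and a random forest are cross-validated.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# X: molecular descriptors, y: binary biological activity (placeholder data)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))
y = (X[:, :5].sum(axis=1) + rng.normal(size=500) > 0).astype(int)

models = {
    "logistic regression": make_pipeline(FactorAnalysis(n_components=20),
                                         LogisticRegression(max_iter=1000)),
    "random forest": make_pipeline(FactorAnalysis(n_components=20),
                                   RandomForestClassifier(n_estimators=300, random_state=0)),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```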
4141 Designing Creative Events with Deconstructivism Approach
Authors: Maryam Memarian, Mahmood Naghizadeh
Abstract:
Deconstruction is an approach that is entirely incompatible with traditional prevalent architecture. It attempts to put architecture in sharp contrast with the events it opposes, attending to the neglected and missing aspects of architecture and deconstructing its stable structures. It also proceeds recklessly beyond the existing frameworks and intends to create a different and more efficient prospect for space. The aim of deconstruction architecture is to satisfy both prospective and retrospective visions and to take into account all tastes of the present in order to transcend time. Likewise, it ventures to fragment the facts and symbols of the past and to extract from within them new concepts that coincide with today's circumstances. Since this approach is an attempt to surpass the limits of prevalent architecture, it can be employed to design places in which creative events occur and imagination and ambition flourish. Thought-provoking artistic events can grow and mature in such places and be represented in the best way possible to all people. The concept of event proposed in the plan grows out of the interaction between space and creation. In addition to triggering surprise and strong impressions, it is also considered a bold journey into the suspended realms of the traditional conflicts in architecture, such as architecture-landscape, interior-exterior, center-margin, product-process, and stability-instability. In this project, recognition and organization first take place through an interpretive-historical research method, examining the inputs and collecting data. After evaluating the obtained data using deductive reasoning, the data are interpreted. Given that the research topic is in its infancy and there is no similar case in Iran, with only a limited number of corresponding instances across the world, the selected topic helps to shed light on unrevealed and neglected aspects of architecture. Similarly, criticizing, investigating, and comparing specific and highly prized cases in other countries with the project under study can serve as an introduction to this architectural style.
Keywords: Creativity, deconstruction, event.
4140 Attention-Based Spatio-Temporal Approach for Fire and Smoke Detection
Authors: A. Mirrashid, M. Khoshbin, A. Atghaei, H. Shahbazi
Abstract:
In various industries, smoke and fire are two of the most important threats in the workplace. One of the common methods for detecting smoke and fire is the use of infrared thermal and smoke sensors, which cannot be used in outdoor applications. Therefore, the use of vision-based methods seems necessary. The problem of smoke and fire detection is spatiotemporal and requires spatiotemporal solutions. This paper presents a method that uses spatial features along with temporal features to detect smoke and fire in the scene. It consists of three main parts; the task of each part is to reduce the error of the previous part so that the final model has robust performance. The method also uses transformer modules to increase the accuracy of the model. The results of our model show the proper performance of the proposed approach in solving the problem of smoke and fire detection, and it can be used to increase workplace safety.
Keywords: Attention, fire detection, smoke detection, spatiotemporal.
4139 Effect of Linear Thermal Gradient on Steady-State Creep Behavior of Isotropic Rotating Disc
Authors: Minto Rattan, Tania Bose, Neeraj Chamoli
Abstract:
The present paper investigates the effect of a linear thermal gradient on the steady-state creep behavior of a rotating isotropic disc using threshold-stress based Sherby's creep law. Composite discs made of an aluminum matrix reinforced with silicon carbide particulate have been taken for the analysis. The stress and strain rate distributions have been calculated for discs rotating under linear thermal gradation using von Mises' yield criterion. The material parameters have been estimated by regression fits of the available experimental data. The results are displayed and compared graphically, in a designer-friendly format, for the above temperature profile and for a disc operating under a uniform temperature profile. It is observed that the radial and tangential stresses show minor variation, while the strain rates vary significantly in the presence of thermal gradation compared to a disc at uniform temperature.
Keywords: Creep, isotropic, steady-state, thermal gradient.
4138 Optical and Double Folding Model Analysis for Alpha Particles Elastically Scattered from 9Be and 11B Nuclei at Different Energies
Authors: Ahmed H. Amer, A. Amar, Sh. Hamada, I. I. Bondouk, F. A. El-Hussiny
Abstract:
Elastic scattering of α-particles from 9Be and 11B nuclei at different alpha energies has been analyzed. Optical model parameters (OMPs) of α-particle elastic scattering by these nuclei at different energies have been obtained. In the present calculations, the real part of the optical potential is derived by folding the nucleon-nucleon (NN) interaction into the nuclear matter density distributions of the projectile and target nuclei using the computer code FRESCO. A density-dependent version of the M3Y interaction (CDM3Y6), which is based on the G-matrix elements of the Paris NN potential, has been used. Volume integrals of the real and imaginary potential depths (JR, JW) have been calculated and found to be energy dependent. Good agreement between the experimental data and the theoretical predictions is obtained over the whole angular range. In the double folding (DF) calculations, the obtained normalization coefficient Nr is in the range 0.70–1.32.
Keywords: Elastic scattering of α-particles, optical model parameters, double folding model, nucleon-nucleon interaction.
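The real folded potential referred to above takes, in its standard double-folding form (with the density dependence of CDM3Y6 suppressed for brevity):

```latex
% Double-folding potential: the effective NN interaction v_NN is folded over
% the projectile and target ground-state densities; N_r is the normalization
% coefficient quoted in the abstract.
V_{DF}(R) = N_r \int\!\!\int \rho_p(\mathbf{r}_p)\,\rho_t(\mathbf{r}_t)\,
            v_{NN}\!\bigl(|\mathbf{R} + \mathbf{r}_t - \mathbf{r}_p|\bigr)\,
            d^3 r_p\, d^3 r_t
```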
4137 Verification of a Locked CFD Approach to Cool Down Modeling
Authors: P. Bárta
Abstract:
Increasing demands on the performance of Subsea Production Systems (SPS) suggest a need for more detailed investigation of the fluid behavior taking place in subsea equipment. Complete CFD cool down analyses of subsea equipment are very time demanding. The objective of this paper is to investigate a Locked CFD approach, which enables a significant reduction of the computational time while maintaining sufficient accuracy during thermal cool down simulations. The comparison of a dead leg simulation using the Full CFD approach and the three LCFD methods confirms the validity of the locked flow field assumption for the selected case. For the tested case, an LCFD speed-up by a factor of 200 results in an absolute thermal error of 0.5 °C (3% relative error), while a speed-up by a factor of 10 keeps the LCFD results within 0.1 °C (0.5% relative error) compared to the Full CFD.
Keywords: CFD, locked flow field, speed-up of CFD simulation time, subsea.
4136 On Measuring the Reusability Proneness of Mobile Applications
Authors: Fathi Taibi
Abstract:
The abnormal increase in the number of applications available for download in Android markets is a good indication that they are being reused. However, little is known about their real reusability potential. A considerable number of these applications are reported as having poor quality or being malicious. Hence, in this paper, an approach to measure the reusability potential of classes in Android applications is proposed. The approach is not meant specifically for this particular type of application. Rather, it is intended for Object-Oriented (OO) software systems in general and also aims to provide means to prevent the classes of low-quality and defect-prone applications from being reused directly through inheritance and instantiation. An empirical investigation is conducted to measure and rank the reusability potential of the classes of randomly selected Android applications. The results obtained are thoroughly analyzed in order to understand the extent of this potential and the factors influencing it.
Keywords: Reusability, software quality factors, software metrics, empirical investigation, object-oriented software, Android applications.
4135 Realignment of f-actin Cytoskeleton in Osteocytes after Mechanical Loading
Authors: R. S. A. Nesbitt, J. Macione, E. Babollah, B. Adu-baffour, S. P. Kotha
Abstract:
F-actin fibrils form the cytoskeleton of osteocytes. They react in a dynamic manner to mechanical loading, strengthening and repositioning themselves to reinforce the cell's structure. We hypothesize that f-actin is temporarily disrupted after loading and repolymerizes in a new orientation to oppose the applied load. In vitro studies are conducted to determine f-actin disruption after varying the mechanical stimulus parameters that are known to affect bone formation. Results indicate that the f-actin cytoskeleton is disrupted in vitro as a function of the applied mechanical stimulus parameters and that the f-actin bundles reassemble after loading-induced disruption within 3 minutes of the cessation of loading. The disruption of the f-actin cytoskeleton depends on the magnitude of stretch, the number of loading cycles, the frequency, the insertion of rest between loading cycles, and extracellular calcium. In vivo studies also demonstrate disruption of the f-actin cytoskeleton in cells embedded in the bone matrix immediately after mechanical loading. These studies suggest that adaptation of the f-actin fiber bundles of the cytoskeleton in response to applied loads occurs by disruption and subsequent repolymerization.
Keywords: Mechanical loading of osteocytes, f-actin cytoskeleton, disruption, re-polymerization.
4134 Low-Cost Space-Based Geoengineering: An Assessment Based on Self-Replicating Manufacturing of in-Situ Resources on the Moon
Authors: Alex Ellery
Abstract:
Geoengineering approaches to climate change mitigation are unpopular and regarded with suspicion. Of these, space-based approaches are regarded as unworkable and enormously costly. Here, a space-based approach is presented that is modest in cost, fully controllable and reversible, and acts as a natural spur to the development of solar power satellites over the longer term as a clean source of energy. The low-cost approach exploits self-replication technology, which, it is proposed, may be enabled by 3D printing technology. Self-replication of 3D printing platforms will enable mass production of simple spacecraft units. Key elements being developed are 3D-printable electric motors and 3D-printable vacuum tube-based electronics. The power of such technologies will open up enormous possibilities at low cost, including space-based geoengineering.
Keywords: 3D printing, in-situ resource utilization, self-replication technology, space-based geoengineering.
4133 Loop Back Connected Component Labeling Algorithm and Its Implementation in Detecting Face
Authors: A. Rakhmadi, M. S. M. Rahim, A. Bade, H. Haron, I. M. Amin
Abstract:
In this study, a loop back algorithm for connected component labeling for detecting objects in a digital image is presented. The approach uses a loop back connected component labeling algorithm that helps the system to distinguish the detected objects according to their labels. Unlike the whole-window scanning technique, this technique reduces the search time for locating an object by focusing on suspected objects based on certain defined features. In this study, the approach was also implemented for a face detection system. Face detection is becoming an interesting research area since many devices or systems require detecting faces for certain purposes. The input can be a still image or video; therefore the sub-processes of this system have to be simple, efficient, and accurate to give a good result.
Keywords: Image processing, connected components labeling, face detection.
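For orientation, a plain two-pass connected component labelling with union-find is sketched below; it is the generic labelling step such detectors build on, not the loop back variant proposed in the paper.

```python
import numpy as np

def label_components(binary):
    """Two-pass connected component labelling with 4-connectivity."""
    labels = np.zeros_like(binary, dtype=int)
    parent = {}                                   # union-find over provisional labels

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]         # path compression
            x = parent[x]
        return x

    next_label = 1
    h, w = binary.shape
    for y in range(h):                            # first pass: provisional labels
        for x in range(w):
            if not binary[y, x]:
                continue
            up = labels[y - 1, x] if y else 0
            left = labels[y, x - 1] if x else 0
            neigh = [l for l in (up, left) if l]
            if not neigh:
                labels[y, x] = next_label
                parent[next_label] = next_label
                next_label += 1
            else:
                labels[y, x] = min(neigh)
                if len(neigh) == 2:               # record equivalence of up/left labels
                    a, b = find(neigh[0]), find(neigh[1])
                    parent[max(a, b)] = min(a, b)
    for y in range(h):                            # second pass: resolve equivalences
        for x in range(w):
            if labels[y, x]:
                labels[y, x] = find(labels[y, x])
    return labels
```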
4132 Emergentist Metaphorical Creativity: Towards a Model of Analysing Metaphorical Creativity in Interactive Talk
Authors: Afef Badri
Abstract:
Metaphorical creativity does not constitute a static property of discourse; it is an interactive dynamic process created online. There has been a lack of research concerning metaphorical creativity produced online. This paper intends to account for metaphorical creativity in online talk-in-interaction as a dynamic process that emerges as discourse unfolds. It brings together insights from the emergentist approach to the study of metaphor in verbal interactions and insights from the conceptual blending approach as a model for analysing online metaphorical constructions, in order to propose a model for studying metaphorical creativity in interactive talk. The model is based on three focal points. First, metaphorical creativity is a dynamic, emergent, and open-to-change process that evolves in real time as interlocutors constantly blend and re-blend previous metaphorical contributions. Second, it is not a product of isolated individual minds but a joint achievement that is co-constructed and co-elaborated by interlocutors. The third and most important point is that the emergent process of metaphorical creativity is tightly shaped by the contextual variables surrounding talk-in-interaction. It is grounded in the interlocutors' framework of interpretation. It is constrained by preceding contributions in a way that creates the textual cohesion of the verbal exchange, and it is also a goal-oriented process predefined by the communicative intention of each participant in a way that reveals the ideological coherence or incoherence of the entire conversation.
Keywords: Communicative intention, conceptual blending, contextual variables, the emergentist approach, ideological coherence, metaphorical creativity, textual cohesion.
4131 Nanobiocomposites with Enhanced Cell Proliferation and Improved Mechanical Properties Based on Organomodified-Nanoclay and Silicone Rubber
Authors: M. S. Hosseini, M. Tazzoli-Shadpour, I. Amjadi, A. A. Katbab, E. Jaefargholi-Rangraz
Abstract:
Bionanotechnology deals with nanoscopic interactions between nanostructured materials and biological systems. Polymer nanocomposites with optimized biological activity have attracted great attention. Nanoclay is considered a reinforcing nanofiller in the manufacturing of high-performance nanocomposites. In the current study, organomodified nanoclay with negatively charged silicate layers was incorporated into biomedical-grade silicone rubber. The nanoparticle loading was tailored to enhance cell behavior. Addition of the nanoparticles led to improved mechanical properties of the substrate, with enhanced strength and stiffness, while no toxic effects were observed. Results indicated improved viability and proliferation of cells upon addition of the nanofillers. The improved mechanical properties of the matrix result in a proper cell response through adjustment and arrangement of cytoskeletal fibers. The results can be applied in tissue engineering when enhanced substrates are required to improve cell behavior for in vivo applications.
Keywords: Biocompatibility, composite, organomodified nanoclay, proliferation.
4130 MIM: A Species Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes
Authors: Achraf El Allali, John R. Rose
Abstract:
A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental in gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information-theoretic measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and non-coding. In this paper, we describe a species-independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. The results demonstrate that MIM produces superior results while remaining species independent.
Keywords: Coding/non-coding classification, entropy, gene recognition, mutual information.
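One way to read the mutual information measure: for a fixed base separation k, coding DNA shows a characteristic statistical dependence (e.g. the period-3 codon bias) that non-coding DNA largely lacks. A minimal sketch of that statistic, as an illustration only and not the full iterative MIM classifier:

```python
from collections import Counter
from math import log2

def mutual_information(seq, k=3):
    """I(X;Y) between nucleotides separated by k positions in one sequence."""
    pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
    n = len(pairs)
    p_xy = {p: c / n for p, c in Counter(pairs).items()}
    c_x = Counter(x for x, _ in pairs)               # marginal counts
    c_y = Counter(y for _, y in pairs)
    # sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return sum(p * log2(p * n * n / (c_x[x] * c_y[y])) for (x, y), p in p_xy.items())

# Example: coding-like sequences tend to give a larger value at k = 3
print(mutual_information("ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG", k=3))
```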
4129 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal
Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert
Abstract:
We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow the parallel removal of multiple vertices, we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of calculating the vertex errors, ordering the error values, and removing vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
Keywords: Computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving.
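The per-vertex quadric error mentioned above can be sketched as follows; this is a CPU reference of the error computation only, with the parallel removal, per-vertex boundaries and GPU mapping omitted.

```python
import numpy as np

def vertex_quadrics(vertices, faces):
    """Accumulate the 4x4 plane quadrics Q_v = sum(p p^T) over incident faces."""
    Q = np.zeros((len(vertices), 4, 4))
    for tri in faces:
        a, b, c = vertices[tri]
        n = np.cross(b - a, c - a)
        n /= np.linalg.norm(n)
        p = np.append(n, -n.dot(a))            # plane [nx, ny, nz, d] of the face
        for v in tri:
            Q[v] += np.outer(p, p)
    return Q

def vertex_error(Q_v, pos):
    """Quadric error v^T Q v of a vertex at position pos."""
    vh = np.append(pos, 1.0)
    return vh @ Q_v @ vh

# Vertices with the lowest error are the candidates for parallel removal.
```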
4128 Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images
Authors: Ali Zifan, Panos Liatsis, Panagiotis Kantartzis, Manolis Gavaises, Nicos Karcanias, Demosthenes Katritsis
Abstract:
We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method for the enhancement of the vascular structure, based on the information in the Hessian matrix, is introduced. Hysteresis thresholding using different image quantiles is used to threshold the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines combined with disparity map information allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
Keywords: Vessel enhancement, centerline extraction, symbiotic reconstruction.
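A condensed sketch of the 2D stages (Hessian-based vessel enhancement, quantile hysteresis thresholding, thinning), using off-the-shelf scikit-image filters; the Frangi vesselness here stands in for the paper's own multiscale Hessian measure, and the quantiles are placeholder values.

```python
import numpy as np
from skimage.filters import frangi, apply_hysteresis_threshold
from skimage.morphology import skeletonize

def extract_centerlines(angiogram, low_q=0.90, high_q=0.98):
    """Vesselness (multiscale Hessian) -> quantile hysteresis -> thinning."""
    vesselness = frangi(angiogram, sigmas=range(1, 6))     # Hessian-based enhancement
    low, high = np.quantile(vesselness, [low_q, high_q])   # image-quantile thresholds
    arteries = apply_hysteresis_threshold(vesselness, low, high)
    return skeletonize(arteries)                            # 1-pixel-wide centerlines
```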
4127 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the patient-specific decision path (PSDP) entropy-based and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.
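A minimal sketch of the attribute-selection step implied above: at each node of the decision path, pick the attribute whose values best rank the outcome as measured by AUC. The full patient-specific decision path construction is not reproduced, and the helper below is hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def best_attribute_by_auc(X, y):
    """Return the column index whose values give the highest AUC against y.

    AUC is symmetric around 0.5, so an attribute that ranks the outcome in
    reverse order is just as informative; we score max(auc, 1 - auc).
    """
    scores = []
    for j in range(X.shape[1]):
        auc = roc_auc_score(y, X[:, j])
        scores.append(max(auc, 1.0 - auc))
    return int(np.argmax(scores)), max(scores)

# Toy example: column 1 carries the signal and is selected
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = np.column_stack([rng.normal(size=200), y + 0.5 * rng.normal(size=200)])
print(best_attribute_by_auc(X, y))
```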
4126 Proposing Problem-Based Learning as an Effective Pedagogical Technique for Social Work Education
Authors: Christine K. Fulmer
Abstract:
Social work education is competency-based in nature. There is an expectation that graduates of social work programs throughout the world are prepared to practice at a level of competence that is beneficial to the well-being of both individuals and the community. Experiential learning is one way to prepare students for competent practice. The use of Problem-Based Learning (PBL) is a form of experiential education that has been successful in a number of disciplines in bridging the gap between theoretical concepts in the classroom and the real world. PBL aligns with the constructivist theoretical approach to learning, which emphasizes the integration of new knowledge with the beliefs students already hold. In addition, the basic tenets of PBL correspond well with the practice behaviors associated with social work practice, including multi-disciplinary collaboration and critical thinking. This paper makes an argument for utilizing PBL in social work education.
Keywords: Constructivist theoretical approach, experiential learning, pedagogy, problem-based learning, social work education.
4125 Multi-Criteria Based Robust Markowitz Model under Box Uncertainty
Authors: Pulak Swain, A. K. Ojha
Abstract:
Portfolio optimization deals with the problem of efficient asset allocation. Risk and expected return are two conflicting criteria in such problems, where the investor prefers the return to be high and the risk to be low. Using a multi-objective approach, we can solve this type of problem. However, the information we have for the input parameters is generally ambiguous, and the input values can fluctuate around some nominal values. We cannot ignore the uncertainty in the input values, as it can affect the asset allocation drastically. So we use a robust optimization approach for problems whose input parameters come under box uncertainty. In this paper, we solve the multi-criteria robust problem with the help of the ε-constraint method.
Keywords: Portfolio optimization, multi-objective optimization, ε-constraint method, box uncertainty, robust optimization.
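A compact sketch of the ε-constraint step for the nominal (non-robust) bi-objective problem: minimize portfolio variance subject to a lower bound ε on expected return. The box-uncertainty robustification of the paper is not included, and the data below are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def epsilon_constraint_portfolio(mu, Sigma, eps):
    """min w' Sigma w  s.t.  mu' w >= eps,  sum(w) = 1,  0 <= w <= 1."""
    n = len(mu)
    cons = [{"type": "eq",   "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: mu @ w - eps}]
    res = minimize(lambda w: w @ Sigma @ w, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x

# Sweeping eps over the attainable return range traces out the efficient frontier.
mu = np.array([0.08, 0.12, 0.10])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
print(epsilon_constraint_portfolio(mu, Sigma, eps=0.10).round(3))
```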
4124 Speed Sensorless Control with a Linearization by State Feedback of Asynchronous Machine Using a Model Reference Adaptive System
Authors: A. Larabi, M. S. Boucherit
Abstract:
In this paper, we present the association of PI regulators for the speed and stator currents with a control strategy using linearization by state feedback for an induction machine without a speed sensor, together with an adaptation of the rotor resistance. The rotor speed is estimated using the model reference adaptive system (MRAS) approach. This method uses two models: the first is the reference model and the second is an adjustable one, in which two components of the stator flux, obtained from the measurement of the stator currents and voltages, are estimated. The estimated rotor speed is then obtained by canceling the difference between the stator flux of the reference model and that of the adjustable one. Satisfactory simulation results are obtained and discussed in this paper to highlight the proposed approach.
Keywords: Asynchronous actuator, PI regulator, adaptive method with reference model, vector control.
4123 Modeling of Bio Scaffolds: Structural and Fluid Transport Characterization
Authors: Sahba Sadir, M. R. A. Kadir, A. Öchsner, M. N. Harun
Abstract:
Scaffolds play a key role in tissue engineering and can be produced in many different ways depending on the applications and the materials used. Most researchers have used an experimental trial-and-error approach to new biomaterials, but computer simulation applied to tissue engineering can offer a more exhaustive approach to test and screen biomaterials. This paper develops models of scaffolds and uses Computational Fluid Dynamics to show the value of computer simulations in determining the influence of the geometrical scaffold parameters porosity, pore size, and pore shape on the permeability of scaffolds, the velocity magnitude, the pressure drop, the shear stress distribution and level, and the proper design of the scaffold geometry. This creates a need for more advanced studies in which the dynamic conditions of a microfluid passing through the scaffold are characterized for tissue engineering applications and for the differentiation of tissues within scaffolds.
Keywords: Scaffold engineering, tissue engineering, cellular structure, biomaterial, computational fluid dynamics.
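The permeability extracted from such simulations is conventionally reported through Darcy's law; a minimal statement of the relation, with the pressure drop and flow rate taken from the CFD run (a standard formula, not one quoted from the paper):

```latex
% Darcy's law: permeability k of a scaffold of length L and cross-section A
% from the simulated volumetric flow rate Q, fluid viscosity \mu and
% pressure drop \Delta P.
k = \frac{Q\,\mu\,L}{A\,\Delta P}
```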