Search results for: graphical representation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1482

972 Predicting Consolidation Coefficient of Busan Clay by Time-Displacement-Velocity Methods

Authors: Thang Minh Le, Hadi Khabbaz

Abstract:

The coefficient of consolidation is a parameter governing the rate at which saturated soil, particularly clay, undergoes consolidation when subjected to an increase in pressure. The rate and amount of compression in soil vary with the rate at which pore water is lost, and hence depend on soil permeability. Over many years, various methods have been proposed to determine the coefficient of consolidation, cv, which is an indication of the rate of foundation settlement on soft ground. However, defining this parameter is often problematic and relies heavily on graphical techniques, which are subject to some uncertainties. This paper initially presents an overview of many well-established methods for determining the vertical coefficient of consolidation from incremental loading consolidation tests. An array of consolidation tests was conducted on undisturbed clay samples collected at various depths from a site in the Nakdong river delta, Busan, South Korea. The consolidation test results on these soft, sensitive clay samples were employed to evaluate the targeted methods for predicting the settlement rate of Busan clay. Based on the time-displacement-velocity relationship, 10 common procedures were classified into three method groups and compared. A discussion of the study results is also provided.
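As an illustration of how the graphical constructions feed into cv, the two classical methods (Casagrande's log-time and Taylor's square-root-of-time) reduce to simple formulas once the characteristic time t50 or t90 has been read off the settlement curve. The sketch below is generic, with invented sample values rather than the Busan data:

```python
def cv_casagrande(h_dr_m, t50_s):
    """Casagrande log-time method: cv = T50 * Hdr^2 / t50 with T50 = 0.197."""
    return 0.197 * h_dr_m ** 2 / t50_s

def cv_taylor(h_dr_m, t90_s):
    """Taylor square-root-of-time method: cv = T90 * Hdr^2 / t90 with T90 = 0.848."""
    return 0.848 * h_dr_m ** 2 / t90_s

# Example with invented values: a 20 mm specimen drained on both faces
# (Hdr = 10 mm) reaching 90% consolidation after 30 minutes.
cv = cv_taylor(h_dr_m=0.01, t90_s=1800.0)   # units: m^2/s
```

The graphical step (locating t50 or t90 on the curve) is where the uncertainty noted in the abstract enters; the formulas themselves follow directly from Terzaghi's theoretical time factors.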

Keywords: Busan clay, coefficient of consolidation, constant rate of strain, incremental loading

Procedia PDF Downloads 183
971 The Representation of the Medieval Idea of Ugliness in Messiaen's Saint François d’Assise

Authors: Nana Katsia

Abstract:

This paper explores the ways both medieval and medievalist conceptions of ugliness might be linked to the physical and spiritual transformation of the protagonists, and how this is realised through specific musical rhythm, such as the dochmiac rhythm in the opera. As Eco and Henderson note, only one kind of ugliness could be represented in conformity with nature in the Middle Ages without destroying all aesthetic pleasure and, in turn, artistic beauty: namely, a form of ugliness which arouses disgust. Moreover, Eco explores the fact that the enemies of Christ who condemn, martyr, and crucify him are represented as wicked inside. In turn, the representation of inner wickedness and hostility toward God brings with it outward ugliness, coarseness, barbarity, and rage. Ultimately, these result in the deformation of the figure. In all these regards, the non-beautiful is represented here as a necessary phase, which is not the case with classical (ancient Greek) concepts of Beauty. As we can see, the understanding of disfigurement and ugliness in the Middle Ages was both varied and complex. In the Middle Ages, the disfigurement caused by leprosy (and other skin and bodily conditions) was interpreted, in a somewhat contradictory manner, as both a curse and a gift from God. Some saints’ lives even have the saint appealing to be inflicted with the disease as part of their mission toward true humility. We shall explore how this ‘different concept’ of ugliness (non-classical beauty) is represented in Messiaen’s opera. According to Messiaen, the Leper and Saint François are the principal characters of the third scene, as both of them will be transformed, and a double miracle will take place in the process. Messiaen mirrors the idea of the true humility of the Saint’s life and positions Le Baiser au Lépreux as the culmination of the first act. The Leper’s character embodies physical and spiritual disfigurement, both of which are healed after the miracle.
The scene can thus be viewed as an encounter between beauty and ugliness, and much of it is spent in a study of ugliness. Dochmiac rhythm is one of the most important compositional elements in the opera; it plays a crucial role in creating the dramatic musical narrative and structure of the composition. As such, we shall explore how Messiaen represents the medieval idea of ugliness in the opera through particular musical elements linked to the main protagonists’ spiritual or physical ugliness; why Messiaen makes reference to dochmiac rhythm; and how these elements create the musical and dramatic context in the opera for the medieval aesthetic category of ugliness.

Keywords: ugliness in music, medieval period, Saint François d’Assise, Messiaen

Procedia PDF Downloads 143
970 Creating a Digital Map to Monitor the Care of People Living with HIV/Aids in Porto Alegre, Brazil: An Experience Report

Authors: Tiago Sigal Linhares, Ana Amélia Nascimento da Silva Bones, Juliana Miola, McArthur Alexander Barrow, Airton Tetelbom Stein

Abstract:

Introduction: As a result of increased globalization and changing migration trends, it is expected that a significant portion of People Living with HIV/AIDS (PLWHA) will change their place of residence over time. In order to provide better health care, monitor the HIV epidemic, and plan urban public health care and policies, there is a growing need for a strategy to monitor PLWHA care, location, and migration patterns. The Porto Alegre district is characterized by a high prevalence of PLWHA and is considered one of the epicenters of the HIV epidemic in Latin America. Objectives: The aim of this study is to create an easily editable digital map in order to provide a visual representation of the location of PLWHA and to monitor their migration within the city and the country, in an effort to promote longitudinal care. Methods: This experience report used Google Maps Map Creator to generate an active digital map showing the location and changes in residence of 165 PLWHA who received care over the last four years at two Primary Health Care (PHC) clinics in downtown Porto Alegre, which serve an estimated population of five thousand patients. Their current addresses were found in the unified Brazilian health care system digital records (e-SUS) and updated on the map. Results: A digital map of PLWHA's current residence locations was created. It was possible to demonstrate visually the areas with a large concentration of PLWHA and the migration of this population within the city as well as to other cities, regions, and states. Conclusions: An easily reproducible and free map could aid in PLWHA monitoring, urban public health planning, targeted interventions, and situational diagnosis. Moreover, a visual representation of PLWHA location and migration could help bring more attention and investment to areas with geographic inequities or a higher prevalence of PLWHA.
It also enables local PHC units to be notified of monitored patients within their areas who are at clinical risk or have abandoned treatment, supporting active case finding and improving the care of PLWHA.

Keywords: health care, medical public health, theoretical and conceptual innovations, urban public health

Procedia PDF Downloads 118
969 On the Problems of Human Concept Learning within Terminological Systems

Authors: Farshad Badie

Abstract:

The central focus of this article is the fact that knowledge is constructed from the interaction between humans’ experiences and their conceptions of constructed concepts. The main contributions of this research are a logical characterisation of human inductive learning over constructed concepts within terminological systems and a logical background for theorising over the Human Concept Learning Problem (HCLP) in terminological systems. This research connects with the topics of ‘human learning’, ‘epistemology’, ‘cognitive modelling’, ‘knowledge representation’ and ‘ontological reasoning’.

Keywords: human concept learning, concept construction, knowledge construction, terminological systems

Procedia PDF Downloads 321
968 Design of a Real Time Heart Sounds Recognition System

Authors: Omer Abdalla Ishag, Magdi Baker Amien

Abstract:

Physicians use the stethoscope to listen to a patient's heart sounds in order to make a diagnosis. However, determining heart conditions with an acoustic stethoscope is a difficult task that requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system was realized in two phases: offline and real time. In the offline phase, 30 cases of heart sound files were collected from medical students and the doctor's world website. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, 17 normal cases and 13 with various pathologies; these 30 acquired signals were preprocessed using an adaptive filter to remove lung sounds. Background noise was removed from both the offline and real data using the wavelet transform; then graphical and statistical feature vector elements were extracted; finally, a look-up table was used to classify the heart sound cases. The implemented system showed accuracies of 90% and 80%, and sensitivities of 87.5% and 82.4%, for the offline and real data, respectively. The whole system was designed on a TMS320VC5509a DSP platform.
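The wavelet-denoising step described above can be illustrated, in much reduced form, by a single-level Haar transform with soft thresholding of the detail coefficients. The actual system runs a fuller wavelet decomposition on the DSP, so this pure-Python sketch is only a conceptual stand-in:

```python
import math
import statistics

def haar_denoise(signal, k=3.0):
    """Single-level Haar wavelet denoising: transform, soft-threshold the
    detail (noise-carrying) coefficients, inverse transform. The threshold
    is k times a robust noise estimate (median absolute detail / 0.6745)."""
    n = len(signal) - len(signal) % 2
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, n, 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, n, 2)]
    sigma = statistics.median(abs(d) for d in detail) / 0.6745
    thr = k * sigma
    detail = [math.copysign(max(abs(d) - thr, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out + list(signal[n:])   # pass any odd trailing sample through
```

After denoising, simple statistics of the cleaned signal (mean, variance, zero crossings) would populate the feature vector used in the look-up-table classification.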

Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform

Procedia PDF Downloads 436
967 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, information on the uncertain input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature, and different representation approaches result in different outputs; some approaches may estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (outputs) of the given computational model. We approach the problem with two methodologies: in the first, we use sampling-based uncertainty propagation with first-order error analysis; in the other, we place emphasis on Percentile-Based Optimization (PBO). The NASA Langley MUQC's subproblem A is constructed so that both aleatory and epistemic uncertainties must be managed. The challenge problem classifies each uncertain parameter as one of the following three types: (i) an aleatory uncertainty modeled as a random variable with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that may be aleatory but for which sufficient data are not available to model it adequately as a single random variable; for example, the parameters of a normal variable, e.g., the mean and standard deviation, may not be precisely known but can be assumed to lie within some intervals.
This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties. Each parameter of the random variable is an unknown element of a known interval, and this uncertainty is reducible. From the study, it is observed that, due to practical limitations and computational expense, the sampling in the sampling-based methodology is not exhaustive; the sampling-based methodology therefore has a high probability of underestimating the output bounds. Consequently, an optimization-based strategy for converting uncertainty described by interval data into a probabilistic framework is necessary, which is achieved in this study by using PBO.
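A minimal sketch of sampling-based propagation for a type (iii) parameter (a distributional p-box) is a double-loop scheme: the outer loop samples the epistemic interval parameters, the inner loop samples the aleatory variable. The model function and intervals below are invented for illustration; the NASA challenge model is a black box with its own inputs:

```python
import random

def model_response(a, e):
    """Stand-in computational model (assumed for illustration only)."""
    return a ** 2 + e

def propagate(n_outer=200, n_inner=500, seed=1):
    """Double-loop (nested) sampling over a distributional p-box: the outer
    loop samples the epistemic parameter (here a poorly known mean in
    [-1, 1]), the inner loop samples the aleatory variable. Returns the
    interval spanned by the 95th percentile of the response."""
    rng = random.Random(seed)
    p95s = []
    for _ in range(n_outer):
        mu = rng.uniform(-1.0, 1.0)           # epistemic: interval-valued mean
        samples = sorted(model_response(rng.gauss(mu, 0.5),
                                        rng.uniform(0.0, 0.1))
                         for _ in range(n_inner))
        p95s.append(samples[int(0.95 * n_inner)])
    return min(p95s), max(p95s)
```

The finite outer loop is exactly where the abstract's warning applies: unless the epistemic sampling is exhaustive, the returned interval tends to understate the true output bounds, which motivates the optimization-based (PBO) alternative.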

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 235
966 Reverse Logistics Information Management Using Ontological Approach

Authors: F. Lhafiane, A. Elbyed, M. Bouchoum

Abstract:

The Reverse Logistics (RL) process is a complex and dynamic network involving many stakeholders, such as suppliers, manufacturers, warehouses, retailers, and customers; this complexity is inherent in the process due to a lack of perfect knowledge and to conflicting information. Ontologies, on the other hand, can be considered an approach to overcoming the problems of knowledge sharing and communication among the various reverse logistics partners. In this paper, we propose a semantic representation based on a hybrid architecture for building the ontologies in a bottom-up way; this method facilitates semantic reconciliation between the heterogeneous information systems (ICT) that support reverse logistics processes and product data.

Keywords: reverse logistics, information management, heterogeneity, ontologies, semantic web

Procedia PDF Downloads 489
965 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequence reads from the fragmented DNA, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach, metagenome2vec, that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction.
Using two public real-life data sets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
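Step (i) of the pipeline, building a k-mer vocabulary, can be sketched in a few lines. The tokenization below is the standard overlapping-k-mer scheme; the embedding model that would consume these ids (word2vec-style or otherwise) is left out, and k=4 is just an illustrative choice:

```python
def kmer_tokens(read, k=4):
    """Split a DNA read into its overlapping k-mers (the 'words' of step i)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def build_vocab(reads, k=4):
    """Assign each distinct k-mer an integer id; numerical embeddings would
    then be learned over these ids in the subsequent steps."""
    vocab = {}
    for read in reads:
        for kmer in kmer_tokens(read, k):
            vocab.setdefault(kmer, len(vocab))
    return vocab
```

A read of length L yields L - k + 1 tokens, so even modest k keeps the vocabulary bounded at 4^k distinct DNA k-mers while preserving local sequence context.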

Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine

Procedia PDF Downloads 121
964 Undocumented Migrants on the Northern Border of Mexico: Social Imaginary, and Social Representations

Authors: César Enrique Jiménez Yañez, Yessica Martinez Soto

Abstract:

In the present work, the phenomenon of undocumented migration across the northern border of Mexico is analyzed through graphical representation of the experience of people who migrate in an undocumented way to the United States. Thirty-three of them drew what migrating meant to them. Our objective is to analyze the social phenomenon of migration through the drawings of migrants, using the concepts of social imaginary and social representations and identifying the significant elements with which they symbolically construct their experience. Drawing, as a methodological tool, helps us understand the migratory experience beyond words.

Keywords: Mexico, social imaginary, social representations, undocumented migrants

Procedia PDF Downloads 390
963 Portrayal of Women in Television Advertisement

Authors: Priya Sarah Vijoy

Abstract:

The aim of this study is to analyze the portrayal of women in television advertisements. Advertising dates back several hundred years; right from the beginning, the seller wanted his goods to be sold and used various techniques to achieve that objective. Advertisements have consistently confined women to traditional mother, home, or beauty/sex-oriented roles that are not representative of women’s diversity. Currently, television stereotyping of women is one of the dominant forces in the media that degrade women and limit their representation. This study therefore analyzes how women are portrayed in television advertisements and examines whether the roles of women in these advertisements are related to the product.

Keywords: advertising, stereotyping, television, women

Procedia PDF Downloads 434
962 The Geometrical Cosmology: The Projective Cast of the Collective Subjectivity of the Chinese Traditional Architectural Drawings

Authors: Lina Sun

Abstract:

Chinese traditional drawings related to buildings and construction apply a unique geometry, differing from western Euclidean geometry, and embrace a collection of special terminologies under the category of tu (the Chinese character for drawing). This paper will, on one side, etymologically analyse the terminologies of Chinese traditional architectural drawing and, on the other, geometrically deconstruct the composition of tu and locate the visual narrative language of tu in the pictorial tradition. The geometrical analysis will center on a selected series of Yang-shi-lei tu for the construction of emperors’ mausoleums in the Qing Dynasty (1636-1912), and will also draw on earlier architectural drawings and architectural paintings, such as jiehua and paintings on religious and tomb frescoes, for comparison. By doing this, the research will reveal that the terminologies corresponding to different geometrical forms indicate associations between architectural drawing and the philosophy of Chinese cosmology, and that the arrangement of the geometrical forms in the picture plane facilitates expression of the concepts of space and position in the geometrical cosmology. These associations and expressions are the collective intentions of architectural drawing, evolving over thousands of years of unbroken tradition and irrelevant to individual authorship. Moreover, the architectural tu itself, as an entity, not only functions as a representation of buildings but also expresses and strengthens intentions by using the unique Chinese geometrical language flexibly and intentionally. These collective cosmological spatial intentions and the corresponding geometrical words and languages reveal that Chinese traditional architectural drawing functions as a unique architectural site with subjectivity, which exists in parallel with buildings and expresses intentions and meanings by itself.
The methodology and findings of this research will therefore challenge previous research that treats architectural drawings merely as representations of buildings and uses them only as evidence for reconstructing information about buildings. Furthermore, this research situates architectural drawing between the study of Chinese technological tu and that of artistic painting, bridging two academic areas that have usually treated the partial features of architectural drawing separately. Beyond this research, the collective subjectivity of Chinese traditional drawings will help reveal the transition from tradition to drawing modernity, where the individual subjective identities and intentions of architects arise. This research thus supports an understanding of both the ambivalence and the affinity of drawing modernity as it encounters tradition.

Keywords: Chinese traditional architectural drawing (tu), etymology of tu, collective subjectivity of tu, geometrical cosmology in tu, geometry and composition of tu, Yang-shi-lei tu

Procedia PDF Downloads 117
961 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity and come from a variety of sources. Public administrations use (big) data, implement base registries, and enforce data sharing across the entire government to deliver (big) data-related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services, such as data storage and hosting, to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of its actors and their roles. We showcase a graphical view of the actors, roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We found few published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles; therefore, we drew ideas for the government (big) data ecosystem from numerous related areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 155
960 BIM Model and Virtual Prototyping in Construction Management

Authors: Samar Alkindy

Abstract:

Purpose: The BIM model has been used to support the planning of different construction projects in the industry by showing the different stages of the construction process. The model has been instrumental in identifying some of the common errors in the construction process through the spatial arrangement. The continuous use of the BIM model in the construction industry has resulted in various radical changes, such as virtual prototyping. Construction virtual prototyping is a highly advanced technology that incorporates a BIM model with realistic graphical simulations and facilitates simulation of the project before a product is built in the factory. The paper presents virtual prototyping in the construction industry by examining its application, challenges, and benefits to a construction project. Methodological approach: A case study was conducted on four major construction projects that incorporate virtual construction prototyping in several stages of the construction project; in addition, interviews were conducted with the project managers, engineers, and planning managers. Findings: Data collected through this approach show a positive response to virtual construction prototyping in construction, especially concerning communication and visualization. Furthermore, the use of virtual prototyping has increased collaboration and efficiency between the construction experts handling a project. During the planning stage, virtual prototyping has increased accuracy, reduced planning time, and reduced the amount of rework during the implementation stage. Although virtual prototyping is a new concept in the construction industry, the findings indicate that the approach will benefit the management of construction projects.

Keywords: construction operations, construction planning, process simulation, virtual prototyping

Procedia PDF Downloads 227
959 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

The Bayesian Network (BN) is one of the most efficient classification methods and is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is defined as a probabilistic graphical model that represents a formalism for reasoning under uncertainty, and this classification method has a high performance rate in extracting new knowledge from data. Construction of the model consists of two phases: structure learning and parameter learning. For the structure learning problem, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach; in addition, integrating expert knowledge into the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of a BN. Evaluation of the experimental results on well-known benchmarks shows that our K2PC algorithm has better performance in terms of correct structure detection. A real application of our model demonstrates its efficiency in analysing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
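The score-and-search idea behind K2 (and hence K2PC) rests on a decomposable scoring function evaluated per node and candidate parent set. A sketch of the standard Cooper-Herskovits (K2) log-score, not the authors' K2PC variant, might look like:

```python
import math
from collections import Counter

def log_k2_score(data, child, parents, arity):
    """Log of the K2 (Cooper-Herskovits) score of one node for a candidate
    parent set. data: list of dicts mapping variable -> discrete state
    (0..r-1); arity: dict variable -> number of states. Higher is better."""
    r = arity[child]
    counts, totals = Counter(), Counter()
    for row in data:
        cfg = tuple(row[p] for p in parents)   # observed parent configuration
        counts[(cfg, row[child])] += 1         # N_ijk
        totals[cfg] += 1                       # N_ij
    score = 0.0
    for cfg, n_ij in totals.items():
        # (r-1)! / (N_ij + r - 1)!  in log form, via lgamma
        score += math.lgamma(r) - math.lgamma(n_ij + r)
        for state in range(r):
            score += math.lgamma(counts[(cfg, state)] + 1)   # log N_ijk!
    return score
```

The search half of K2 then greedily adds, for each node, the parent that most increases this score until no addition helps; K2PC restricts and orders the candidate parents using the parents-and-children neighborhood, per the abstract.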

Keywords: Bayesian network, classification, expert knowledge, structure learning, surface water analysis

Procedia PDF Downloads 124
958 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbon from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process; it finds application whenever one needs to make an estimate, forecast, or decision where there is significant uncertainty. First, the project performs Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, an algorithm for the simulation has been developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, minimum, maximum, most likely, etc.), and for the desired probability for which reserves are to be calculated. The algorithm so developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better output. With PyQt's Qt Designer, code for a simple graphical user interface was also written. The resulting plot was then validated against the results from the U.S. DOE MonteCarlo software.
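The prompt-for-distributions workflow described above can be sketched as a volumetric Monte Carlo estimate. Apart from the standard 7758 bbl/acre-ft conversion factor, the distributions and their parameters below are illustrative assumptions, not the study's data set:

```python
import random

def simulate_reserves(n=100_000, seed=42):
    """Volumetric Monte Carlo estimate of oil in place,
    N = 7758 * A * h * phi * (1 - Sw) / Bo  (STB, with A in acres, h in ft).
    All input distributions are illustrative assumptions only."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        area = rng.triangular(400.0, 800.0, 600.0)   # acres (low, high, mode)
        h = rng.triangular(20.0, 60.0, 35.0)         # net pay thickness, ft
        phi = rng.uniform(0.12, 0.25)                # porosity
        sw = rng.uniform(0.20, 0.40)                 # water saturation
        bo = rng.uniform(1.1, 1.4)                   # formation volume factor
        out.append(7758.0 * area * h * phi * (1.0 - sw) / bo)
    out.sort()
    # Percentiles of the simulated distribution (10th, 50th, 90th)
    return {p: out[int(p / 100 * n)] for p in (10, 50, 90)}
```

Reading off the sorted results at chosen percentiles is exactly the "desired probability for which reserves are to be calculated" step the program prompts for; industry convention maps these to the P90/P50/P10 reserve categories.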

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 375
957 Experimental and Graphical Investigation on Oil Recovery by Buckley-Leverett Theory

Authors: Khwaja Naweed Seddiqi, Zabihullah Mahdi, Shigeo Honma

Abstract:

Increasing oil production from petroleum reservoirs has recently become one of the most important issues in the global energy sector. This paper therefore considers the recovery of oil from a petroleum reservoir by the waterflooding technique. To investigate this phenomenon, the relative permeability of two immiscible fluids in sand was measured in the laboratory by the steady-state method. Two sorts of oil, kerosene and heavy oil, were pumped simultaneously with water into a vertical sand column at different pumping ratios. From the change in fractional discharge measured at the outlet, a method for determining the relative permeability was developed, focusing on the displacement mechanism in sand. The displacement mechanism of two immiscible fluids in sand was then investigated under the Buckley-Leverett frontal displacement theory and by laboratory experiment. Two sorts of experiments were carried out: the displacement of pore water by oil, and the displacement of pore oil by water. It is revealed that the relative permeability curves display appreciably different shapes owing to the properties of the oils, and produce different amounts of residual oil and irreducible water saturation.
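Central to the Buckley-Leverett analysis is the water fractional flow curve built from the relative permeabilities. A sketch with assumed Corey-type curves (the laboratory-measured curves in the study would replace these) might be:

```python
def fractional_flow(sw, mu_w=1.0, mu_o=10.0, swc=0.2, sor=0.2, n=2):
    """Water fractional flow f_w = 1 / (1 + (k_ro/k_rw) * (mu_w/mu_o)),
    with simple Corey-type relative permeabilities (assumed exponent n).
    swc: connate water saturation, sor: residual oil saturation."""
    se = (sw - swc) / (1.0 - swc - sor)        # normalized water saturation
    se = min(max(se, 1e-9), 1.0 - 1e-9)        # clamp to the mobile range
    krw = se ** n                              # water relative permeability
    kro = (1.0 - se) ** n                      # oil relative permeability
    return 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))
```

The shape of f_w against Sw is what the frontal-displacement construction (Welge tangent) operates on, and the higher oil viscosity of the heavy-oil case shifts the curve and hence the predicted front saturation and recovery.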

Keywords: petroleum reservoir engineering, relative permeability, two-phase flow, immiscible displacement in porous media, steady-state method, waterflooding

Procedia PDF Downloads 243
956 A Brief Study about Nonparametric Adherence Tests

Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim

Abstract:

Statistical study has become indispensable in various fields of knowledge. Geotechnics is no different: the study of probabilistic and statistical methods has gained ground given their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that significantly represents the sampled data. To be able to discard poorly fitting distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set to test their fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
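Of the three tests, the Kolmogorov-Smirnov statistic is the simplest to state: the maximum distance between the empirical CDF of the sample and the candidate CDF. A minimal stdlib sketch (the Anderson-Darling and Cramér-von Mises statistics weight this distance differently rather than taking the maximum):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of the normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov D: the largest gap between the empirical CDF of
    the sample and the candidate CDF, checked on both sides of each jump."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, (i + 1) / n - f, f - i / n)
    return d
```

A large D relative to the test's critical value is grounds for discarding the candidate distribution, which is how a poorly fitting normal assumption would be rejected in the paper's setting.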

Keywords: Kolmogorov-Smirnov test, Anderson-Darling test, Cramer-Von-Mises test, nonparametric adherence tests

Procedia PDF Downloads 440
955 The Multifunctional Medical Centers’ Architectural Shaping

Authors: Griaznova Svetlana, Umedov Mekhroz

Abstract:

The current trend in healthcare facilities is the creation of multidisciplinary large-scale centers that provide the maximum possible services in one place, minimizing the number of stops on the patient's treatment path. Multifunctional medical centers are mainly designed within urban infrastructure for good accessibility. However, the many functions and connections that define the building's shape often make it inharmonious, which mars the city's appearance. The purpose of this research is to scientifically substantiate the factors influencing the shaping of such buildings, to formulate principles for their architectural solutions, and to develop recommendations and principles for the design of multifunctional medical centers. The result of the research is the elaboration of principles for architectural and planning solutions and the determination of the factors affecting the shaping of multifunctional healthcare facilities. Research method: study and generalization of international experience in scientific research, literature, standards, teaching aids, and design materials on the topic of the research; an integrated approach to the study of existing international experience with multidisciplinary medical centers; elaboration of graphical analyses and diagrams based on systematic analysis of the processed information; and identification of methods and principles for the functional zoning of nuclear medicine centers.

Keywords: health care, multifunctionality, form, medical center, hospital, PET, CT scan

Procedia PDF Downloads 100
954 Optimization of Black-Litterman Model for Portfolio Assets Allocation

Authors: A. Hidalgo, A. Desportes, E. Bonin, A. Kadaoui, T. Bouaricha

Abstract:

The present paper is concerned with portfolio management using the Black-Litterman (B-L) model. The stocks considered are limited exclusively to large-company stocks on the US market. Results obtained by applying the model are presented. From an analysis of collected Dow Jones stock data, a remarkably explicit analytical expression is proposed for the optimal B-L parameter τ, which scales the dispersion of the normal distribution of mean asset returns, in terms of the standard deviation of the covariance matrix. The implementation was developed in the Matlab environment so as to separate optimization in the Markowitz sense from the elements specific to the B-L representation.
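
The B-L posterior mean that the model produces can be sketched in a few lines of NumPy. The covariance matrix, market weights, risk aversion, view, and the value of τ below are toy values chosen for illustration, not the Dow Jones figures analyzed in the paper:

```python
import numpy as np

# Toy 3-asset covariance matrix and market-cap weights (illustrative only)
Sigma = np.array([[0.040, 0.012, 0.010],
                  [0.012, 0.030, 0.008],
                  [0.010, 0.008, 0.020]])
w_mkt = np.array([0.5, 0.3, 0.2])
delta = 2.5                       # risk-aversion coefficient
tau = 0.05                        # scales uncertainty of the prior

pi = delta * Sigma @ w_mkt        # implied equilibrium excess returns

# One investor view: asset 1 outperforms asset 3 by 2%
P = np.array([[1.0, 0.0, -1.0]])
Q = np.array([0.02])
Omega = tau * P @ Sigma @ P.T     # 1x1 view-uncertainty matrix

# Black-Litterman posterior mean of returns
inv = np.linalg.inv
A = inv(tau * Sigma)
mu_bl = inv(A + P.T @ inv(Omega) @ P) @ (A @ pi + P.T @ inv(Omega) @ Q)
print(np.round(mu_bl, 4))
```

With this choice of Omega the posterior return spread between assets 1 and 3 lands between the equilibrium spread and the stated view, which is the expected blending behavior of the model.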

Keywords: Black-Litterman, Markowitz, market data, portfolio manager opinion

Procedia PDF Downloads 254
953 The Miller Umwelt Assessment Scale: A Tool for Planning Interventions for Children on the Autism Spectrum

Authors: Sonia Mastrangelo

Abstract:

The Miller Umwelt Assessment Scale (MUAS) is a useful tool for obtaining information about the developmental capacities of children on the autism spectrum. The assessment, made up of 19 tasks in the areas of body organization, contact with surroundings, expressive and receptive communication, representation, and social-emotional development, has been used with much success over the past 40 years. While many assessments are difficult to administer to children on the autism spectrum, the simplicity of the MUAS reveals key strengths and challenges for both low- and high-functioning children on the spectrum. The results guide parents and clinicians in providing a curriculum and/or home program that moves children up the developmental ladder.

Keywords: autism spectrum disorder, assessment, reading intervention, Miller method

Procedia PDF Downloads 530
952 The Representation of Young Sports Heroines in Cinema: Analysis of a Regressive Portrayal of Young Sportswomen on the Screen

Authors: David Sudre

Abstract:

Sport in cinema, like sport in society, has mainly concerned men and masculinity. Whether in the boxing ring, on basketball playgrounds, or on soccer fields, sports films have mostly focused on the trials and tribulations of male athletes, with women generally confined to secondary, often devalued and devaluing roles: the loving woman indispensable to the victorious athlete, the dangerous femme fatale, or the woman as sexual object. For more than a century, this film genre has thus symbolized the dominant values of patriotism and heroism while helping to construct an ideal of hegemonic masculinity. With the exception of films such as The Grand National (1944) and Million Dollar Baby (2004), the most commercially successful films tell stories of men's adventures in sports. Today, thanks in part to the struggles of the feminist movement and subsequent societal advances, women are appearing in increasingly prominent roles in sports films. Indeed, there seems to be a general shift in popular cinema toward women playing major characters in big-budget productions that have also achieved critical and commercial success. However, if at first sight the increase in the number of roles given to women suggests an evolution and a more positive image of them on screen, it remains to be seen how young women in major roles are actually represented in this type of film. To answer this question, we rely on the results of research conducted on a corpus of 28 sports films in which a young woman plays the main role. All of these productions are fictional (not documentary), mostly American, and distributed by major film studios.
The chosen teen sports movies are among the biggest commercial successes of the genre; they aim to maximize profit and occupy the most dominant positions within the "commercial pole" of the cinematic field. Although the number of leading roles granted to sportswomen has grown over recent decades, this research allows us to decode the sociological subtext of these popular teen sports films. The aim is to reveal how these films convey a conservative ideology that contributes, on the one hand, to the maintenance of patriarchy and, on the other, to the dissemination of stereotyped, negative, and regressive images of young women athletes.

Keywords: cinema, sport, gender, youth, representations, inequality, stereotypes

Procedia PDF Downloads 66
951 Modeling Bessel Beams and Their Discrete Superpositions from the Generalized Lorenz-Mie Theory to Calculate Optical Forces over Spherical Dielectric Particles

Authors: Leonardo A. Ambrosio, Carlos H. Silva Santos, Ivan E. L. Rodrigues, Ayumi K. de Campos, Leandro A. Machado

Abstract:

In this work, we propose an algorithm, developed in the Python language, for modeling ordinary scalar Bessel beams and their discrete superpositions, and for the subsequent calculation of the optical forces they exert on dielectric spherical particles. The mathematical formalism, based on the generalized Lorenz-Mie theory, is implemented in Python because of its large number of free mathematical (e.g., SciPy and NumPy), data visualization (Matplotlib and PyJamas), and multiprocessing libraries. We also propose an approach, provided as synchronized Software as a Service (SaaS) in cloud computing, to develop a user interface embedded in a mobile application, thus providing users with the means to easily enter the desired unknowns and parameters and see the graphical outcomes of the simulations right on their mobile devices. Initially proposed as a free Android-based application, the app enables data post-processing in cloud-based architectures and visualization of results, figures, and numerical tables.
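
The scalar zeroth-order Bessel beam at the core of such a model can be sketched with SciPy's Bessel function. The wavelength and axicon angle below are assumptions chosen for illustration, and a full GLMT treatment would expand the beam in beam-shape coefficients rather than evaluate the field directly:

```python
import numpy as np
from scipy.special import jv

# Zeroth-order scalar Bessel beam psi(rho, z) = J0(k_rho * rho) * exp(i k_z z)
wavelength = 1064e-9                 # m, a common optical-trapping wavelength
k = 2 * np.pi / wavelength
axicon_angle = np.deg2rad(1.0)       # small axicon angle (illustrative)
k_rho = k * np.sin(axicon_angle)     # transverse wavenumber
k_z = k * np.cos(axicon_angle)       # longitudinal wavenumber

rho = np.linspace(0, 50e-6, 500)     # radial coordinate, 0 to 50 um
z = 0.0
psi = jv(0, k_rho * rho) * np.exp(1j * k_z * z)
intensity = np.abs(psi) ** 2

print(f"on-axis intensity: {intensity[0]:.3f}")
print(f"first intensity minimum near rho = {rho[np.argmin(intensity)]*1e6:.1f} um")
```

The first dark ring falls where the J0 argument reaches its first zero (about 2.405), i.e. near rho = 2.405 / k_rho, which is a quick sanity check on the transverse wavenumber.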

Keywords: Bessel beams, frozen waves, generalized Lorenz-Mie theory, numerical methods, optical forces

Procedia PDF Downloads 375
950 Evaluating India's Smart Cities against the Sustainable Development Goals

Authors: Suneet Jagdev

Abstract:

The 17 Sustainable Development Goals were adopted by world leaders in September 2015 at the United Nations Sustainable Development Summit. These goals were adopted by UN member states to promote prosperity, health, and human rights while protecting the planet. Around the same time, the Government of India launched the Smart City Initiative to speed up the development of state-of-the-art infrastructure and services in 100 cities, with a focus on sustainable and inclusive development. These cities are meant to become role models for other cities in India and to promote sustainable regional development. This paper examines the goals set under the Smart City Initiative and evaluates them against the Sustainable Development Goals, using case studies of selected Smart Cities in India. The study concludes that most Smart City projects at present consist of individual solutions to individual problems identified in a community, rather than comprehensive models for the complex issues facing cities across India. A systematic, logical, and comparative analysis was carried out on relevant literature and data collected from government sources and papers, research papers by experts on the topic, and the results of several online surveys. Case studies are used for a graphical analysis highlighting the issues of migration, ecology, economy, and social equity in these Smart Cities.

Keywords: housing, migration, smart cities, sustainable development goals, urban infrastructure

Procedia PDF Downloads 405
949 Designing and Implementation of MPLS Based VPN

Authors: Muhammad Kamran Asif

Abstract:

MPLS stands for Multi-Protocol Label Switching. It is the technology that replaced ATM (Asynchronous Transfer Mode) and Frame Relay. In this paper, we have designed a full-fledged, small-scale model of an MPLS-based service provider core network, which delivers communication services (e.g., voice, video, and data) to customers more efficiently using the label switching technique. MPLS VPN provides security to customers, whether on a LAN or a WAN: it protects each customer's sites from outside intruders while extending a private network over the Internet. We implemented the service provider network using minimal resources, i.e., five Cisco 3800 series routers comprising the service provider core, provider edge routers, and customer edge routers. Customers at one end of the network can send any kind of data to customers at the other end through the MPLS-VPN-enabled service provider cloud. We performed simulation and emulation of the model using GNS3 (Graphical Network Simulator-3) and reproduced real-time scenarios. We also deployed an NMS (network management system) that monitors the service provider cloud and raises alarms in case of intrusion or malfunction in the network. Moreover, we provided a video help-desk facility between customers and the service provider to resolve network issues more effectively.

Keywords: MPLS, VPN, NMS, ATM (asynchronous transfer mode)

Procedia PDF Downloads 328
948 Knowledge Diffusion via Automated Organizational Cartography (Autocart)

Authors: Mounir Kehal

Abstract:

The post-globalization epoch has placed businesses everywhere in new and different competitive situations, in which knowledgeable, effective, and efficient behavior provides the competitive and comparative edge. Enterprises have turned to explicit (and even attempts to conceptualize tacit) knowledge management to elaborate a systematic approach to developing and sustaining the intellectual capital needed to succeed. Doing so requires the ability to visualize an organization as consisting of nothing but knowledge and knowledge flows, presented in a graphical and visual framework referred to as automated organizational cartography. This creates the ability to actively classify existing organizational content evolving from and within data feeds, in an algorithmic manner, potentially yielding insightful schemes and dynamics by which organizational know-how is visualized. We discuss and elaborate on the most recent and applicable definitions and classifications of knowledge management, representing a wide range of views from the mechanistic (systematic, data-driven) to the more social (psychological, cognitive/metadata-driven). More elaborate continuum models, for knowledge acquisition and reasoning purposes, are used to effectively represent the domain of information that end users may draw on in their decision-making when utilizing available organizational intellectual resources (i.e., Autocart). In this paper, we present an empirical research study conducted previously to explore knowledge diffusion in a specialist knowledge domain.

Keywords: knowledge management, knowledge maps, knowledge diffusion, organizational cartography

Procedia PDF Downloads 301
947 Application of Vector Representation for Revealing the Richness of Meaning of Facial Expressions

Authors: Carmel Sofer, Dan Vilenchik, Ron Dotsch, Galia Avidan

Abstract:

Studies investigating emotional facial expressions typically reveal consensus among observers regarding the meaning of basic expressions, whose number ranges from 6 to 15 emotional states. Given this limited number of discrete expressions, how is it that the human vocabulary of emotional states is so rich? The present study argues that perceivers use sequences of these discrete expressions as the basis for a much richer vocabulary of emotional states. Mechanisms in which a relatively small number of basic components is expanded into a much larger number of possible combinations of meanings exist in other human communication modalities, such as spoken language and music. In these modalities, letters and notes, the basic components of spoken language and music respectively, are temporally linked, producing rich expression. In the current study, participants were presented in each trial with sequences of two images containing facial expressions in different combinations, sampled from the eight static basic expressions (64 in total; 8×8). In each trial, participants were required to judge, using a single word, the 'state of mind' portrayed by the person whose face was presented. Using word embedding methods (Global Vectors for Word Representation, GloVe) employed in the field of natural language processing, and relying on machine learning methods, we found that the perceived meanings of the sequences of facial expressions were a weighted average of the single expressions comprising them, yielding 22 new emotional states in addition to the eight classic basic expressions. An interaction between the first and second expression in each sequence indicated that each facial expression modulated the effect of the other, leading to a different interpretation ascribed to the sequence as a whole.
These findings suggest that the vocabulary of emotional states conveyed by facial expressions is not restricted to the small number of discrete facial expressions; rather, the vocabulary is rich, as it results from combinations of these expressions. In addition, the present research suggests that word embedding can be a powerful, accurate, and efficient tool in social perception studies for capturing explicit and implicit perceptions and intentions. Acknowledgment: the study was supported by a grant from the Ministry of Defense in Israel to GA and CS. CS is also supported by the ABC initiative at Ben-Gurion University of the Negev.
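
The weighted-average mechanism can be illustrated with toy vectors. The 4-dimensional vectors, the word set, and the equal weights below are invented for this sketch; the study used real GloVe embeddings and empirically derived weights:

```python
import numpy as np

# Toy 4-d "embedding" vectors standing in for GloVe vectors of emotion words
# (real GloVe vectors are 50- to 300-dimensional, trained on large corpora)
emb = {
    "anger":    np.array([ 0.9, 0.1, -0.2, 0.0]),
    "fear":     np.array([ 0.2, 0.8, -0.1, 0.1]),
    "surprise": np.array([-0.1, 0.6,  0.7, 0.2]),
    "alarm":    np.array([ 0.3, 0.8,  0.2, 0.1]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Weighted average of a two-expression sequence (illustrative equal weights)
w_first, w_second = 0.5, 0.5
blend = w_first * emb["fear"] + w_second * emb["surprise"]

# Which single word best matches the blended meaning?
best = max(emb, key=lambda w: cosine(emb[w], blend))
print(best, round(cosine(emb[best], blend), 3))
```

With these toy vectors the fear + surprise sequence lands closest to "alarm", a word that is neither of the two inputs, which is the flavor of the paper's finding that sequences yield new emotional states.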

Keywords: GloVe, face perception, facial expression perception, facial expression production, machine learning, word embedding, word2vec

Procedia PDF Downloads 174
946 The Processing of Implicit Stereotypes in Contexts of Reading, Using Eye-Tracking and Self-Paced Reading Tasks

Authors: Magali Mari, Misha Muller

Abstract:

The present study’s objectives were to determine how various implicit stereotypes affect the processing of written information and linguistic inferential processes such as presupposition accommodation. When reading a text, one constructs a representation of the described situation, which is then updated according to new input and based on stereotypes inscribed within society. If new input contradicts stereotypical expectations, the representation must be corrected, resulting in longer reading times. A similar process occurs in linguistic inferential processes like presupposition accommodation. Presupposition accommodation is traditionally regarded as fast, automatic processing of background information (e.g., ‘Mary stopped eating meat’ is quickly processed as implying that Mary used to eat meat). However, very few accounts have investigated whether this process is influenced by domains of social cognition such as implicit stereotypes. To study the effects of implicit stereotypes on presupposition accommodation, adults were recorded while they read sentences in French, combining two methods: an eye-tracking task and a classic self-paced reading task (in which participants read sentence segments at their own pace by pressing a computer key). In one condition, presuppositions were activated with the French definite articles ‘le/la/les’, whereas in the other condition the French indefinite articles ‘un/une/des’ were used, triggering no presupposition. Using a definite article presupposes that the object has already been mentioned and is thus part of background information, whereas using an indefinite article is understood as introducing new information. Two types of stereotypes were examined in order to enlarge the scope of stereotypes traditionally analyzed. Study 1 investigated gender stereotypes linked to professional occupations to replicate previous findings. Study 2 focused on nationality-related stereotypes (e.g., ‘the French are seducers’ versus ‘the Japanese are seducers’) to determine whether the effects of implicit stereotypes on reading generalize to other types of implicit stereotypes. The results show that reading was influenced by both types of implicit stereotypes: in both studies, reading pace slowed when a counter-stereotype was presented. However, presupposition accommodation did not affect participants’ processing of the information. Altogether, these results show that (a) implicit stereotypes affect the processing of written information regardless of the type of stereotype presented, and (b) implicit stereotypes prevail over the superficial linguistic treatment of presuppositions, which suggests faster processing of social information compared to linguistic information.

Keywords: eye-tracking, implicit stereotypes, reading, social cognition

Procedia PDF Downloads 190
945 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines

Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.

Abstract:

Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, the number of strokes, and variations in stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, especially in Kerala and the Lakshadweep islands. In this paper, we consider significant feature extraction for the similar stroke styles of Malayalam. The extracted feature set is also suitable for recognizing other handwritten South Indian languages such as Tamil, Telugu, and Kannada. A classification scheme based on support vector machines (SVM), which are well suited to real-world applications, is proposed to improve the accuracy of classification and recognition of online Malayalam handwritten characters. The contribution of various features to recognition accuracy is analysed, and performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles were collected for each of the 44 alphabets. Various features were extracted and used for classification after preprocessing of the input data samples. The highest recognition accuracy of 97% was obtained experimentally with the best feature combination and a polynomial kernel in the SVM.
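
As a stand-in for the Malayalam stroke features (which are not public), the same kernel comparison can be sketched on scikit-learn's bundled digits data. The accuracies here say nothing about Malayalam; the sketch only illustrates the SVC training and kernel-comparison workflow:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# 8x8 handwritten-digit images replace the paper's stroke-feature vectors
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Compare kernels, as the paper does; degree applies to the polynomial kernel
for kernel in ("linear", "rbf", "poly"):
    clf = SVC(kernel=kernel, degree=3, C=1.0, gamma="scale")
    clf.fit(X_train, y_train)
    print(f"{kernel:>6}: {clf.score(X_test, y_test):.3f}")
```

On real stroke data, the feature extraction step (resampling, normalization, direction features, and so on) would precede this and typically matters more than the kernel choice.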

Keywords: SVM, MATLAB, Malayalam, South Indian scripts, online handwritten character recognition

Procedia PDF Downloads 568
944 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion

Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You

Abstract:

To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold form tapping of internal threads, on the basis of similarity in the process. A model is established based on an analysis of process planning. A case representation and a similarity computing method are given, and a confidence degree is used to evaluate the retrieved case. A rule-based reuse strategy is presented. The scheme is illustrated and verified by practical application; the case study shows that the design results obtained with the proposed method are effective.
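
A minimal sketch of the retrieve-and-evaluate step: weighted nearest-neighbour similarity over normalized process attributes, with the similarity score of the best match reused as a confidence degree. The attributes, value ranges, weights, and cases below are hypothetical, not the paper's:

```python
from dataclasses import dataclass

@dataclass
class Case:
    name: str
    features: dict          # numeric process attributes
    plan: str               # stored process plan

# Hypothetical attribute ranges for normalizing distances
RANGES = {"diameter_mm": (2, 24), "pitch_mm": (0.4, 3.0), "hardness_hb": (80, 300)}

def similarity(query, case, weights):
    """Weighted similarity: 1 minus the range-normalized distance per feature."""
    total = 0.0
    for f, w in weights.items():
        lo, hi = RANGES[f]
        d = abs(query[f] - case.features[f]) / (hi - lo)
        total += w * (1.0 - d)
    return total / sum(weights.values())

base = [
    Case("M6 steel",  {"diameter_mm": 6,  "pitch_mm": 1.0,  "hardness_hb": 180}, "plan A"),
    Case("M12 alloy", {"diameter_mm": 12, "pitch_mm": 1.75, "hardness_hb": 240}, "plan B"),
]
weights = {"diameter_mm": 0.5, "pitch_mm": 0.3, "hardness_hb": 0.2}

query = {"diameter_mm": 8, "pitch_mm": 1.25, "hardness_hb": 190}
scored = sorted(base, key=lambda c: similarity(query, c, weights), reverse=True)
best = scored[0]
confidence = similarity(query, best, weights)  # gates whether to reuse or adapt
print(best.name, best.plan, round(confidence, 3))
```

A system along the paper's lines would then apply rule-based adaptation to the retrieved plan when the confidence degree falls below an acceptance threshold.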

Keywords: case-based reasoning, internal thread, cold extrusion, process planning

Procedia PDF Downloads 504
943 Field-Programmable Gate Array Based Tester for Protective Relay

Authors: H. Bentarzi, A. Zitouni

Abstract:

The reliability of the power grid depends on the successful operation of thousands of protective relays. The failure of one relay to operate as intended may lead the entire power grid to blackout. In fact, major power system failures during transient disturbances may be caused by unnecessary protective relay tripping rather than by the failure of a relay to operate. Adequate relay testing provides a first defense against false trips of the relay, and hence improves power grid stability and prevents catastrophic bulk power system failures. The goal of this research project is to design and enhance a relay tester using a Field Programmable Gate Array (FPGA) card, the NI 7851. A PC-based tester framework has been developed using a Simulink power system model to generate signals under different conditions (faults or transient disturbances) and LabVIEW to build the graphical user interface and configure the FPGA. In addition, an interface system has been developed for outputting and amplifying the signals without distortion. These signals should resemble those generated by a real power system and be large enough to test the relay's functionality. The generated signals, as displayed on the scope, are satisfactory. Furthermore, the proposed testing system can be used to improve the performance of protective relays.
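
The kind of test waveform such a Simulink model would feed to the FPGA can be sketched in a few lines: a steady 50 Hz load current that jumps to a fault level with a decaying DC offset at fault inception, the classic waveform an overcurrent relay must detect. All parameter values below are illustrative, not the project's NI 7851 configuration:

```python
import numpy as np

fs, f0 = 10_000, 50              # sample rate (Hz), system frequency (Hz)
t = np.arange(0, 0.2, 1 / fs)    # 200 ms record
t_fault = 0.1                    # fault inception at 100 ms
i_load, i_fault, tau = 5.0, 40.0, 0.03   # amps; DC-offset time constant in s

# Pre-fault: steady load current
i = i_load * np.sin(2 * np.pi * f0 * t)

# Post-fault: larger AC component plus an exponentially decaying DC offset
after = t >= t_fault
tf = t[after] - t_fault
i[after] = (i_fault * np.sin(2 * np.pi * f0 * t[after])
            + i_fault * np.exp(-tf / tau))

print(f"pre-fault peak:  {np.abs(i[~after]).max():.1f} A")
print(f"post-fault peak: {np.abs(i[after]).max():.1f} A")
```

In the actual tester this array would be streamed through the FPGA's analog outputs and the interface amplifier to the relay under test, and the relay's trip time measured against its published characteristic.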

Keywords: class D amplifier, field-programmable gate array (FPGA), protective relay, tester

Procedia PDF Downloads 212