Search results for: parallel data mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25735

22735 How Validated Nursing Workload and Patient Acuity Data Can Promote Sustained Change and Improvements within District Health Boards: The New Zealand Experience

Authors: Rebecca Oakes

Abstract:

In the New Zealand public health system, work has been taking place to use electronic systems to convey data from the ‘floor to the board’ that makes patient needs, and therefore nursing work, visible. For nurses, these developments in health information technology put us in a very new and exciting position of being able to articulate the work of nursing through a language understood at all levels of an organisation: the language of acuity. Nurses increasingly have a considerable stake in patient acuity data. Patient acuity systems, when used well, can assist greatly in demonstrating how much work is required, the type of work, and when it will be required. The New Zealand Safe Staffing Unit is supporting New Zealand nurses to create a culture of shared governance, where nursing data informs policies, staffing methodologies and forecasting within their organisations. Assisting organisations to understand their acuity data, strengthening user confidence in electronic patient acuity systems, and ensuring nursing and midwifery workload is accurately reflected are critical to the success of the safe staffing programme. Nurses and midwives have the capacity, via an acuity tool, to become key informers of organisational planning. Quality patient care, best use of health resources and a quality work environment are essential components of a safe, resilient and well-resourced organisation, and nurses are the key informers of this information. In New Zealand, a national-level approach is paving the way for significant changes to the understanding and use of patient acuity and nursing workload information.

Keywords: nursing workload, patient acuity, safe staffing, New Zealand

Procedia PDF Downloads 365
22734 Embedded Hybrid Intuition: A Deep Learning and Fuzzy Logic Approach to Collective Creation and Computational Assisted Narratives

Authors: Roberto Cabezas H

Abstract:

The current work shows the methodology developed to create narrative lighting spaces for the multimedia performance piece 'cluster: the vanished paradise.' This empirical research is focused on exploring unconventional roles for machines in subjective creative processes, by delving into the semantics of data and machine intelligence algorithms in hybrid technological, creative contexts to expand epistemic domains through human-machine cooperation. The creative process in scenic and performing arts is guided mostly by intuition; from that idea, we developed an approach to embed collective intuition in computational creative systems by joining the properties of Generative Adversarial Networks (GANs) and fuzzy clustering, based on a semi-supervised data creation and analysis pipeline. The model uses GANs to learn from phenomenological data (data generated from experience with lighting scenography) and algorithmic design data (data augmented by procedural design methods); fuzzy logic clustering is then applied to the data artificially created by the GANs to define narrative transitions built on the membership index. This process allowed for the creation of simple and complex spaces with expressive capabilities based on position and light intensity as the parameters guiding the narrative. Hybridization comes not only from the human-machine symbiosis but also from the integration of different techniques in the implementation of the aided design system. Machine intelligence tools as proposed in this work are well suited to redefine collaborative creation by learning to express and expand a conglomerate of ideas and a wide range of opinions for the creation of sensory experiences. We found in GANs and fuzzy logic ideal tools to develop new computational models based on interaction, learning, emotion and imagination to expand the traditional algorithmic model of computation.
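The abstract does not spell out how the membership index is computed; a minimal sketch of the standard fuzzy c-means membership formula (with fuzzifier m, on an illustrative 1-D "light intensity" sample, not the piece's actual data) might look like:

```python
def fcm_membership(x, centers, m=2.0):
    """Fuzzy c-means soft membership of sample x in each cluster.

    u_i = 1 / sum_j (d_i / d_j)^(2/(m-1)), where d_i = |x - center_i|.
    """
    d = [abs(x - c) for c in centers]
    if 0.0 in d:  # sample coincides with a center: crisp membership
        return [1.0 if di == 0.0 else 0.0 for di in d]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((d[i] / d[j]) ** p for j in range(len(centers)))
            for i in range(len(centers))]

# illustrative sample against three hypothetical cluster centers
u = fcm_membership(x=2.0, centers=[0.0, 3.0, 10.0])
# memberships sum to 1; the nearest center (3.0) dominates
```

A narrative transition could then be triggered whenever the dominant membership changes or two memberships become nearly equal.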

Keywords: fuzzy clustering, generative adversarial networks, human-machine cooperation, hybrid collective data, multimedia performance

Procedia PDF Downloads 129
22733 Thin Films of Glassy Carbon Prepared by Cluster Deposition

Authors: Hatem Diaf, Patrice Melinon, Antonio Pereira, Bernard Moine, Nicholas Blanchard, Florent Bourquard, Florence Garrelie, Christophe Donnet

Abstract:

Glassy carbon exhibits excellent biological compatibility with live tissues, meaning it has high potential for applications in life science. Moreover, glassy carbon has interesting properties including high-temperature resistance, hardness, low density, low electrical resistance, low friction, and low thermal resistance. The structure of glassy carbon has long been a subject of debate. It is now accepted that glassy carbon is 100% sp2. This term is somewhat confusing, since sp2 hybridization as defined in quantum chemistry involves both properties: threefold configuration and pi bonding (parallel pz orbitals). Using plasma laser deposition of carbon clusters combined with pulsed nano/femto laser annealing, we are able to synthesize thin films of glassy carbon of good quality (probed by the G band/D disorder band ratio in Raman spectroscopy) without thermal post-annealing. A careful inspection of the Raman signal, plasmon losses and structure performed by HRTEM (High Resolution Transmission Electron Microscopy) reveals that both properties (threefold configuration and pi orbitals) cannot coexist. The structure of the films is compared to models including schwarzites, based on negatively curved surfaces, as opposed to onion- or fullerene-like structures with positively curved surfaces. This study shows that a large family of porous carbons named vitreous carbon, with different structures, can coexist.

Keywords: glassy carbon, cluster deposition, coating, electronic structure

Procedia PDF Downloads 302
22732 Strategic Mine Planning: A SWOT Analysis Applied to KOV Open Pit Mine in the Democratic Republic of Congo

Authors: Patrick May Mukonki

Abstract:

The KOV pit (Kamoto Oliveira Virgule) is located 10 km from Kolwezi, one of the mineral-rich towns in the Lualaba province of the Democratic Republic of Congo. The KOV pit is currently operated by Katanga Mining Limited (KML), a joint venture between Glencore and Gecamines (a state-owned company). Recently, the mine optimization process provided a life of mine of approximately 10 years with nine pushbacks using the Datamine NPV Scheduler software. In previous KOV pit studies, we outlined the impact of the accuracy of the geological information on a long-term mine plan for a big copper mine such as the KOV pit. The approach taken discussed three main scenarios and outlined some weaknesses on the geological information side. In this paper, we highlight, as an overview, those weaknesses, strengths and opportunities in a global SWOT analysis. The approach taken here is essentially descriptive in terms of the steps taken to optimize the KOV pit; at every step, we categorized the challenges we faced to reach a better trade-off between what we called strengths and what we called weaknesses. The same logic is applied to the opportunities and threats. The SWOT analysis conducted in this paper demonstrates that, despite a generally poor ore body definition and very difficult groundwater conditions, there is room for improvement for such a high-grade ore body.

Keywords: mine planning, mine optimization, mine scheduling, SWOT analysis

Procedia PDF Downloads 211
22731 Acoustic Modeling of a Data Center with a Hot Aisle Containment System

Authors: Arshad Alfoqaha, Seth Bard, Dustin Demetriou

Abstract:

A new multi-physics acoustic modeling approach using ANSYS Mechanical FEA and FLUENT CFD methods is developed for modeling servers mounted in racks, such as IBM Z and IBM Power Systems, in data centers. This new approach allows users to determine the thermal and acoustic conditions that people are exposed to within the data center. The sound pressure level (SPL) exposure for a human working inside a hot aisle containment system in the data center is studied. The SPL is analyzed at the noise source, at the human body, on the rack walls, on the containment walls, and on the ceiling and flooring plenum walls. In the acoustic CFD simulation, it is assumed that a four-inch diameter sphere with monopole acoustic radiation, placed in the middle of each rack, provides a single-source representation of all noise sources within the rack. The Ffowcs Williams-Hawkings (FWH) acoustic model is employed. The target frequency is 1000 Hz, and the total simulation time for the transient analysis is 1.4 seconds, with a very small time step of 3e-5 seconds and 10 iterations per time step to ensure convergence and accuracy. A User Defined Function (UDF) is developed to accurately simulate the acoustic noise source, and a dynamic mesh is applied to ensure acoustic wave propagation. Initial validation of the acoustic CFD simulation using a closed-form solution for the spherical propagation of an acoustic point source is performed.
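The closed-form check mentioned at the end, spherical propagation of a point source, reduces to the familiar 1/r pressure decay. A sketch of that validation arithmetic (illustrative pressures, not the study's actual values):

```python
import math

P_REF = 2e-5  # reference RMS pressure in air, Pa (20 uPa)

def spl_db(p_rms):
    """Sound pressure level in dB re 20 uPa."""
    return 20.0 * math.log10(p_rms / P_REF)

def monopole_pressure(p_at_r0, r0, r):
    """Free-field spherical spreading: RMS pressure falls off as 1/r."""
    return p_at_r0 * (r0 / r)

p1 = monopole_pressure(p_at_r0=1.0, r0=1.0, r=1.0)
p2 = monopole_pressure(p_at_r0=1.0, r0=1.0, r=2.0)
drop = spl_db(p1) - spl_db(p2)  # doubling distance costs ~6 dB
```

A CFD result that reproduces this ~6 dB drop per doubling of distance in the free field is one simple sanity check of the acoustic setup.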

Keywords: data centers, FLUENT, acoustics, sound pressure level, SPL, hot aisle containment, IBM

Procedia PDF Downloads 159
22730 Long Term Evolution Multiple-Input Multiple-Output Network in Unmanned Air Vehicles Platform

Authors: Ashagrie Getnet Flattie

Abstract:

For the duration of any given connection, line-of-sight (LOS) links, data rates, quality, and flexible network service are limited by severe variation in signal strength due to fading and path loss. Wireless systems face major challenges in achieving wide coverage and capacity without degrading system performance while providing data access everywhere, all the time. In this paper, the cell coverage and edge rate of different multiple-input multiple-output (MIMO) schemes in a 20 MHz Long Term Evolution (LTE) system on an Unmanned Air Vehicle (UAV) platform are investigated. After some background on the enormous potential of UAVs, MIMO, and LTE in wireless links, the paper presents a system model which attempts to realize the various benefits of MIMO incorporated into a UAV platform. The performances of three MIMO LTE schemes are compared with that of 4x4 MIMO LTE on the UAV platform to evaluate the improvement in cell radius, BER, and data throughput of the system in different morphologies. The results show that significant performance gains in bit error rate (BER), data rate, and coverage can be achieved using the presented scenario.
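Path loss is central to the coverage comparison; as a baseline for a LOS UAV link, the standard free-space path loss formula can be computed as follows (the 2 km distance and 2100 MHz carrier below are only illustrative):

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB, with distance in km and frequency in MHz:
    FSPL = 20*log10(d) + 20*log10(f) + 32.44
    """
    return 20.0 * math.log10(distance_km) + 20.0 * math.log10(freq_mhz) + 32.44

loss = fspl_db(distance_km=2.0, freq_mhz=2100.0)  # ~104.9 dB for a 2 km LOS link
```

Morphology-specific models (urban, suburban, rural) add correction terms on top of this free-space floor, which is where the cell-radius differences between the MIMO schemes appear.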

Keywords: LTE, MIMO, path loss, UAV

Procedia PDF Downloads 261
22729 Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings

Authors: Sorin Valcan, Mihail Gaianu

Abstract:

Labeling is a very costly and time-consuming process which aims to generate datasets for training neural networks in several functionalities and projects. For driver monitoring system projects, the need for labeled images has a significant impact on the budget and distribution of effort. This paper presents the modifications made to an algorithm used for the generation of ground truth data for 2D eye location on infrared images of drivers, in order to improve the quality of the data and the performance of the trained neural networks. The algorithm's restrictions become stricter, which makes it more accurate but also less consistent. The resulting dataset becomes smaller and shall not be altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in much better performance of the trained neural networks.

Keywords: labeling automation, infrared camera, driver monitoring, eye detection, convolutional neural networks

Procedia PDF Downloads 93
22728 A Consideration on the Offset Frontal Impact Modeling Using Spring-Mass Model

Authors: Jaemoon Lim

Abstract:

To construct a lumped spring-mass model considering the occupants for the offset frontal crash, the SISAME software and NHTSA test data were used. The data on a 56 kph, 40% offset frontal vehicle-to-deformable-barrier crash test of a MY2007 Mazda 6 4-door sedan were obtained from the NHTSA test database. The overall behaviors of the B-pillar and engine of the simulation models agreed very well with the test data. The trends of the accelerations at the driver and passenger head were similar, but there were big differences in peak values. These differences in peak values caused large errors in the HIC36 and 3 ms chest g's. To predict the behaviors of the dummies well, the spring-mass model for the offset frontal crash needs to be improved.
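The lumped spring-mass idea can be illustrated with the simplest possible case: a single vehicle mass crushing against a linear barrier spring. The mass and stiffness below are illustrative, not the SISAME model's extracted parameters:

```python
# one-mass, one-spring crash sketch: vehicle hits a rigid barrier at 56 km/h
m = 1500.0        # vehicle mass, kg (illustrative)
k = 8.0e5         # crush stiffness, N/m (illustrative)
v0 = 56.0 / 3.6   # impact speed in m/s
dt, x, v = 1e-5, 0.0, v0
peak_accel = 0.0
while v > 0.0:                    # integrate until maximum crush
    a = -k * x / m                # spring deceleration
    v += a * dt                   # semi-implicit Euler step
    x += v * dt
    peak_accel = max(peak_accel, abs(a))
# closed form for comparison: a_max = v0*sqrt(k/m), crush x_max = v0/sqrt(k/m)
```

A multi-mass SISAME-style model chains several such masses (engine, B-pillar, occupant) with springs fitted to the measured pulses; the peak-value mismatches reported above are exactly the quantities this kind of fit struggles with.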

Keywords: chest g’s, HIC36, lumped spring-mass model, offset frontal impact, SISAME

Procedia PDF Downloads 441
22727 Quantifying Spatiotemporal Patterns of Past and Future Urbanization Trends in El Paso, Texas and Their Impact on Electricity Consumption

Authors: Joanne Moyer

Abstract:

El Paso, Texas is a southwest border city that has experienced continuous growth within the last 15 years. Understanding the urban growth trends and patterns using data from the National Land Cover Database (NLCD) and landscape metrics provides a quantitative description of growth. Past urban growth provided a basis to predict 2031 future land-use for El Paso using the CA-Markov model. As a consequence of growth, an increase in demand for resources follows. Using panel data analysis, the relation between landscape metrics and electricity consumption is further analyzed. The study's findings indicate that past growth focused within three districts of the City of El Paso. The landscape metrics suggest that as the city has grown, fragmentation has decreased. Alternatively, the landscape metrics for the projected 2031 land-use indicate possible fragmentation within one of these districts. Panel data suggest that electricity consumption and the mean patch area landscape metric are positively correlated. The study provides local decision makers with a basis for informed policy and urban planning decisions to ensure a sustainable future community.
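The Markov half of the CA-Markov model rests on a transition-probability matrix estimated between two land-cover snapshots; projecting class shares forward is then a repeated matrix-vector product. A toy sketch with hypothetical probabilities (not El Paso's actual transition matrix):

```python
# hypothetical 3-class transition matrix; each row (from-class) sums to 1
T = [
    [0.95, 0.03, 0.02],  # urban -> urban, vegetation, bare
    [0.10, 0.85, 0.05],  # vegetation -> ...
    [0.15, 0.05, 0.80],  # bare -> ...
]
state = [0.30, 0.50, 0.20]  # current area shares (urban, vegetation, bare)

def project(state, T):
    """One Markov step: next_j = sum_i state_i * T[i][j]."""
    n = len(state)
    return [sum(state[i] * T[i][j] for i in range(n)) for j in range(n)]

for _ in range(3):  # project three time steps ahead
    state = project(state, T)
# under this matrix the urban share grows each step
```

The cellular-automaton half then allocates these projected shares spatially, using neighborhood suitability rules rather than the aggregate arithmetic shown here.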

Keywords: landscape metrics, CA-Markov, El Paso, Texas, panel data

Procedia PDF Downloads 128
22726 Short Answer Grading Using Multi-Context Features

Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan

Abstract:

Automatic short answer grading is one of the prime applications of artificial intelligence in education. Several approaches involving the utilization of selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc. have been explored over the years. However, keeping in mind the real-world application of the task, these solutions present a slight overhead in terms of computations and resources in achieving high performance. In this work, a simple and effective solution making use of elemental features based on statistical and linguistic properties and word-based similarity measures, in conjunction with tree-based classifiers and regressors, is proposed. The results for classification tasks show improvements ranging from 1%-30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work demonstrates that classical natural language processing techniques and simple machine learning models can achieve high results for short answer grading.
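The abstract does not list the exact feature set, but the flavor of "elemental features based on statistical, linguistic properties, and word-based similarity measures" can be sketched with a few hypothetical word-overlap features that a tree-based model would then consume:

```python
def answer_features(reference, answer):
    """A few illustrative per-pair features; names and choices are hypothetical,
    not the authors' actual feature set."""
    ref, ans = reference.lower().split(), answer.lower().split()
    ref_set, ans_set = set(ref), set(ans)
    overlap = ref_set & ans_set
    return {
        "len_ratio": len(ans) / max(len(ref), 1),                   # statistical
        "jaccard": len(overlap) / max(len(ref_set | ans_set), 1),   # similarity
        "recall": len(overlap) / max(len(ref_set), 1),              # reference coverage
    }

f = answer_features("the mitochondria produces energy for the cell",
                    "mitochondria make energy for cells")
```

Rows of such features, one per (reference answer, student answer) pair, would feed a tree-based classifier or regressor such as a random forest.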

Keywords: artificial intelligence, intelligent systems, natural language processing, text mining

Procedia PDF Downloads 120
22725 Critically Sampled Hybrid Trigonometry Generalized Discrete Fourier Transform for Multistandard Receiver Platform

Authors: Temidayo Otunniyi

Abstract:

This paper presents a low-computation channelization algorithm for multi-standard platforms using a polyphase implementation of a critically sampled hybrid trigonometry generalized Discrete Fourier Transform (HGDFT). The HGDFT channelization algorithm exploits the orthogonality of two trigonometric Fourier functions, together with the properties of the Quadrature Mirror Filter Bank (QMFB) and the Exponentially Modulated Filter Bank (EMFB), respectively. HGDFT shows improvement in its implementation in terms of high reconfigurability, lower filter length, parallelism, and moderate computational load. Type I and type III polyphase structures are derived for real-valued HGDFT modulation. The design specifications are critically decimated and oversampled for both single- and multi-standard receiver platforms. Evaluating the performance of oversampled single-standard receiver channels, the HGDFT algorithm achieved 40% complexity reduction, compared to 34% and 38% reductions in the Discrete Fourier Transform (DFT) and tree quadrature mirror filter (TQMF) algorithms. The parallel generalized discrete Fourier transform (PGDFT) and recombined generalized discrete Fourier transform (RGDFT) had 41% complexity reduction, and HGDFT had a 46% reduction in oversampled multi-standard mode. In the critically sampled multi-standard receiver channels, HGDFT had a complexity reduction of 70%, while both PGDFT and RGDFT had a 34% reduction.

Keywords: software defined radio, channelization, critical sample rate, over-sample rate

Procedia PDF Downloads 115
22724 Risk of Heatstroke Occurring in Indoor Built Environment Determined with Nationwide Sports and Health Database and Meteorological Outdoor Data

Authors: Go Iwashita

Abstract:

The paper describes how the frequencies of heatstroke occurring in the indoor built environment are related to the outdoor thermal environment, using big statistical data. The nationwide accident data on heatstroke were obtained from the National Agency for the Advancement of Sports and Health (NAASH). The meteorological database of the Japanese Meteorological Agency supplied data on 1-hour average temperature, humidity, wind speed, solar radiation, and so forth. Each heatstroke data point from the NAASH database was linked to the meteorological data acquired from the station nearest to where the accident occurred. This analysis was performed for a 10-year period (2005–2014). During this period, 3,819 cases of heatstroke were reported in the NAASH database for the investigated secondary/high schools of the nine representative Japanese cities. Heatstroke most commonly occurred in the outdoor schoolyard at a wet-bulb globe temperature (WBGT) of 31°C and in the indoor gymnasium during athletic club activities at a WBGT > 31°C. The accident ratio (number of accidents during each club activity divided by the club's population) was highest in the gymnasium during female badminton club activities. Although badminton is played in a gymnasium, these WBGT results show that the risk level during badminton under hot and humid conditions equals that of baseball or rugby played in the schoolyard. Outside of sports, a high risk of heatstroke was also observed in school houses during cultural activities. Based on these WBGT results, the risk level for the indoor environment under hot and humid conditions is equal to that for the outdoor environment. Control measures against hot and humid indoor conditions are therefore needed, such as installing air conditioning not only in schools but also in residences.
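The WBGT thresholds cited above follow the standard ISO 7243 weightings of natural wet-bulb, globe, and dry-bulb temperature; a quick sketch (the gymnasium readings below are illustrative, not values from the study):

```python
def wbgt_outdoor(t_wet, t_globe, t_air):
    """ISO 7243 WBGT with solar load: 0.7*Tnwb + 0.2*Tg + 0.1*Ta."""
    return 0.7 * t_wet + 0.2 * t_globe + 0.1 * t_air

def wbgt_indoor(t_wet, t_globe):
    """ISO 7243 WBGT without solar load: 0.7*Tnwb + 0.3*Tg."""
    return 0.7 * t_wet + 0.3 * t_globe

gym = wbgt_indoor(t_wet=29.0, t_globe=36.0)  # a hot, humid gymnasium
high_risk = gym >= 31.0                      # threshold noted in the abstract
```

The 0.7 weight on the wet-bulb term is why humid indoor spaces like gymnasiums can reach the same risk band as sunny schoolyards.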

Keywords: accidents in schools, club activity, gymnasium, heatstroke

Procedia PDF Downloads 204
22723 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data

Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani

Abstract:

Dosimetry is an indispensable and precious factor in patient treatment planning, to minimize the absorbed dose in vital tissues. In this study, in accordance with the favorable characteristics of DOTATOC and ¹⁷⁷Lu, ¹⁷⁷Lu-DOTATOC was prepared at the optimal conditions for the first time in Iran, and the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrated that ¹⁷⁷Lu-DOTATOC is a promising choice for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and speed of dosimetric calculations are necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry and data extrapolated from animal to human, using the MIRD method. Since the compartmental model is based on experimental data, this approach may increase the accuracy of the dosimetric results.
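At its core, the MIRD schema computes absorbed dose as cumulated activity times an S-value; a minimal sketch with a trapezoidal time-activity integral (all numbers illustrative, not the ¹⁷⁷Lu-DOTATOC data):

```python
def absorbed_dose(time_activity, s_value):
    """MIRD schema: D = A_tilde * S.

    A_tilde (cumulated activity) is the area under the organ time-activity
    curve, integrated here with the trapezoidal rule.
    """
    a_tilde = sum(0.5 * (a1 + a2) * (t2 - t1)
                  for (t1, a1), (t2, a2) in zip(time_activity, time_activity[1:]))
    return a_tilde * s_value

# hypothetical samples: (time in h, organ activity in MBq); S in mGy/(MBq*h)
curve = [(0.0, 10.0), (24.0, 6.0), (48.0, 3.0), (96.0, 1.0)]
dose_mgy = absorbed_dose(curve, s_value=0.05)
```

The compartmental model's role is to supply smoother, physiologically constrained time-activity curves than the raw sampled points used in this toy integral.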

Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry

Procedia PDF Downloads 208
22722 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings

Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey

Abstract:

Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in the safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, lend to a pharmacovigilance monitoring process that is often tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI from the sponsor's safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy's law criteria and pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with system-generated monthly listings over six months. Results: On average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours exhausted by a cognitively laborious, manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete or incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved to be extremely useful in detecting potential DILI cases, while heightening the vigilance of the drug safety department. Additionally, the application of this algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). This algorithm also carries the potential for universal application, due to its product-agnostic data and keyword mining features. Plans for the tool include developing it into a fully automated application, thereby completely eliminating the manual screening process.
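The two screening constructs named above, the R-value pattern of injury and Hy's law, have standard definitions that the SQL criteria presumably encode; a simplified sketch (illustrative lab values, and omitting the exclusion criteria a real screen would apply):

```python
def r_value(alt, alt_uln, alp, alp_uln):
    """Pattern of injury: R = (ALT/ULN) / (ALP/ULN).

    R >= 5 suggests hepatocellular injury, R <= 2 cholestatic, otherwise mixed.
    """
    return (alt / alt_uln) / (alp / alp_uln)

def meets_hys_law(alt, alt_uln, bili, bili_uln):
    """Simplified Hy's law screen: ALT >= 3x ULN with total bilirubin >= 2x ULN."""
    return alt >= 3 * alt_uln and bili >= 2 * bili_uln

r = r_value(alt=400.0, alt_uln=40.0, alp=130.0, alp_uln=130.0)       # 10.0
flag = meets_hys_law(alt=400.0, alt_uln=40.0, bili=3.0, bili_uln=1.2)
```

Expressed as SQL, the same logic becomes WHERE clauses over the lab-result table, which is where the noted sensitivity to non-standard test names and incomplete values comes from.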

Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing

Procedia PDF Downloads 136
22721 Autism Spectrum Disorder Classification Algorithm Using Multimodal Data Based on Graph Convolutional Network

Authors: Yuntao Liu, Lei Wang, Haoran Xia

Abstract:

Machine learning has shown extensive applications in the development of classification models for autism spectrum disorder (ASD) using neural imaging data. This paper proposes a fused multi-modal classification network based on a graph neural network. First, the brain is segmented into 116 regions of interest using a medical segmentation template (AAL, Anatomical Automatic Labeling). The image features of sMRI and the signal features of fMRI are extracted, which build the node and edge embedding representations of the brain map. Then, we construct a dynamically updated brain map neural network and propose a method based on a dynamic brain map adjacency matrix update mechanism and a learnable graph to further improve the accuracy of autism diagnosis and recognition results. Based on the Autism Brain Imaging Data Exchange I dataset (ABIDE I), we reached a prediction accuracy of 74% between ASD and TD subjects. In addition, to study biomarkers that can help doctors analyze the disease and to support interpretability, we extracted the features with the top five maximum and minimum ROI weights. This work provides a meaningful way for brain disorder identification.
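A standard graph-convolution layer of the kind such networks stack propagates node features as H' = ReLU(D^(-1/2)(A+I)D^(-1/2) H W); a toy sketch on a 4-ROI graph (random features, not ABIDE data, and not the paper's dynamic-adjacency variant):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN step: ReLU of symmetrically normalized propagation."""
    A_hat = A + np.eye(A.shape[0])                    # add self-loops
    d_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)   # D^(-1/2)
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],      # toy 4-node "brain map" adjacency
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.random((4, 3))           # 3 node features per ROI (e.g., sMRI-derived)
W = rng.random((3, 2))           # learnable projection to 2 output channels
H_next = gcn_layer(A, H, W)
```

In the paper's setting, A would be a 116x116 matrix built from fMRI connectivity and then updated during training rather than held fixed as here.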

Keywords: autism spectrum disorder, brain map, supervised machine learning, graph network, multimodal data, model interpretability

Procedia PDF Downloads 42
22720 Evaluating the Baseline Characteristics of Static Balance in Young Adults

Authors: K. Abuzayan, H. Alabed

Abstract:

The objectives of this study (baseline study, n = 20) were to implement Matlab procedures for quantifying selected static balance variables, to establish baseline data for selected variables which characterize static balance activities in a population of healthy young adult males, and to examine any trial effects on these variables. The results indicated that the implementation of Matlab procedures for quantifying selected static balance variables was practical and enabled baseline data to be established for the selected variables. There was no significant trial effect. Recommendations were made for suitable tests to be used in later studies. Specifically, it was found that one-foot tiptoe tests of static balance are too challenging for most participants in normal circumstances. A one-foot flat, eyes-open test was considered representative and suitably challenging for static balance.
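One of the standard static-balance variables such Matlab procedures compute is the total centre-of-pressure (CoP) path length over a trial; a minimal sketch in the same spirit (hypothetical force-plate samples in cm, not the study's code):

```python
def cop_path_length(samples):
    """Total CoP excursion: summed Euclidean distance between consecutive samples."""
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(samples, samples[1:]))

trial = [(0.0, 0.0), (0.3, 0.4), (0.3, 1.4)]  # three hypothetical CoP samples, cm
length_cm = cop_path_length(trial)            # 0.5 + 1.0 = 1.5 cm
```

Larger path lengths over a fixed trial duration indicate greater sway, which is why harder stances (one-foot tiptoe) saturate the measure for most participants.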

Keywords: static balance, base of support, baseline data, young adults

Procedia PDF Downloads 509
22719 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data into the public domain to increase transparency, as such data act as evidence of performance. However, inefficient handling, lack of capacity and the dynamics of transfers (especially fund transfers) are important issues which need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often triggered public opinion for or against a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of the people in local government functioning, the networks within and outside the locality, and synergy with the various layers of government are crucial to understanding the activities of any government. Unfortunately, data on such issues are not released in the public domain. If they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 417
22718 Human Rights to Environment: The Constitutional and Judicial Perspective in India

Authors: Varinder Singh

Abstract:

Primitive man knew nothing like human rights. In the later centuries of human progress, with the development of scientific and technological knowledge, the growth of population and the tremendous changes in the human environment, the laws of nature that maintained the eco-balance crumbled. The race for a better and more comfortable life landed mankind in a vicious circle. It created environmental imbalance, unplanned and uneven development, breakdown of the self-sustaining village economy, mushrooming of shanty towns and slums, a widening chasm between the rich and the poor, over-exploitation of natural resources, desertification of arable lands, pollution of different kinds, heating up of the earth and depletion of the ozone layer. Modern international life has been deeply marked and transformed by current endeavors to meet the needs and requirements of protection of the human person and of the environment. Such endeavors have been encouraged by the widespread recognition that protection of human beings and the environment reflects common superior values and constitutes a common concern of mankind. The parallel evolutions of human rights protection and environmental protection disclose some close affinities. Both underwent a process of internationalization, the former beginning with the 1948 Universal Declaration of Human Rights, the latter with the 1972 Stockholm Declaration on the Human Environment. It is now well established that it is the basic human right of every individual to live in a pollution-free environment with full human dignity. The judiciary has so far pronounced a number of judgments in this regard. In view of the various laws relating to environment protection and the constitutional provisions, the Supreme Court has held that the right to a pollution-free environment is part of the right to life. Article 21 is the heart of the fundamental rights and has received expanded meanings from time to time.

Keywords: human rights, law, environment, polluter

Procedia PDF Downloads 211
22717 Aseismic Stiffening of Architectural Buildings as Preventive Restoration Using Unconventional Materials

Authors: Jefto Terzovic, Ana Kontic, Isidora Ilic

Abstract:

In the proposed design concept, laminated glass and laminated plexiglass, as 'unconventional materials', are considered as a filling in a steel frame, which they overlap via an intermediate rubber layer, thereby forming a composite assembly. In this way, vertical stiffening elements are formed, capable of receiving seismic force and integrated into the structural system of the building. The applicability of such a system was verified by experiments under laboratory conditions, where experimental models based on laminated glass and laminated plexiglass were exposed to cyclic loads that simulate the seismic force. In this way, the load capacity of the composite assemblies was tested under dynamic load parallel to the assembly plane. Thus, the stress intensity to which the composite systems might be exposed was determined, as well as the degree of structural stiffening relative to the observed deformation, along with the advantages of each type of filling compared to the other. Using specialized software based on the finite element method, a computer model of the structure was created and processed in the case study; the same computer model was used for analyzing the problem in the first phase of the design process. The stiffening system based on the laboratory-tested composite assemblies is implemented in the computer model. The results of the modal analysis and seismic calculation from the computer model with stiffeners applied showed the efficacy of such a solution, thus rounding out the design procedures for aseismic stiffening using unconventional materials.

Keywords: laminated glass, laminated plexiglass, aseismic stiffening, experiment, laboratory testing, computer model, finite element method

Procedia PDF Downloads 65
22716 Synergy Effect of Energy and Water Saving in China's Energy Sectors: A Multi-Objective Optimization Analysis

Authors: Yi Jin, Xu Tang, Cuiyang Feng

Abstract:

The ‘11th five-year’ and ‘12th five-year’ plans clearly call for strict control of both the total amount and the intensity of energy and water consumption. The synergy effect between energy and water has rarely been considered in China's energy- and water-saving efforts, so its contribution has not been maximized. Energy sectors consume large amounts of energy and water when producing massive amounts of energy, which makes them both energy and water intensive; the synergy effect in these sectors is therefore significant. This paper assesses and optimizes the synergy effect in three energy sectors against the background of promoting energy and water saving. Results show that, from the perspective of the critical path, the chemical industry, the mining and processing of non-metal ores, and the smelting and pressing of metals are coupling points in the flow of energy and water to the energy sectors, where implementing energy- and water-saving policies can bring significant synergy effects. Multi-objective optimization shows that increasing efforts on input restructuring can effectively improve synergy effects; relatively large synergetic energy savings but little water saving are obtained after solely reducing the energy and water intensity of the coupling sectors. By optimizing the input structure of the sectors, especially the coupling sectors, the synergy effect of energy and water saving can be improved in the energy sectors while keeping the economy running stably.
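
The weighted-sum scalarization commonly used to trade off two such objectives can be sketched in a few lines; the plan names and saving figures below are hypothetical illustrations, not results from the paper:

```python
def weighted_sum_choice(options, w_energy=0.5, w_water=0.5):
    """Scalarize a two-objective (energy saving, water saving) choice with a
    weighted sum and return the option with the best combined saving.
    options: dict mapping plan name -> (energy_saved, water_saved)."""
    combined = lambda ew: w_energy * ew[0] + w_water * ew[1]
    return max(options, key=lambda name: combined(options[name]))

plans = {
    "reduce_intensity_only": (8.0, 1.0),  # large energy, little water saving
    "restructure_inputs":    (6.0, 5.0),  # balanced synergy
}
print(weighted_sum_choice(plans))  # -> restructure_inputs
```

The study itself optimizes sectoral input structures under economic constraints; this sketch only shows the scalarization step that turns two objectives into one score.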

Keywords: critical path, energy sector, multi-objective optimization, synergy effect, water

Procedia PDF Downloads 349
22715 Depth to Basement Determination Sculpting of a Magnetic Mineral Using Magnetic Survey

Authors: A. Ikusika, O. I. Poppola

Abstract:

This study was carried out to delineate structures that may favour the accumulation of tantalite, a magnetic mineral. A ground-based technique was employed using a G-856 AX proton precession magnetometer. A total of ten geophysical traverses were established in the study area. The acquired magnetic field data were corrected for drift. Trend analysis was adopted to remove the regional gradient from the observed data, and the results were presented as profiles. Only quantitative interpretation was adopted, using Peters' half-slope method to obtain the depth to basement. From the geological setting of the area and the information obtained from the magnetic survey, it can be concluded that the study area is underlain by a rock unit of accumulated minerals. It is therefore suspected that the overburden is relatively thin within the study area and that the metallic minerals occur in disseminated quantity at shallow depth.
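
Peters' half-slope method estimates depth from the horizontal distance between the two points where the anomaly's gradient falls to half its maximum, divided by an empirical factor. A minimal sketch follows; the factor 1.6 and the synthetic arctangent profile are illustrative assumptions, not the study's data:

```python
import numpy as np

def peters_half_slope_depth(x, anomaly, factor=1.6):
    """Peters' half-slope depth estimate from a single magnetic profile.

    Finds the point of maximum gradient, walks outward to the nearest
    points on either side where the gradient drops to half that value,
    and divides their horizontal separation by an empirical factor
    (commonly quoted between about 1.2 and 2.0; 1.6 is a typical choice).
    """
    slope = np.gradient(anomaly, x)
    k = int(np.argmax(np.abs(slope)))
    half = np.abs(slope[k]) / 2.0
    left = k
    while left > 0 and np.abs(slope[left]) > half:
        left -= 1
    right = k
    while right < len(x) - 1 and np.abs(slope[right]) > half:
        right += 1
    return (x[right] - x[left]) / factor

x = np.linspace(0.0, 100.0, 1001)          # station positions (m)
profile = np.arctan((x - 50.0) / 10.0)      # synthetic anomaly over a body
print(peters_half_slope_depth(x, profile))  # roughly 12.5 for this profile
```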

Keywords: basement, drift, magnetic field data, tantalite, traverses

Procedia PDF Downloads 465
22714 Social Media Resignation the Only Way to Protect User Data and Restore Cognitive Balance, a Literature Review

Authors: Rajarshi Motilal

Abstract:

The birth of the Internet and the rise of social media marked an important chapter in the history of humankind. Often termed the fourth scientific revolution, the Internet has changed human lives and cognisance. The birth of Web 2.0, followed by the launch of social media and social networking sites, added another milestone to these technological advancements, with connectivity and the influx of information becoming dominant. With billions of individuals using the Internet and social media sites in the 21st century, “users” became “consumers”, and orthodox marketing reshaped itself into digital marketing. Furthermore, organisations started using sophisticated algorithms to predict consumer purchase behaviour and manipulate it to sustain themselves in such a competitive environment. The rampant storage and analysis of individual data became the new normal, raising many questions about data privacy. Excessive Internet usage brought further problems: individuals became addicted to it, scavenging for societal approval and instant gratification, subsequently leading to a collective dualism, isolation and, finally, depression. This study aims to determine the relationship between social media usage in the modern age and the rise of psychological and cognitive imbalances in human minds. This literature review is a timely addition to existing work at a time when the world is constantly debating whether social media resignation is the only way to protect user data and restore the decaying cognitive balance.

Keywords: social media, digital marketing, consumer behaviour, internet addiction, data privacy

Procedia PDF Downloads 63
22713 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millennium

Authors: Janne Engblom, Elias Oikarinen

Abstract:

The understanding of housing price dynamics is of importance to a great number of agents: portfolio investors, banks, real estate brokers and construction companies, as well as policy makers and households. A panel dataset follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models; a special case of panel data models is dynamic in nature. A complication of a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates, and several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for cross-sectional dependence caused by common structures of the economy; in the presence of cross-sectional dependence, standard OLS gives biased estimates. U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of the housing price as the dependent variable and the first differences of per capita income, interest rate, housing stock and lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to provide estimates of short-run housing price dynamics and compare them between 1980-1999 and 2000-2012. Based on data on 50 U.S. cities over 1980-2012, the differences between the two periods' estimates were mostly significant. Significance tests of the differences were provided by a model containing interaction terms between the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach, indicating a good fit of the CCE estimator model. The estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that the dynamics of the housing market are evolving over time.
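
A pooled variant of the CCE idea — augmenting the regression with period-by-period cross-sectional averages that proxy the unobserved common factors — can be sketched as follows. The simulated data are illustrative, not the study's 50-city panel, and the dynamic terms and long-run deviations are omitted for brevity:

```python
import numpy as np

def cce_pooled(dy, dX):
    """Pooled CCE: regress the differenced outcome on the differenced
    regressors augmented with cross-sectional averages for each period,
    which proxy unobserved common factors. dy: (N, T); dX: (N, T, K)."""
    N, T, K = dX.shape
    ybar = dy.mean(axis=0)   # (T,) cross-sectional average of dy
    Xbar = dX.mean(axis=0)   # (T, K) cross-sectional averages of dX
    rows, targets = [], []
    for i in range(N):
        for t in range(T):
            rows.append(np.concatenate(([1.0], dX[i, t], [ybar[t]], Xbar[t])))
            targets.append(dy[i, t])
    beta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return beta[1:1 + K]     # slopes on the observed regressors

rng = np.random.default_rng(0)
f = rng.normal(size=12)                          # common factor across units
X = rng.normal(size=(50, 12, 2)) + f[:, None]    # regressors load on f
y = X @ np.array([0.5, -0.2]) + 2 * f + 0.01 * rng.normal(size=(50, 12))
print(cce_pooled(y, X))  # slopes approximately [0.5, -0.2]
```

Without the cross-sectional averages, the common factor f would bias the OLS slopes; including them absorbs the cross-sectional dependence, which is the intuition behind the CCE estimator used in the paper.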

Keywords: dynamic model, panel data, cross-sectional dependence, interaction model

Procedia PDF Downloads 240
22712 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider

Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón

Abstract:

The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. Better measurements would be possible by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate this detector safely and correctly. Furthermore, the DCS must provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-conditions data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for taking data under the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters, such as the nominal voltages for each photomultiplier tube (PMT), their threshold levels to accept or reject incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0, and they have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system integrated into the global control system of the ALICE experiment.
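
The limit-checking and alerting part of such a control system can be illustrated with a toy check; the parameter names and limits below are hypothetical, and the real AD0 DCS runs on a SCADA platform, not Python:

```python
def check_parameters(readings, limits):
    """Compare detector readings against configured limits and return
    alert messages for any value outside its allowed window.
    readings: dict name -> value; limits: dict name -> (low, high)."""
    alerts = []
    for name, value in readings.items():
        low, high = limits[name]
        if not low <= value <= high:
            alerts.append(f"{name}: {value} outside [{low}, {high}]")
    return alerts

limits = {"pmt_hv": (1400.0, 1600.0), "threshold_mv": (5.0, 50.0)}
readings = {"pmt_hv": 1720.0, "threshold_mv": 12.0}
print(check_parameters(readings, limits))  # flags only pmt_hv
```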

Keywords: AD0, ALICE, DCS, LHC

Procedia PDF Downloads 293
22711 Nano Generalized Topology

Authors: M. Y. Bakeir

Abstract:

Rough set theory is a recent approach to reasoning about data. It has achieved a large number of applications in various real-life fields. The main idea of rough sets corresponds to the lower and upper set approximations. These two approximations are exactly the interior and the closure of the set with respect to a certain topology on a collection U of imprecise data acquired from a real-life field. The base of the topology is formed by the equivalence classes of an equivalence relation E defined on U using the available information about the data. The theory of generalized topology was studied by Császár; it is well known that a generalized topology in the sense of Császár is a generalization of a topology on a set. On the other hand, many important collections of sets related to the topology on a set form a generalized topology. The notion of nano topology was introduced by Lellis Thivagar; it is defined in terms of the approximations and the boundary region of a subset of a universe using an equivalence relation on it. The purpose of this paper is to introduce a new generalized topology in terms of rough sets, called nano generalized topology.
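
The lower and upper approximations that generate a nano topology can be computed directly from the equivalence classes of E; a small sketch with an illustrative universe:

```python
def approximations(equiv_classes, target):
    """Lower/upper approximations of `target` with respect to a partition.

    equiv_classes: disjoint sets (the classes of an equivalence relation E)
    covering the universe U. The lower approximation (interior) is the union
    of classes fully contained in `target`; the upper approximation (closure)
    is the union of classes that intersect it.
    """
    target = set(target)
    lower, upper = set(), set()
    for cls in equiv_classes:
        if cls <= target:
            lower |= cls
        if cls & target:
            upper |= cls
    return lower, upper

U = {1, 2, 3, 4, 5, 6}
E = [{1, 2}, {3, 4}, {5, 6}]        # equivalence classes partitioning U
X = {1, 2, 3}
lo, up = approximations(E, X)
print(lo, up)                        # {1, 2} and {1, 2, 3, 4}
# The nano topology on U w.r.t. X is {U, set(), lo, up, up - lo}.
```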

Keywords: rough sets, topological space, generalized topology, nano topology

Procedia PDF Downloads 419
22710 Algorithm for Recognizing Trees along Power Grid Using Multispectral Imagery

Authors: C. Hamamura, V. Gialluca

Abstract:

Many electricity distributors attribute about 70% of their electricity interruptions to the cause "trees", alone or associated with wind and rain, with or without falling branches and/or trees. This contributes inexorably and significantly to outages, resulting in high compensation costs in addition to operation and maintenance costs. On the other hand, there are few data structures and solutions for organizing the tree-pruning plan effectively, minimizing costs in an environmentally friendly way. This work describes the development of an algorithm to provide data on trees associated with the power grid. The method is accomplished in several steps using satellite imagery and a geographically vectorized grid. A sliding-window-like approach is performed to search the area around the grid. The proposed method counted 764 trees on a patch of the grid, which is very close to the 738 trees counted manually. The tree data were used as part of a larger project that implements a system to optimize the tree-pruning plan.
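
The sliding-window step over the vectorized grid can be sketched as below; the vegetation-index threshold and window size are illustrative assumptions, and the paper's method works on multispectral satellite patches with a neural network classifier rather than on a toy array:

```python
import numpy as np

def count_tree_pixels_along_grid(ndvi, grid_mask, window=5, thresh=0.4):
    """Count vegetation pixels near the vectorized power grid.

    ndvi: 2-D array of a vegetation index derived from multispectral imagery.
    grid_mask: boolean 2-D array, True where the power line passes.
    A window is slid over every grid pixel; pixels whose index exceeds
    `thresh` inside any window are flagged as candidate tree canopy.
    """
    half = window // 2
    flagged = np.zeros_like(grid_mask, dtype=bool)
    for r, c in zip(*np.nonzero(grid_mask)):
        r0, r1 = max(r - half, 0), r + half + 1
        c0, c1 = max(c - half, 0), c + half + 1
        flagged[r0:r1, c0:c1] |= ndvi[r0:r1, c0:c1] > thresh
    return int(flagged.sum())
```

Grouping flagged pixels into connected components would then give a tree count comparable to the manual one reported in the abstract.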

Keywords: image pattern recognition, trees pruning, trees recognition, neural network

Procedia PDF Downloads 484
22709 Breast Cancer Incidence Estimation in Castilla-La Mancha (CLM) from Mortality and Survival Data

Authors: C. Romero, R. Ortega, P. Sánchez-Camacho, P. Aguilar, V. Segur, J. Ruiz, G. Gutiérrez

Abstract:

Introduction: Breast cancer is a leading cause of death in CLM (2.8% of all deaths in women and 13.8% of deaths from tumors in women). It is the tumor with the highest incidence in the CLM region, accounting for 26.1% of all tumors excluding non-melanoma skin cancer (Cancer Incidence in Five Continents, Volume X, IARC). Cancer registries are a good source of information for estimating cancer incidence; however, their data are usually available with a lag, which makes them difficult for health managers to use. By contrast, mortality and survival statistics have less delay. In order to serve resource planning and respond to this problem, a method is presented to estimate incidence from mortality and survival data. Objectives: To estimate the incidence of breast cancer by age group in CLM in the period 1991-2013, and to compare the data obtained from the model with current incidence data. Sources: Annual number of women by single year of age (National Statistics Institute); annual number of deaths from all causes and from breast cancer (Mortality Registry CLM); breast cancer relative survival probability (EUROCARE, Spanish registries data). Methods: A parametric Weibull survival model is obtained from the EUROCARE data. From the survival model, the population data and the mortality data, a Mortality and Incidence Analysis MODel (MIAMOD) regression model is obtained to estimate cancer incidence by age (1991-2013). Results: The resulting model is: Ix,t = Logit[const + age1*x + age2*x^2 + coh1*(t - x) + coh2*(t - x)^2], where Ix,t is the incidence at age x in period (year) t, and the parameter estimates are: const (constant term) = -7.03; age1 = 3.31; age2 = -1.10; coh1 = 0.61; coh2 = -0.12. It is estimated that 662 cases of breast cancer were diagnosed in CLM in 1991 (81.51 per 100,000 women) and 1,152 cases in 2013 (112.41 per 100,000 women), an increase of 40.7% in the crude incidence rate (1.9% per year).
The average annual increases in incidence by age were: 2.07% in women aged 25-44 years, 1.01% (45-54 years), 1.11% (55-64 years) and 1.24% (65-74 years). The cancer registries in Spain that send data to the IARC declared an average annual incidence rate of 98.6 cases per 100,000 women for 2003-2007; our model obtains an incidence of 100.7 cases per 100,000 women. Conclusions: A sharp and steady increase in the incidence of breast cancer is observed over the period 1991-2013. The increase is seen in all age groups considered, although it appears more pronounced in young women (25-44 years). With this method, a good estimate of the incidence can be obtained.
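
Plugging the reported estimates into an inverse logit gives a tiny incidence calculator of the model's form. Note that the abstract does not state how age x and cohort (t - x) are scaled, so treating them as already-scaled covariates is an assumption of this sketch, and its output should not be expected to reproduce the reported rates:

```python
import math

# Parameter estimates reported in the abstract.
CONST, AGE1, AGE2, COH1, COH2 = -7.03, 3.31, -1.10, 0.61, -0.12

def incidence(x, t):
    """Inverse-logit incidence I(x, t) from the MIAMOD-style regression:
    logit(I) = const + age1*x + age2*x^2 + coh1*(t - x) + coh2*(t - x)^2,
    with x and t assumed pre-scaled."""
    eta = (CONST + AGE1 * x + AGE2 * x ** 2
           + COH1 * (t - x) + COH2 * (t - x) ** 2)
    return 1.0 / (1.0 + math.exp(-eta))
```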

Keywords: breast cancer, incidence, cancer registries, castilla-la mancha

Procedia PDF Downloads 297
22708 Mining the Proteome of Fusobacterium nucleatum for Potential Therapeutics Discovery

Authors: Abdul Musaweer Habib, Habibul Hasan Mazumder, Saiful Islam, Sohel Sikder, Omar Faruk Sikder

Abstract:

The plethora of bacterial genome sequence information available in recent times has ushered in many novel strategies for antibacterial drug discovery and facilitated medical science in taking up the challenge of the increasing resistance of pathogenic bacteria to current antibiotics. In this study, we adopted a subtractive genomics approach to analyze the whole genome sequence of Fusobacterium nucleatum, a human oral pathogen associated with colorectal cancer. Our study revealed 1499 proteins of Fusobacterium nucleatum that have no homolog in the human genome. These proteins were screened further using the Database of Essential Genes (DEG), which resulted in the identification of 32 vitally important proteins for the bacterium. Subsequent analysis of the identified pivotal proteins using the KEGG Automated Annotation Server (KAAS) sorted out 3 key enzymes of F. nucleatum that may be good candidates as potential drug targets, since they are unique to the bacterium and absent in humans. In addition, we demonstrate the 3-D structures of these three proteins. Finally, the ligand-binding sites of the key proteins were determined, and screening for functional inhibitors that best fitted these sites was conducted to discover effective novel therapeutic compounds against Fusobacterium nucleatum.
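
The successive filters of such a subtractive-genomics pipeline — no human homolog, essential, pathway-unique — reduce to set operations on protein identifiers. A sketch with toy IDs (the accessions below are hypothetical, not actual F. nucleatum entries):

```python
def shortlist_targets(pathogen_proteins, human_homologs,
                      essential_proteins, pathway_unique):
    """Subtractive-genomics filter: keep pathogen proteins that (1) have no
    human homolog, (2) hit an essentiality database (e.g. DEG), and (3) map
    to pathways absent from the host (e.g. via a KAAS annotation).
    All arguments are sets of protein identifiers."""
    non_human = pathogen_proteins - human_homologs
    essential = non_human & essential_proteins
    return essential & pathway_unique

proteome = {"FN0001", "FN0002", "FN0003", "FN0004"}
human = {"FN0004"}                # has a human homolog -> excluded
deg_hits = {"FN0001", "FN0002"}   # essential according to DEG
unique_pathway = {"FN0002", "FN0003"}
print(shortlist_targets(proteome, human, deg_hits, unique_pathway))
# -> {'FN0002'}
```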

Keywords: colorectal cancer, drug target, Fusobacterium nucleatum, homology modeling, ligands

Procedia PDF Downloads 369
22707 The Image Redefinition of Urban Destinations: The Case of Madrid and Barcelona

Authors: Montserrat Crespi Vallbona, Marta Domínguez Pérez

Abstract:

Globalization impacts cities and especially their centers, above all those spaces that are most visible and coveted. The changes involve processes such as touristification, gentrification and studentification, in addition to shop trendiness. The city becomes a good of exchange rather than a communal good for its inhabitants, and consequently its value is monetized. Four tendencies are therefore analyzed: first, the presence of tourists, the increase in home rentals and the explosion of businesses related to tourism; second, the return of the middle classes, or gentries, to the center within a changed socio-spatial model that highlights the centers for their culture and opportunities as well as for the value of public space and centrality; third, the interest of students (national and international) in being part of these city centers, as dynamic groups and emerging classes with higher purchasing power and better cultural capital than in the past; and finally, the conversion of old stores into modern ones, where the vintage trend and the renewal of antiquity are the essence. All these transforming processes impact European cities and redefine their image, reinforcing the impression and brand of the urban center as an attractive space for investment. These four tendencies have spread in parallel, impacting the centers and transforming them, displacing the former residents of these spaces and revitalizing a center that is simultaneously financed and commercialized. The cases of Madrid and Barcelona, the Spanish spaces where these tendencies are most evident, serve to illustrate these processes and represent their spearhead. Useful recommendations are presented for urban planners seeking to reconcile communal and commercialized spaces.

Keywords: gentrification, shop trendiness, studentification, touristification

Procedia PDF Downloads 151
22706 Numerical Simulation of Fracturing Behaviour of Pre-Cracked Crystalline Rock Using a Cohesive Grain-Based Distinct Element Model

Authors: Mahdi Saadat, Abbas Taheri

Abstract:

Understanding the cracking response of crystalline rocks at the mineralogical scale is of great importance in the design of mining structures. A grain-based distinct element model (GBM) is employed to numerically study the cracking response of Barre granite at micro- and macro-scales. The GBM framework is augmented with a proposed distinct element-based cohesive model to reproduce the micro-cracking response of the inter- and intra-grain contacts. The cohesive GBM framework is implemented in the PFC2D distinct element code. The microstructural properties of Barre granite are imported into PFC2D to generate synthetic specimens. The microproperties of the model are calibrated against laboratory uniaxial compressive and Brazilian split tensile tests. The calibrated model is then used to simulate the fracturing behaviour of pre-cracked Barre granite with different flaw configurations. The numerical results of the proposed model demonstrate good agreement with the experimental counterparts. The proposed GBM framework thus appears promising for further investigation of the influence of grain microstructure and mineralogical properties on the cracking behaviour of crystalline rocks.

Keywords: discrete element modelling, cohesive grain-based model, crystalline rock, fracturing behaviour

Procedia PDF Downloads 113