Search results for: spatial time series
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21501

20181 Machine Installation and Maintenance Management

Authors: Mohammed Benmostefa

Abstract:

In the industrial production of large or even medium series, vibration problems arise. In continuous operation, technical devices induce vibrations in solid bodies and machine components, which generate structure-borne noise and/or airborne noise; vibrations are the mechanical oscillations of an object about its equilibrium point. In response to the problems resulting from these vibrations, a number of remedial acts and solutions have been put forward. These include insulation of machines, insulation of concrete masses, insulation under screeds, insulation of sensitive equipment, point insulation of machines, linear insulation of machines, full-surface insulation of machines, and the like. Following this, the researcher sought not only to raise awareness of the possibility of lowering the vibration frequency of industrial machines but also to stress the significance of the pre-installation procedures for machinery, namely: setting appropriate installation and start-up methods for the machine, allocating and updating an imprint folder for each machine, and scheduling maintenance of each machine throughout the year. The aim is to obtain reliable equipment, achieve cost reduction and maintenance efficiency, and ultimately ensure the overall economic performance of the company.

Keywords: maintenance, vibration, efficiency, production, machinery

Procedia PDF Downloads 85
20180 Modelling Fluidization by Data-Based Recurrence Computational Fluid Dynamics

Authors: Varun Dongre, Stefan Pirker, Stefan Heinrich

Abstract:

Over the last decades, the numerical modelling of fluidized bed processes has become feasible even for industrial processes. Commonly, continuous two-fluid models are applied to describe large-scale fluidization. In order to allow for coarse grids, novel two-fluid models account for unresolved sub-grid heterogeneities. However, computational efforts remain high – in the order of several hours of compute time for a few seconds of real time – thus preventing the representation of long-term phenomena such as heating or particle conversion processes. In order to overcome this limitation, data-based recurrence computational fluid dynamics (rCFD) has been put forward in recent years. rCFD can be regarded as a data-based method that relies on the numerical predictions of a conventional short-term simulation. These data are stored in a database and then used by rCFD to efficiently time-extrapolate the flow behavior at high spatial resolution. This study compares the numerical predictions of rCFD simulations with those of corresponding full CFD reference simulations for lab-scale and pilot-scale fluidized beds. In assessing the predictive capabilities of rCFD simulations, we focus on solid mixing and secondary gas holdup. We observed that predictions made by rCFD simulations are highly sensitive to numerical parameters such as the diffusivity associated with face swaps. We achieved a computational speed-up of four orders of magnitude (10,000 times faster than a classical TFM simulation), eventually allowing for real-time simulations of fluidized beds. In the next step, we apply the checkerboarding technique by introducing gas tracers subjected to convection and diffusion. We then analyze the concentration profiles to observe the mixing and transport of the gas tracers, gain insight into their convective and diffusive patterns, and work towards heat and mass transfer methods. Finally, we run rCFD simulations and calibrate their numerical and physical parameters against conventional two-fluid model (full CFD) simulations. As a result, this study gives a clear indication of the applicability, predictive capabilities, and existing limitations of rCFD in the realm of fluidization modelling.
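
To illustrate the recurrence idea behind rCFD, the following minimal Python sketch builds a state-similarity (recurrence) database from a short sequence of flow snapshots and time-extrapolates by jumping back to the most similar stored state whenever the stored trajectory runs out; the snapshot array, distance metric and jump rule are simplifying assumptions for illustration, not the solver used in this study.

```python
import numpy as np
from scipy.spatial.distance import cdist

def recurrence_extrapolate(snapshots, n_steps, avoid=5):
    """Extend a short simulation by chaining recurrent states.

    Follow the stored trajectory; at its end, jump to the stored state most
    similar to the current one (excluding trivial neighbours) and continue.
    """
    dist = cdist(snapshots, snapshots)        # recurrence matrix of the database
    n = len(snapshots)
    path, current = [], n - 1
    for _ in range(n_steps):
        if current < n - 1:
            current += 1                      # follow the stored trajectory
        else:
            d = dist[current].copy()
            lo, hi = max(0, current - avoid), min(n, current + avoid + 1)
            d[lo:hi] = np.inf                 # forbid jumping to immediate neighbours
            current = int(np.argmin(d))       # recurrence jump
        path.append(current)
    return snapshots[path]

# Hypothetical database: 200 snapshots of a 5000-cell solids volume-fraction field.
rng = np.random.default_rng(0)
database = rng.random((200, 5000))
long_run = recurrence_extrapolate(database, n_steps=2000)
print(long_run.shape)   # (2000, 5000): a time-extrapolated sequence of flow states
```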

Keywords: multiphase flow, recurrence CFD, two-fluid model, industrial processes

Procedia PDF Downloads 73
20179 An Optimal Matching Design Method of Space-Based Optical Payload for Typical Aerial Target Detection

Authors: Yin Zhang, Kai Qiao, Xiyang Zhi, Jinnan Gong, Jianming Hu

Abstract:

In order to effectively detect aerial targets over long distances, an optimal matching design method for space-based optical payloads is proposed. Firstly, the main factors affecting the optical detectability of small targets in complex environments are analyzed based on the full link of a detection system, including band center, band width and spatial resolution. Then a performance characterization model representing the relationship between the image signal-to-clutter ratio (SCR) and the above influencing factors is established to describe a detection system. Finally, an optimal matching design example is demonstrated for a typical aerial target by simulating and analyzing its SCR under different scene clutter coupled with multi-scale characteristics, and the optimized detection band and spatial resolution are presented. The method can provide a theoretical basis and scientific guidance for space-based detection system design, payload specification demonstration and information processing algorithm optimization.
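
For reference, a common working definition of the signal-to-clutter ratio on which such a characterization model can be built is SCR = |μ_target − μ_background| / σ_background; the short sketch below computes it for a hypothetical scene (image values, target mask and background window are invented for illustration).

```python
import numpy as np

def signal_to_clutter_ratio(image, target_mask, background_mask):
    """SCR = |mean(target) - mean(background)| / std(background)."""
    mu_t = image[target_mask].mean()
    mu_b = image[background_mask].mean()
    sigma_b = image[background_mask].std()
    return abs(mu_t - mu_b) / sigma_b

# Hypothetical 64x64 scene with a faint 2x2 target on textured clutter.
rng = np.random.default_rng(1)
scene = rng.normal(100.0, 5.0, (64, 64))
scene[30:32, 30:32] += 12.0                       # small aerial target signature
target = np.zeros_like(scene, dtype=bool)
target[30:32, 30:32] = True
background = ~target
print(f"SCR = {signal_to_clutter_ratio(scene, target, background):.2f}")
```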

Keywords: space-based detection, aerial targets, optical system design, detectability characterization

Procedia PDF Downloads 167
20178 PhilSHORE: Development of a WebGIS-Based Marine Spatial Planning Tool for Tidal Current Energy Resource Assessment and Site Suitability Analysis

Authors: Ma. Rosario Concepcion O. Ang, Luis Caezar Ian K. Panganiban, Charmyne B. Mamador, Oliver Dan G. De Luna, Michael D. Bausas, Joselito P. Cruz

Abstract:

PhilSHORE is a multi-site, multi-device and multi-criteria decision support tool designed to support the development of tidal current energy in the Philippines. Its platform is based on Geographic Information Systems (GIS), which allow for the collection, storage, processing, analysis and display of geospatial data. Combining GIS tools with open-source web development applications, PhilSHORE becomes a webGIS-based marine spatial planning tool. To date, PhilSHORE displays output maps and graphs of power and energy density, site suitability and site-device analysis. It gives stakeholders and the public easy access to the results of tidal current energy resource assessments and site suitability analyses. Results of the initial development show that PhilSHORE is a promising decision support tool for ORE project developments.

Keywords: GIS, site suitability analysis, tidal current energy resource assessment, webGIS

Procedia PDF Downloads 525
20177 Air Access Liberalisation and Tourism Trade: Evidence from a SIDS

Authors: Seetanah Boopen, R. V. Sannassee

Abstract:

The objective of the present study is two-fold: firstly, to assess the impact of air access liberalization on tourism demand for Mauritius and, secondly, to analyse the dual impact of the interplay between air access liberalization and marketing promotion efforts on tourism demand. Using an Autoregressive Distributed Lag (ARDL) model, the results suggest that air access liberalization is an important ingredient of tourism demand, albeit to a lesser extent than other classical explanatory variables. The results also highlight the fact that Mauritius is perceived as a luxurious destination and that tourists are deemed price sensitive. Moreover, our dynamic approach interestingly confirms the presence of repeat tourism in the island. Finally, the findings also uncover the positive impact of the interplay between air access liberalization and marketing promotion efforts on fostering tourism demand.
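
As an illustration of the model class used here, the sketch below estimates a simple ARDL(1,1) relationship between tourism demand and an air-access indicator by ordinary least squares on constructed lags, and derives the implied long-run effect; the variable names and simulated data are placeholders, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: log tourist arrivals and an air-access openness index.
rng = np.random.default_rng(2)
n = 40
openness = np.cumsum(rng.normal(0.05, 0.1, n))
arrivals = 5 + 0.3 * openness + rng.normal(0, 0.05, n)
for t in range(1, n):                      # add persistence (repeat tourism)
    arrivals[t] += 0.5 * (arrivals[t - 1] - 5)

df = pd.DataFrame({"y": arrivals, "x": openness})
df["y_lag1"] = df["y"].shift(1)            # ARDL(1,1): one lag of y and one of x
df["x_lag1"] = df["x"].shift(1)
df = df.dropna()

X = sm.add_constant(df[["y_lag1", "x", "x_lag1"]])
res = sm.OLS(df["y"], X).fit()
print(res.params)

# Long-run effect of air access implied by the ARDL coefficients.
long_run = (res.params["x"] + res.params["x_lag1"]) / (1 - res.params["y_lag1"])
print(f"long-run effect: {long_run:.2f}")
```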

Keywords: air access liberalization, ARDL, SIDS, time series

Procedia PDF Downloads 308
20176 Fault Diagnosis of Nonlinear Systems Using Dynamic Neural Networks

Authors: E. Sobhani-Tehrani, K. Khorasani, N. Meskin

Abstract:

This paper presents a novel integrated hybrid approach for fault diagnosis (FD) of nonlinear systems. Unlike most FD techniques, the proposed solution simultaneously accomplishes fault detection, isolation, and identification (FDII) within a unified diagnostic module. At the core of this solution is a bank of adaptive neural parameter estimators (NPEs) associated with a set of single-parameter fault models. The NPEs continuously estimate unknown fault parameters (FPs) that are indicators of faults in the system. Two NPE structures, series-parallel and parallel, are developed, each with its own set of desirable attributes. The parallel scheme is extremely robust to measurement noise and possesses a simpler, yet more solid, fault isolation logic. In contrast, the series-parallel scheme displays short FD delays and is robust to closed-loop system transients due to changes in control commands. Finally, a fault tolerant observer (FTO) is designed to extend the capability of the NPEs to systems with partial-state measurement.
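
A toy sketch of the difference between the two estimator structures, for a scalar linear-in-parameter plant rather than the paper's neural architecture: the series-parallel predictor is driven by the measured state, the parallel predictor by its own previous estimate, and both adapt a single fault parameter by a gradient step.

```python
import numpy as np

def simulate(theta_true=0.3, n=400, noise=0.02, seed=3):
    """Scalar plant x(k+1) = theta*x(k) + u(k); theta shifts when a fault occurs."""
    rng = np.random.default_rng(seed)
    u = rng.normal(0, 1, n)
    x = np.zeros(n + 1)
    for k in range(n):
        theta = theta_true + (0.4 if k > 200 else 0.0)   # fault injected at k = 200
        x[k + 1] = theta * x[k] + u[k] + rng.normal(0, noise)
    return x, u

def estimate(x, u, parallel=False, lr=0.05):
    """Adapt the fault parameter online; 'parallel' drives the model with its own estimate."""
    theta_hat, x_hat = 0.0, x[0]
    history = []
    for k in range(len(u)):
        state = x_hat if parallel else x[k]       # parallel vs series-parallel structure
        pred = theta_hat * state + u[k]
        err = x[k + 1] - pred
        theta_hat += lr * err * state             # gradient step on the squared error
        x_hat = pred
        history.append(theta_hat)
    return np.array(history)

x, u = simulate()
print("series-parallel final estimate:", round(estimate(x, u)[-1], 3))
print("parallel final estimate:      ", round(estimate(x, u, parallel=True)[-1], 3))
```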

Keywords: hybrid fault diagnosis, dynamic neural networks, nonlinear systems, fault tolerant observer

Procedia PDF Downloads 399
20175 Technical Assessment of Utilizing Electrical Variable Transmission Systems in Hybrid Electric Vehicles

Authors: Majid Vafaeipour, Mohamed El Baghdadi, Florian Verbelen, Peter Sergeant, Joeri Van Mierlo, Kurt Stockman, Omar Hegazy

Abstract:

The Electrical Variable Transmission (EVT), an electromechanical device, can be considered as an alternative to the conventional transmission system utilized in Hybrid Electric Vehicles (HEVs). This study presents comparisons, in terms of fuel consumption, power split, and state of charge (SoC), of an HEV containing an EVT with a conventional parallel topology and a series topology. To this end, corresponding simulations of these topologies are all performed in the presence of control strategies enabling battery charge-sustaining operation and efficient power split. The power flows through the components of the vehicle are obtained, and the fuel consumption results of the considered cases are compared. The investigation of the results indicates that utilizing an EVT can provide significant added value in HEV configurations. The outcome of the current research paves the way for the implementation of design optimization approaches on such systems in further research directions.

Keywords: Electrical Variable Transmission (EVT), Hybrid Electric Vehicle (HEV), parallel, series, modeling

Procedia PDF Downloads 236
20174 Mapping Forest Biodiversity Using Remote Sensing and Field Data in the National Park of Tlemcen (Algeria)

Authors: Bencherif Kada

Abstract:

In forest management practice, the landscape and the Mediterranean forest are never treated as linked objects. But sustainable forestry requires the valorization of the forest landscape, and this aim involves assessing the spatial distribution of biodiversity by mapping forest landscape units and subunits and by monitoring environmental trends. This contribution aims to highlight, through object-oriented classifications, the landscape biodiversity of the National Park of Tlemcen (Algeria). The methodology used is based on ground data and on the basic processing units of object-oriented classification, namely segments, so-called image-objects, representing relatively homogeneous units on the ground. The classification of Landsat Enhanced Thematic Mapper Plus (ETM+) imagery is performed on image-objects, not on pixels. The advantages of object-oriented classification are that it makes full use of meaningful statistic and texture calculations, of uncorrelated shape information (e.g., length-to-width ratio, direction and area of an object, etc.) and topological features (neighbor, super-object, etc.), and of the close relation between real-world objects and image-objects. The results show that per-object classification using the k-nearest neighbors method is more efficient than per-pixel classification. It simplifies the content of the image while preserving spectrally and spatially homogeneous types of land cover such as Aleppo pine stands, cork oak groves, mixed groves of cork oak, holm oak and zen oak, mixed groves of holm oak and thuja, water bodies, dense and open shrub-lands of oaks, vegetable crops or orchards, herbaceous plants and bare soils. Texture attributes seem to provide no useful information, while spatial attributes such as shape and compactness perform well for the dominant features, such as pure stands of Aleppo pine and/or cork oak and bare soils. Landscape sub-units are individualized while conserving the spatial information. Dense stands that dominate continuously over a large area were merged into a single class, as were fragmented stands with clearings. Low shrubland formations and high wooded shrublands are well individualized, but with some confusion with enclaves for the former. Overall, a visual evaluation of the classification shows that it reflects the actual spatial state of the study area at the landscape level.
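
A minimal sketch of the per-object k-nearest-neighbours step described above; the image-object features, class labels and data values are hypothetical stand-ins for the segmented ETM+ objects and ground data used in the study.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical table of image-objects: spectral means, texture, shape compactness.
rng = np.random.default_rng(4)
n_objects = 300
X = np.column_stack([
    rng.normal(0.35, 0.05, n_objects),   # mean NIR reflectance of the object
    rng.normal(0.12, 0.03, n_objects),   # mean red reflectance
    rng.normal(0.50, 0.10, n_objects),   # GLCM-style texture measure
    rng.normal(0.70, 0.15, n_objects),   # compactness (shape attribute)
])
y = rng.choice(["Aleppo pine", "cork oak", "shrubland", "bare soil"], n_objects)

knn = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(knn, X, y, cv=5)    # per-object, not per-pixel, classification
print("per-object kNN accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```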

Keywords: forest, oaks, remote sensing, biodiversity, shrublands

Procedia PDF Downloads 30
20173 From Servicescape to Servicespace: Qualitative Research in a Post-Cartesian Retail Context

Authors: Chris Houliez

Abstract:

This study addresses the complex dynamics of the modern retail environment, focusing on how the ubiquitous nature of mobile communication technologies has reshaped the shopper experience and tested the limits of the conventional "servicescape" concept commonly used to describe retail experiences. The objective is to redefine the conceptualization of retail space by introducing an approach to space that aligns with a retail context where physical and digital interactions are increasingly intertwined. To offer a more shopper-centric understanding of the retail experience, this study draws from phenomenology, particularly Henri Lefebvre’s work on the production of space. The presented protocol differs from traditional methodologies by not making assumptions about what constitutes a retail space. Instead, it adopts a perspective based on Lefebvre’s seminal work, which posits that space is not a three-dimensional container commonly referred to as “servicescape” but is actively produced through shoppers’ spatial practices. This approach allows for an in-depth exploration of the retail experience by capturing the everyday spatial practices of shoppers without preconceived notions of what constitutes a retail space. The designed protocol was tested with eight participants during 209 hours of day-long field trips, immersing the researcher into the shopper's lived experience by combining multiple data collection methods, including participant observation, videography, photography, and both pre-fieldwork and post-fieldwork interviews. By giving equal importance to both locations and connections, this study unpacked various spatial practices that contribute to the production of retail space. These findings highlight the relative inadequacy of some traditional retail space conceptualizations, which often fail to capture the fluid nature of contemporary shopping experiences. The study's emphasis on the customization process, through which shoppers optimize their retail experience by producing a “fully lived retail space,” offers a more comprehensive understanding of consumer shopping behavior in the digital age. In conclusion, this research presents a significant shift in the conceptualization of retail space. By employing a phenomenological approach rooted in Lefebvre’s theory, the study provides a more efficient framework to understand the retail experience in the age of mobile communication technologies. Although this research is limited by its small sample size and the demographic profile of participants, it offers valuable insights into the spatial practices of modern shoppers and their implications for retail researchers and retailers alike.

Keywords: shopper behavior, mobile telecommunication technologies, qualitative research, servicescape, servicespace

Procedia PDF Downloads 20
20172 Hidden Hot Spots: Identifying and Understanding the Spatial Distribution of Crime

Authors: Lauren C. Porter, Andrew Curtis, Eric Jefferis, Susanne Mitchell

Abstract:

A wealth of research has been generated examining the variation in crime across neighborhoods. However, there is also a striking degree of crime concentration within neighborhoods. A number of studies show that a small percentage of street segments, intersections, or addresses account for a large portion of crime. Not surprisingly, a focus on these crime hot spots can be an effective strategy for reducing community-level crime and related ills, such as health problems. However, research is also limited in an important respect. Studies tend to use official data to identify hot spots, such as 911 calls or calls for service. While the use of call data may be more representative of the actual level and distribution of crime than some other official measures (e.g. arrest data), call data still suffer from the 'dark figure of crime.' That is, there is most certainly a degree of error between crimes that occur and crimes that are reported to the police. In this study, we present an alternative method of identifying crime hot spots that does not rely on official data. In doing so, we highlight the potential utility of neighborhood insiders for identifying and understanding crime dynamics within geographic spaces. Specifically, we use spatial video and geo-narratives to record the crime insights of 36 police officers, ex-offenders, and residents of a high-crime neighborhood in northeast Ohio. Spatial mentions of crime are mapped to identify participant-identified hot spots, and these are juxtaposed with calls for service (CFS) data. While there are bound to be differences between these two sources of data, we find that one location in particular, a corner store, emerges as a hot spot for all three groups of participants. Yet it does not emerge when we examine the CFS data. A closer examination of the space around this corner store and a qualitative analysis of the narrative data reveal important clues as to why this store may indeed be a hot spot but not generate disproportionate calls to the police. In short, our results suggest that researchers who rely solely on official data to study crime hot spots may risk missing some of the most dangerous places.

Keywords: crime, narrative, video, neighborhood

Procedia PDF Downloads 236
20171 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation

Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner

Abstract:

This article discusses the impact of digitisation on business valuation. In order to become and remain 'digital', investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is problematic for a valuation that relies on predictable cash flows, fixed capital structures and a steady state. Digitisation does not make company valuation impossible, but traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition - In the future, company valuation will be neither art nor science, but craft. This requires not intuition, but experience and good tools. Digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline - At present, company valuations are always carried out on a case-by-case basis and for a specific key date. This will change with digitalisation and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently but can also be offered more frequently. Instead of calculating the value for a previous key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past - Past data will still be needed in the future, but its use will not be limited to monovalent time series or key figure analyses. The images of 'black swans' and the 'turkey illusion' have made clear that we build forecasts on too few data points from the past and underestimate the power of chance. Predictive planning can help here. (4) Convergence instead of residual value - Digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.

Keywords: business valuation, corporate finance, digitisation, disruption

Procedia PDF Downloads 132
20170 Density Based Traffic System Using PIC Microcontroller

Authors: Tatipamula Samiksha Goud, A. Naveena, M. Sresta

Abstract:

Traffic congestion is a major issue in many cities throughout the world, particularly in urban areas, and it is past time to switch from a fixed-timer mode to an automated system. The current traffic signalling system is a fixed-time system that is inefficient if one lane is busier than the others. A structure for an intelligent traffic control system is designed to address this issue. When traffic density is higher on one side of a junction, the signal's green time is extended in comparison to the regular time. This study suggests a technique in which the signal's time duration is assigned based on the amount of traffic present at the time. Infrared sensors can be used to do this.
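
The allocation rule itself is simple; the following Python sketch simulates it with hypothetical sensor counts and timing constants (on the actual PIC microcontroller the same logic would be written in C against its I/O registers, which are not reproduced here).

```python
BASE_GREEN = 10      # seconds of green for a lane with normal traffic
EXTRA_PER_CAR = 2    # extra seconds per vehicle detected by the IR sensors
MAX_GREEN = 40       # upper bound so other lanes are never starved

def green_time(ir_count):
    """Map the number of vehicles seen by a lane's IR sensors to a green duration."""
    return min(BASE_GREEN + EXTRA_PER_CAR * ir_count, MAX_GREEN)

def read_ir_sensors(lane):
    """Placeholder for the microcontroller's sensor read; returns a vehicle count."""
    demo_counts = {"north": 3, "east": 9, "south": 1, "west": 4}
    return demo_counts[lane]

def run_cycle(lanes=("north", "east", "south", "west")):
    for lane in lanes:
        duration = green_time(read_ir_sensors(lane))
        print(f"{lane}: green for {duration} s")
        # on hardware, this is where the LED outputs would be driven for `duration`

run_cycle()
```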

Keywords: infrared sensors, micro-controllers, LEDs, oscillators

Procedia PDF Downloads 140
20169 Improving Predictions of Coastal Benthic Invertebrate Occurrence and Density Using a Multi-Scalar Approach

Authors: Stephanie Watson, Fabrice Stephenson, Conrad Pilditch, Carolyn Lundquist

Abstract:

Spatial data detailing both the distribution and density of functionally important marine species are needed to inform management decisions. Species distribution models (SDMs) have proven helpful in this regard; however, models often focus only on species occurrences derived from spatially expansive datasets and lack the resolution and detail required to inform regional management decisions. Boosted regression trees (BRTs) were used to produce high-resolution SDMs (250 m) at two spatial scales, predicting probability of occurrence, abundance (count per sample unit), density (count per km²) and uncertainty for seven coastal seafloor taxa that vary in habitat usage and distribution, in order to examine prediction differences and implications for coastal management. We investigated whether small-scale, regionally focussed models (82,000 km²) can provide improved predictions compared to data-rich national-scale models (4.2 million km²). We explored the variability in predictions across model type (occurrence vs. abundance) and model scale to determine whether specific taxa models or model types are more robust to geographical variability. National-scale occurrence models correlated well with broad-scale environmental predictors, resulting in higher AUC (area under the receiver operating curve) and deviance explained scores; however, they tended to overpredict in the coastal environment and lacked spatially differentiated detail for some taxa. Regional models had lower overall performance, but for some taxa the spatial predictions were more differentiated at a localised ecological scale. National density models were often spatially refined and highlighted areas of ecological relevance, producing more useful outputs than regional-scale models. The utility of a two-scale approach aids the selection of the most optimal combination of models to create a spatially informative density model, as results contrasted for specific taxa between model type and scale. However, it is vital that robust predictions of occurrence and abundance are generated as inputs for the combined density model, as areas that do not spatially align between models can be discarded. This study demonstrates the variability in SDM outputs created over different geographical scales and highlights implications and opportunities for managers utilising these tools for regional conservation, particularly in data-limited environments.
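
A condensed sketch of the boosted-regression-tree workflow for a single taxon, using scikit-learn's gradient boosting as a stand-in for the BRT implementation in the study; the environmental predictors, occurrence rule and counts are simulated for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical 250 m grid cells: depth, sediment mud content, tidal current speed.
rng = np.random.default_rng(5)
n = 2000
env = np.column_stack([rng.uniform(0, 200, n),     # depth (m)
                       rng.uniform(0, 100, n),     # mud content (%)
                       rng.uniform(0, 1.5, n)])    # current speed (m/s)
presence = (env[:, 0] < 80) & (rng.random(n) > 0.3)        # toy occurrence rule
abundance = np.where(presence, rng.poisson(20, n), 0)      # counts per sample unit

X_tr, X_te, y_tr, y_te, a_tr, a_te = train_test_split(env, presence, abundance,
                                                      random_state=0)

# Occurrence model (probability of presence).
occ = GradientBoostingClassifier(n_estimators=500, learning_rate=0.01, max_depth=3)
occ.fit(X_tr, y_tr)
print("occurrence AUC:", round(roc_auc_score(y_te, occ.predict_proba(X_te)[:, 1]), 2))

# Abundance model fitted only where the taxon is present (hurdle-style).
abund = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01, max_depth=3)
abund.fit(X_tr[y_tr], a_tr[y_tr])

# Density surface = probability of occurrence x predicted abundance.
density = occ.predict_proba(X_te)[:, 1] * np.clip(abund.predict(X_te), 0, None)
print("predicted density range:", density.min().round(1), "to", density.max().round(1))
```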

Keywords: benthic ecology, spatial modelling, multi-scalar modelling, marine conservation

Procedia PDF Downloads 75
20168 Optimization of Water Pipeline Routes Using a GIS-Based Multi-Criteria Decision Analysis and a Geometric Search Algorithm

Authors: Leon Mortari

Abstract:

The Metropolitan East region of Rio de Janeiro state, Brazil, faces a historic water scarcity. Among the alternatives studied to resolve this situation, the possibility of adduction of the water available in the Lagoa de Juturnaíba reservoir to supply the region's municipalities stands out. The allocation of a linear engineering project must be based on an evaluation of different aspects, such as altitude, slope, proximity to roads, distance from watercourses, land use and occupation, and the physical and chemical features of the soil. This work aims to apply a multi-criteria model that combines geoprocessing techniques, decision-making, and a geometric search algorithm to optimize a hypothetical adductor system in the scenario of expanding the water supply system that serves this region, known as Imunana-Laranjal, using the Lagoa de Juturnaíba as the source. This study proposes the construction of a spatial database related to the evaluation criteria presented, the treatment and rasterization of these data, and the standardization and reclassification of this information in a Geographic Information System (GIS) platform. The methodology involves the integrated analysis of these criteria, using their relative importance defined by weighting them based on expert consultations and the Analytic Hierarchy Process (AHP) method. Three approaches are defined for weighting the criteria by AHP: the first treats all criteria as equally important, the second considers weighting based on a pairwise comparison matrix, and the third establishes a hierarchy based on the priority of the criteria. For each approach, a distinct group of weightings is defined. In the next step, map algebra tools are used to overlay the layers and generate cost surfaces that indicate the resistance to the passage of the adductor route, using the three groups of weightings. The Dijkstra algorithm, a geometric search algorithm, is then applied to these cost surfaces to find an optimized path within the geographical space, aiming to minimize resources, time, investment, maintenance, and environmental and social impacts.
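
A compact sketch of the two computational steps at the core of the method: AHP weighting from a pairwise comparison matrix, followed by a Dijkstra least-cost path over the weighted cost surface. The two toy criterion rasters, the comparison values and the grid size are illustrative assumptions, not the study's data.

```python
import numpy as np
import networkx as nx

def ahp_weights(pairwise):
    """Priority vector = principal eigenvector of the pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Toy pairwise matrix: slope judged 3x as important as distance-to-roads.
W = ahp_weights(np.array([[1.0, 3.0],
                          [1 / 3.0, 1.0]]))

# Toy normalised criterion rasters (0 = favourable, 1 = resistant).
rng = np.random.default_rng(6)
slope, roads = rng.random((20, 20)), rng.random((20, 20))
cost = W[0] * slope + W[1] * roads          # weighted overlay (map algebra)

# Least-cost path with Dijkstra on a 4-connected grid graph.
G = nx.grid_2d_graph(*cost.shape)
for u, v in G.edges:
    G[u][v]["weight"] = (cost[u] + cost[v]) / 2
path = nx.dijkstra_path(G, (0, 0), (19, 19), weight="weight")
print("route length (cells):", len(path))
```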

Keywords: geometric search algorithm, GIS, pipeline, route optimization, spatial multi-criteria analysis model

Procedia PDF Downloads 30
20167 Incorporating Spatial Selection Criteria with Decision-Maker Preferences of a Precast Manufacturing Plant

Authors: M. N. A. Azman, M. S. S. Ahamad

Abstract:

The Construction Industry Development Board of Malaysia has been actively promoting the use of precast manufacturing in the local construction industry over the last decade. In an era of rapid technological change, precast manufacturing contributes significantly to improving construction activities and ensuring sustainable economic growth. Current studies on the location decision of precast manufacturing plants aimed at enhancing local economic development are scarce. To address this gap, the present research establishes a new set of spatial criteria, such as attribute maps and preference weights, derived from a survey of local industry decision makers. These data represent the input parameters for the MCE-GIS site selection model, for which the weighted linear combination method is used. Verification tests on the model were conducted to determine the potential precast manufacturing sites in the state of Penang, Malaysia. The tests yield a predicted area of 12.87 acres located within a designated industrial zone. Although the model is developed specifically for precast manufacturing plants, it can nevertheless be applied to other types of industries by following the methodology and guidelines proposed in the present research.
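
The weighted linear combination at the heart of the MCE-GIS model reduces to a weighted sum of standardised criterion layers masked by constraint layers; a minimal sketch with hypothetical layers and weights follows.

```python
import numpy as np

def weighted_linear_combination(criteria, weights, constraints=None):
    """Suitability = sum_i (w_i * standardised criterion_i), masked by constraints."""
    weights = np.asarray(weights) / np.sum(weights)
    suitability = sum(w * layer for w, layer in zip(weights, criteria))
    if constraints is not None:
        suitability = np.where(constraints, suitability, 0.0)   # excluded areas score 0
    return suitability

# Hypothetical standardised layers (0-1): road access, land cost, labour availability.
rng = np.random.default_rng(7)
layers = [rng.random((50, 50)) for _ in range(3)]
industrial_zone = rng.random((50, 50)) > 0.4       # Boolean constraint layer

score = weighted_linear_combination(layers, weights=[0.5, 0.3, 0.2],
                                    constraints=industrial_zone)
print("best candidate cell:", np.unravel_index(score.argmax(), score.shape))
```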

Keywords: geographical information system, multi criteria evaluation, industrialised building system, civil engineering

Procedia PDF Downloads 286
20166 Spatial and Temporal Evaluations of Disinfection By-Products Formation in Coastal City Distribution Systems of Turkey

Authors: Vedat Uyak

Abstract:

Seasonal variations of trihalomethane (THM) and haloacetic acid (HAA) concentrations were investigated within three distribution systems of the coastal city of Istanbul, Turkey. Moreover, total trihalomethane and other organics concentrations were also analyzed. The investigation was based on an intensive 16-month (2009-2010) sampling program undertaken during the spring, summer, fall and winter seasons. Four THM species (chloroform, dichlorobromomethane, chlorodibromomethane, bromoform) and nine HAA species (the most commonly occurring ones being dichloroacetic acid (DCAA) and trichloroacetic acid (TCAA); the other compounds are monochloroacetic acid (MCAA), monobromoacetic acid (MBAA), dibromoacetic acid (DBAA), tribromoacetic acid (TBAA), bromochloroacetic acid (BCAA), bromodichloroacetic acid (BDCAA) and chlorodibromoacetic acid (CDBAA)), as well as other water quality and operational parameters, were monitored at points along the distribution system between the treatment plant and the system's extremity. The effects of coastal water sources, seasonal variation and spatial variation were examined. The results showed that THM and HAA concentrations vary significantly between treated waters and water in the distribution networks. When the water temperature exceeds 26°C in summer, the THM and HAA levels are 0.8 – 1.1 and 0.4 – 0.9 times higher than in treated water, respectively. When the water temperature is below 12°C in the winter, the measured THM and HAA concentrations at the system's extremity were very rarely higher than 100 μg/L and 60 μg/L, respectively. The highest THM concentrations occurred in the Buyukcekmece distribution system, with an average total THM concentration of 92 μg/L, whereas the lowest THM levels were observed in the Omerli distribution network, with a mean concentration of 7 μg/L. For HAA levels, the maximum concentrations were again observed in the Buyukcekmece distribution system, with an average total HAA concentration of 57 μg/L. The high spatial and seasonal variation of disinfection by-products in the drinking water of Istanbul was attributed to illegal wastewater discharges into the water supplies of the city.

Keywords: disinfection byproducts, drinking water, trihalomethanes, haloacetic acids, seasonal variation

Procedia PDF Downloads 150
20165 Evaluation of the Impact on Improvement of Bank Manager Decision Making

Authors: Farzane Sadatnia, Bahram Fathi

Abstract:

Today, public and private organizations alike have found that the management of key information related to the activities of their staff is part of their main essence and philosophy, and that management information systems are very helpful in this respect. Applied correctly, such systems can save organizations a great deal economically, including by reducing decision-making time, improving the quality of decision making, and cutting costs. An information system is a support system that can never replace logic and human reasoning, but it can be used to spread and provide resources, supply the necessary facilities, deliver better services to users, allocate budgets in a balanced way, determine strengths and weaknesses, and review previous plans against current decisions. Hence, this study attempts to evaluate the effect of an information system on the organization under review.

Keywords: information system, planning, organization, coordination, control

Procedia PDF Downloads 474
20164 The Temperature Degradation Process of Siloxane Polymeric Coatings

Authors: Andrzej Szewczak

Abstract:

The study of the effect of high temperatures on polymer coatings represents an important field of research into their properties. Polymers, as materials with numerous advantages (chemical resistance, ease of processing and recycling, corrosion resistance, low density and weight), are currently among the most widely used modern building materials, among others in resin concrete, plastic parts, and hydrophobic coatings. Unfortunately, polymers also have disadvantages that limit their usage: low resistance to high temperatures and brittleness. This applies in particular to thin and flexible polymeric coatings applied to other materials, such as steel and concrete, which degrade under varying thermal conditions. Research on improving this state of affairs includes methods of modifying the polymer composition, the structure, the conditioning conditions, and the polymerization reaction. At present, ways are sought to reproduce the actual environmental conditions in which the coating will operate after it has been applied to another material. These studies are difficult because of the need to adopt a proper model of the polymer's behaviour and to determine the phenomena occurring during temperature fluctuations. Temperature influence in the environment is, by its nature, a process extended in time. Studies typically involve measuring the variation of one or more physical and mechanical properties of such coatings over time. Based on these results, it is possible to determine the effects of temperature loading and to develop methods for improving the coatings' properties. This paper describes stability studies of silicone coatings deposited on the surface of a ceramic brick. The brick's surface was hydrophobized with two types of inorganic polymers: a nano-polymer preparation based on dialkyl siloxanes (series 1 - 5) and an aqueous silicone solution (series 6 - 10). In order to enhance the stability of the film formed on the brick's surface and to immunize it against variable temperature and humidity loading, nano-silica was added to the polymer. The right combination of the liquid polymer phase and the solid nano-silica phase was obtained by disintegration of the mixture by sonication. The changes in viscosity and surface tension of the polymers were determined, as these are the basic rheological parameters affecting the state and durability of the polymer coating. The coatings created on the brick surfaces were then subjected to a temperature loading of 100°C and to moisture by total immersion in water, in order to determine any water absorption changes caused by damage and degradation of the polymer film. The effect of moisture and temperature was determined by measuring (at a specified number of cycles) the changes in surface hardness (using the Vickers method) and the absorption of individual samples. On the basis of the obtained results, the degradation process of the polymer coatings, related to the changes in their durability over time, was determined.

Keywords: silicones, siloxanes, surface hardness, temperature, water absorption

Procedia PDF Downloads 242
20163 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks

Authors: Van Trieu, Shouhuai Xu, Yusheng Feng

Abstract:

Tracking attack trajectories can be difficult when there is limited information about the nature of the attack. It is even more difficult when the attack information is collected by Intrusion Detection Systems (IDSs), as current IDSs have some limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events but do not show how the events relate to each other or which event possibly caused another event to happen. Because of this, it is important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect the most probable attack events that can cause another event to occur in the system. Technically, given the time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect the relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools, such as IDSs, and would cost expert human analysts significant time, if they could be obtained at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends. In fact, for more than 85% of causal pairs, the average time difference between the causal and effect events, in both computed and observed data, is within 5 minutes. This result can be used as a preventive measure against future attacks. Although the forecast horizon may be short, from 0.24 seconds to 5 minutes, it is long enough to be used to design a prevention protocol to block those attacks.
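
A toy sketch of the causal-pair step: alert events are binned into per-port count series, and a partial-correlation-based conditional independence test flags lagged pairs whose dependence survives conditioning on a third series. This is a simplified stand-in for the framework's tests, and the alert data, ports and lags are simulated.

```python
import numpy as np
from scipy import stats

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out the conditioning series z."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return stats.pearsonr(rx, ry)

# Simulated per-minute alert counts on three ports; port 22 activity drives port 80
# activity two minutes later, while port 443 is independent background noise.
rng = np.random.default_rng(8)
n = 500
p22 = rng.poisson(2, n).astype(float)
p80 = np.roll(p22, 2) * 1.5 + rng.poisson(1, n)
p443 = rng.poisson(2, n).astype(float)

lag = 2
r, pval = partial_corr(p22[:-lag], p80[lag:], p443[:-lag])
print(f"22 -> 80 at lag {lag} min: r={r:.2f}, p={pval:.1e}")   # should be significant
r, pval = partial_corr(p443[:-lag], p80[lag:], p22[:-lag])
print(f"443 -> 80 at lag {lag} min: r={r:.2f}, p={pval:.1e}")  # should not be
```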

Keywords: causality, multilevel graph, cyber-attacks, prediction

Procedia PDF Downloads 156
20162 Evaluation of the Spatial Regulation of Hydrogen Sulphide Producing Enzymes in the Placenta during Labour

Authors: F. Saleh, F. Lyall, A. Abdulsid, L. Marks

Abstract:

Background: Labour in humans is a complex biological process that involves interactions of neurological, hormonal and inflammatory pathways, with the placenta being a key regulator of these pathways. It is known that uterine contractions and labour pain cause physiological changes in gene expression in maternal and fetal blood, and in the placenta, during labour. Oxidative and inflammatory stress pathways are implicated in labour, and they may cause alterations in placental gene expression. Additionally, in placental tissues, labour increases the expression of genes involved in placental oxidative stress, inflammatory cytokines, angiogenic regulators and apoptosis. Recently, hydrogen sulphide (H2S) has been considered an endogenous gaseous mediator that promotes vasodilation and exhibits cytoprotective anti-inflammatory properties. Endogenous H2S is synthesised predominantly by two enzymes: cystathionine β-synthase (CBS) and cystathionine γ-lyase (CSE). As the H2S pathway has anti-oxidative and anti-inflammatory characteristics, we hypothesised that the expression of CBS and CSE in placental tissues would alter during labour. Methods: CBS and CSE expression was examined using western blotting and RT-PCR in the inner, middle and outer placental zones of placentas obtained from healthy non-labouring women who delivered by caesarean section. These were compared with the equivalent zones of placentas obtained from women who had uncomplicated labour and delivered vaginally. Results: No differences in CBS and CSE mRNA or protein levels were found between the different sites within placentas in either the labour or the non-labour group. There were no significant differences in either CBS or CSE expression between the two groups at the inner site or the middle site. However, at the outer site there was a highly significant decrease in CBS protein expression in the labour group compared to the non-labour group (p = 0.002). Conclusion: To the best of the authors' knowledge, this is the first report to suggest that CBS is expressed in a spatial manner within the human placenta. Further work is needed to clarify the precise function and mechanism of this spatial regulation, although it is likely that inflammatory pathway regulation is a complex process in which this plays a role.

Keywords: anti-inflammatory, hydrogen sulphide, labour, oxidative stress

Procedia PDF Downloads 240
20161 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution

Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi

Abstract:

Change point detection is an important part of data analysis. The presence of a change point refers to a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
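
A compact sketch of binary segmentation driven by a Schwarz-type criterion; for brevity the segment cost below uses a Gaussian mean-shift likelihood rather than the exponential Lomax likelihood fitted in the paper, so it illustrates the recursion and the penalised decision rule rather than the exact model.

```python
import numpy as np

def gaussian_cost(x):
    """Twice the negative log-likelihood (up to constants) of a constant-mean segment."""
    return len(x) * np.log(np.var(x) + 1e-12)

def best_split(x):
    costs = [gaussian_cost(x[:k]) + gaussian_cost(x[k:]) for k in range(2, len(x) - 2)]
    k = int(np.argmin(costs)) + 2
    return k, costs[k - 2]

def binary_segmentation(x, offset=0, penalty=None):
    """Recursively split while the split improves the Schwarz-type criterion."""
    if penalty is None:
        penalty = np.log(len(x))               # SIC/BIC penalty per extra change point
    if len(x) < 8:
        return []
    k, split_cost = best_split(x)
    if gaussian_cost(x) - split_cost > penalty:
        return (binary_segmentation(x[:k], offset, penalty)
                + [offset + k]
                + binary_segmentation(x[k:], offset + k, penalty))
    return []

# Simulated series with mean shifts at t = 120 and t = 260.
rng = np.random.default_rng(9)
series = np.concatenate([rng.normal(0.0, 1, 120),
                         rng.normal(1.5, 1, 140),
                         rng.normal(0.3, 1, 100)])
print("detected change points:", binary_segmentation(series))
```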

Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion

Procedia PDF Downloads 172
20160 Reconstructed Phase Space Features for Estimating Post Traumatic Stress Disorder

Authors: Andre Wittenborn, Jarek Krajewski

Abstract:

Trauma-related sadness in speech can alter the voice in several ways. The generation of non-linear aerodynamic phenomena within the vocal tract is crucial when analyzing trauma-influenced speech production. These phenomena include non-laminar flow and the formation of jets rather than well-behaved laminar flow. In particular, state-space reconstruction methods based on chaotic dynamics and fractal theory have been suggested to describe these aerodynamic, turbulence-related phenomena of the speech production system. To extract the non-linear properties of the speech signal, we used the time-delay embedding method to reconstruct the phase space from the scalar time series (reconstructed phase space, RPS). This approach results in the extraction of 7,238 features per .wav file (N = 47; 32 male, 15 female). The speech material was prompted by having participants talk about autobiographical, sadness-inducing experiences (sampling rate 16 kHz, 8-bit resolution). After combining these features in a support vector machine based machine learning approach (leave-one-sample-out validation), we achieved a correlation of r = .41 with the well-established self-report ground truth measure (RATS) of post-traumatic stress disorder (PTSD).
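
The core of the feature extraction is the time-delay embedding; the sketch below reconstructs the phase space of a signal and computes a few simple RPS descriptors. The delay, embedding dimension and descriptors are illustrative choices, not the exact 7,238-feature pipeline.

```python
import numpy as np

def time_delay_embedding(signal, dim=3, delay=10):
    """RPS: rows are delay vectors [x(t), x(t+tau), ..., x(t+(m-1)*tau)]."""
    n = len(signal) - (dim - 1) * delay
    return np.column_stack([signal[i * delay:i * delay + n] for i in range(dim)])

def rps_features(signal, dim=3, delay=10):
    """A few simple descriptors of the trajectory's spread in phase space."""
    rps = time_delay_embedding(signal, dim, delay)
    centered = rps - rps.mean(axis=0)
    cov_eigs = np.linalg.eigvalsh(np.cov(centered.T))   # principal spreads of the attractor
    radius = np.linalg.norm(centered, axis=1)
    return np.concatenate([cov_eigs, [radius.mean(), radius.std()]])

# Hypothetical 1-second speech frame at 16 kHz (a noisy glottal-like oscillation).
rng = np.random.default_rng(10)
t = np.arange(16000) / 16000
frame = np.sin(2 * np.pi * 140 * t) + 0.3 * rng.normal(size=t.size)
print(rps_features(frame).round(3))
```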

Keywords: non-linear dynamics features, post traumatic stress disorder, reconstructed phase space, support vector machine

Procedia PDF Downloads 102
20159 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components

Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura

Abstract:

This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain-activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict the eight finger-movement directions using Sparse Logistic Regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied in two different settings, namely at the single-participant level and at the group level. For the single-participant level, the EEG dataset used in the first stage and the EEG dataset used in the second stage originated from the same participant. For the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination, without repetition, of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant level, which was significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group level, which was also significantly higher than the chance level (12.49 ± 0.01%). The classification accuracy within [–45°, 45°] of the true direction is 70.03 ± 8.14% for the single-participant level and 62.63 ± 6.07% for the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
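
A simplified sketch of the two-stage idea, with scikit-learn's FastICA and an L1-penalised logistic regression standing in for the ICA decomposition and Sparse Logistic Regression used in the study; the EEG recordings, selected components and labels are simulated.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n_channels, n_samples, n_trials = 32, 200, 160

# Stage 1: learn the unmixing matrix on a training recording.
train_eeg = rng.normal(size=(n_samples * n_trials, n_channels))
ica = FastICA(n_components=16, random_state=0, max_iter=1000)
ica.fit(train_eeg)
selected = [0, 3, 5, 7]                         # ICs judged to reflect brain activity

def trial_features(eeg_trials):
    """Unmix each trial with the stored matrix and keep the mean power of selected ICs."""
    feats = []
    for trial in eeg_trials:
        ics = ica.transform(trial)              # apply the saved unmixing matrix
        feats.append((ics[:, selected] ** 2).mean(axis=0))
    return np.array(feats)

# Simulated trials with eight movement-direction labels.
trials = rng.normal(size=(n_trials, n_samples, n_channels))
labels = rng.integers(0, 8, n_trials)
X = trial_features(trials)

# Stage 2: sparse (L1) multinomial logistic regression as the direction decoder.
clf = LogisticRegression(penalty="l1", solver="saga", C=0.5, max_iter=5000)
clf.fit(X[:120], labels[:120])
print("held-out accuracy:", round(clf.score(X[120:], labels[120:]), 2))
```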

Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding

Procedia PDF Downloads 137
20158 Shiva's Dance: Crisis, Local Institutions, and Private Firms

Authors: João Pereira Dos Santos

Abstract:

The uneven spatial distribution of start-ups and their respective survival may reflect comparative advantages resulting from the local institutional background. For the first time, we explore this idea using Data Envelopment Analysis (DEA) to assess the relative efficiency of Portuguese municipalities in this specific context. We depart from the related literature, where expenditure is perceived as a desirable input, by choosing a measure of fiscal responsibility and infrastructural variables in the first stage. Comparing results for 2006 and 2010, we find that mean performance decreased substantially 1) with the effects of the Global Financial Crisis, 2) as municipal population increases, and 3) as financial independence decreases. A second stage is then computed, employing a double-bootstrap procedure, to evaluate how the regional context outside the control of local authorities (e.g. demographic characteristics and political preferences) impacts efficiency.
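
For reference, the input-oriented CCR efficiency score underlying a DEA exercise of this kind can be computed for each municipality as a small linear programme; in the sketch below the inputs and outputs are invented toy numbers (in the study the inputs are a fiscal-responsibility measure and infrastructural variables).

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR score of unit o: min theta s.t. a frontier mix dominates it."""
    n, m = X.shape                 # n units, m inputs
    s = Y.shape[1]                 # s outputs
    c = np.r_[1.0, np.zeros(n)]    # decision variables: [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(m):             # sum_j lambda_j * x_ji <= theta * x_oi
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(s):             # sum_j lambda_j * y_jr >= y_or
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Toy data: 6 municipalities, 2 inputs (fiscal-responsibility proxy, infrastructure),
# 1 output (start-up survival rate).
X = np.array([[3., 5.], [2., 4.], [4., 6.], [3., 3.], [5., 7.], [2., 6.]])
Y = np.array([[20.], [18.], [22.], [15.], [21.], [19.]])
for o in range(len(X)):
    print(f"municipality {o}: efficiency = {ccr_efficiency(X, Y, o):.2f}")
```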

Keywords: entrepreneurship, political economy, public finance, accountability, crisis, efficiency, Portuguese municipalities

Procedia PDF Downloads 501
20157 When Mobile Work Creates More Discrimination

Authors: Marie-Therese Claes, Anett Hermann

Abstract:

With the advent of the web and information technology since the end of the 20ᵗʰ century, digitalization has revolutionized our everyday life, from shopping and dating to education and transportation. The world of work is one of the areas that has been most highly transformed, as the temporal and spatial limits of work have changed. The expansion of the internet, wireless networks, and easily portable devices such as laptop computers and mobile phones has enabled us to work from almost any place at any time. As a result, telework, which started in the 1950s and grew in the 1970s, rose steeply to a new level in the 21ˢᵗ century. Telework consists of various forms of work done outside the traditional workplace by using information technologies. The social distancing and lockdown measures taken to reduce the spread of the virus in many countries worldwide resulted in an increasing number of teleworkers and made 'working from home' synonymous with telework. Post-COVID-19, the number of teleworkers is still higher than before the pandemic, and interest in expanding teleworking has been growing too. Notwithstanding the advantages ushered in by telework, it also has a number of drawbacks that negatively affect organizations and employees. The intention of this piece of work is not to indicate a causal relationship between telework and discrimination. Our aim is to indicate some unintended and/or unnoticed deleterious effects of telework in reinforcing discrimination and to instigate discussion on how to mitigate these effects. To do so, this insight indicates how telework reinforces traditional gender roles and how organizational culture towards telework, and its accessibility to employees at different levels of the organizational hierarchy, opens the door to discrimination.

Keywords: mobile work, discrimination, gender roles, organizational culture

Procedia PDF Downloads 65
20156 Accessibility and Visibility through Space Syntax Analysis of the Linga Raj Temple in Odisha, India

Authors: S. Pramanik

Abstract:

Since the early ages, Hindu temples have been interpreted through various Vedic philosophies. These temples are visited by pilgrims who demonstrate the rituals and religious beliefs of communities, reflecting a variety of actions and behaviors. Darsana, a direct seeing, is part of the pilgrimage activity. During the process of Darsana, a devotee is prepared for entry into the temple to realize the cognizing Truth, culminating in visualizing the idol of God placed in the Garbhagriha (sanctum sanctorum). For this, the pilgrim must pass through a sequential arrangement of spaces. As they progress, the pilgrims visualize the spaces differently from various points of view. The viewpoints create a variety of spatial patterns in the minds of pilgrims coherent with Hindu philosophies. The space organization and its order are perceived through various techniques of spatial analysis. A temple has been chosen for the study as an example of Kalinga stylistic variations. This paper intends to demonstrate some visual patterns generated during the process of Darsana (visibility), and its accessibility, using Point Isovist Studies and Visibility Graph Analysis from the entrance (Simha Dwara) to the sanctum sanctorum (Garbhagriha).

Keywords: Hindu temple architecture, point isovist, space syntax analysis, visibility graph analysis

Procedia PDF Downloads 120
20155 Maxillofacial Trauma: A Case of Diacapitular Condylar Fracture

Authors: Krishna Prasad Regmi, Jun-Bo Tu, Cheng-Qun Hou, Li-Feng Li

Abstract:

Maxillofacial trauma in the pediatric group of patients is particularly challenging, as these patients differ significantly from adults as far as the facial skeleton is concerned. Mandibular condylar fractures are common presentations to hospitals across the globe and remain the most important cause of temporomandibular joint (TMJ) ankylosis. The etiology and epidemiology of pediatric trauma involving diacapitular condylar fractures (DFs) have been reported in large series of patients. Nevertheless, little is known about treatment protocols for DFs in children. Accordingly, the treatment modalities for the management of pediatric fractures also differ. We suggest following the PDA and intracapsular ABC classification of condylar fractures to increase the overall postoperative satisfaction level, bypassing the change in patients' subjective feelings from the preoperative to the postoperative condition. At the same time, the use of 3-D technology and surgical navigation may also increase treatment accuracy.

Keywords: maxillofacial trauma, diacapitular fracture, condylar fracture, PDA classification

Procedia PDF Downloads 268
20154 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data

Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms can support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore the ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. Estimated total lesion load (ml) and number of lesions were calculated from the corrected T1 and FLAIR using the LST toolbox. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance or the mean Euler angle. The WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with number of components = 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (rfe) for the SVM, to obtain a ranking of the most predictive variables. We then built two new classifiers on only the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and rfe-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
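
A condensed sketch of the classification-with-feature-selection step described above, using scikit-learn equivalents of the R workflow; the 37 x 15 network-signal matrix is simulated, and a linear kernel is used for the RFE step since recursive elimination needs coefficient-based rankings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split

# Simulated dataset: 37 subjects x 15 network mean signals (stand-in for the real data).
rng = np.random.default_rng(12)
X = rng.normal(size=(37, 15))
y = np.array([0] * 19 + [1] * 18)            # 19 controls, 18 early-MS patients
X[y == 1, 4] += 1.0                          # make one network (index 4) informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y,
                                          random_state=0)

# Random forest: rank networks by Gini importance, then refit on the top one.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
top_rf = np.argsort(rf.feature_importances_)[::-1][:1]
rf_top = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr[:, top_rf], y_tr)
print("RF  top network:", top_rf[0], "test accuracy:", rf_top.score(X_te[:, top_rf], y_te))

# SVM with recursive feature elimination; final classifier uses the RBF kernel.
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
top_svm = np.where(rfe.support_)[0]
svm_top = SVC(kernel="rbf").fit(X_tr[:, top_svm], y_tr)
print("SVM top network:", top_svm[0], "test accuracy:", svm_top.score(X_te[:, top_svm], y_te))
```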

Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine

Procedia PDF Downloads 240
20153 On the Impact of Oil Price Fluctuations on Stock Markets: A Multivariate Long-Memory GARCH Framework

Authors: Manel Youssef, Lotfi Belkacem

Abstract:

This paper employs multivariate long-memory GARCH models to simultaneously estimate mean and conditional variance spillover effects between oil prices and different financial markets. Since different financial assets are traded based on these market sector returns, it is important for financial market participants to understand the volatility transmission mechanism over time and across these series in order to make optimal portfolio allocation decisions. We examine weekly returns from January 1, 2003 to November 30, 2012 and find evidence of significant transmission of shocks and volatilities between oil prices and some of the examined financial markets. The findings support the idea of cross-market hedging and the sharing of common information by investors.
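
As a simplified stand-in for the multivariate long-memory GARCH estimation, the sketch below fits univariate GARCH(1,1) models per series with the arch package and then correlates the standardised residuals as a crude co-movement check; the DCC and long-memory (FI) components of the paper's models are not reproduced, and the returns are simulated.

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Simulated weekly returns for oil and a stock index with common shocks.
rng = np.random.default_rng(13)
n = 520
common = rng.normal(size=n)
oil = 0.8 * common + rng.normal(scale=0.6, size=n)
stocks = 0.5 * common + rng.normal(scale=0.8, size=n)
returns = pd.DataFrame({"oil": oil, "stocks": stocks})

std_resid = {}
for name, series in returns.items():
    res = arch_model(series, vol="Garch", p=1, q=1, mean="Constant").fit(disp="off")
    std_resid[name] = res.resid / res.conditional_volatility   # standardised residuals

z = pd.DataFrame(std_resid)
print("correlation of standardised residuals:")
print(z.corr().round(2))   # non-zero off-diagonals hint at cross-market co-movement
```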

Keywords: oil prices, stock indices returns, oil volatility, contagion, DCC-multivariate (FI) GARCH

Procedia PDF Downloads 530
20152 A Biomimetic Approach for the Multi-Objective Optimization of Kinetic Façade Design

Authors: Do-Jin Jang, Sung-Ah Kim

Abstract:

A kinetic façade responds to user requirements and environmental conditions. In designing a kinetic façade, kinetic patterns play a key role in determining its performance. This paper proposes a biomimetic method for the multi-objective optimization of kinetic façade design. An autonomous decentralized control system is combined with a flocking algorithm. The flocking agents react autonomously to sensor values and bring about kinetic patterns that change over time. A series of experiments were conducted to verify the potential and limitations of the flocking-based decentralized control. As a result, it showed the highest performance in balancing multiple objectives, such as solar radiation and openness, among the comparison group.
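
A minimal flocking (boids-style) sketch in which agent states, standing in for façade panel drivers, react to neighbours and to a sensor field; the solar-gradient field, weights and panel-angle mapping are illustrative assumptions, not the paper's control system.

```python
import numpy as np

rng = np.random.default_rng(14)
N, STEPS, RADIUS = 50, 100, 0.2
pos = rng.random((N, 2))                       # agent positions on a unit façade
vel = rng.normal(scale=0.01, size=(N, 2))      # agent velocities (drive panel angles)

def solar_gradient(p):
    """Hypothetical sensor field: radiation increases towards the right edge."""
    g = np.zeros_like(p)
    g[:, 0] = 1.0
    return g

for _ in range(STEPS):
    d = np.linalg.norm(pos[:, None] - pos[None, :], axis=2)
    nbr = (d < RADIUS) & (d > 0)
    for i in range(N):
        if nbr[i].any():
            cohesion = pos[nbr[i]].mean(axis=0) - pos[i]
            alignment = vel[nbr[i]].mean(axis=0) - vel[i]
            separation = (pos[i] - pos[nbr[i]]).mean(axis=0)
        else:
            cohesion = alignment = separation = np.zeros(2)
        avoid_sun = -solar_gradient(pos[i:i + 1])[0]   # steer away from high radiation
        vel[i] += 0.01 * cohesion + 0.05 * alignment + 0.02 * separation + 0.005 * avoid_sun
    vel = np.clip(vel, -0.02, 0.02)
    pos = (pos + vel) % 1.0                    # wrap around the façade grid

# Panel opening angle derived from each agent's speed (0 = closed, 90 = fully open).
angles = 90 * np.linalg.norm(vel, axis=1) / 0.03
print("mean panel opening:", round(angles.mean(), 1), "degrees")
```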

Keywords: biomimicry, flocking algorithm, autonomous decentralized control, multi-objective optimization

Procedia PDF Downloads 514