Search results for: spatial information network
11925 Humanising Hospital Retrofitting: The Case Study of Malaysia Public Hospitals
Authors: Nur Faridatull Syafinaz Ahmad Tajudin
Abstract:
A hospital is a setting where individuals who are ill or injured are treated and cared for by doctors and nurses. Sanatoriums are settings where people can receive treatment and rest, particularly when recovering from a protracted illness. According to the report, hospitals are primarily designed to meet the needs of medical personnel by maximising their functionality and workflow. Hospitals frequently do a poor job of determining the patients' physical and emotional requirements and expectations. The literature on hospital design has recently focused more on the seeming need to "humanise" medical facilities. Despite the popularity of this design objective, "humanising" a space has hardly ever been defined or critically examined. The term "humanistic design" covers a broad range of design elements and designer interpretations. In reality, the hospital's layout and design may have a massive effect on how patients feel, experience the space, and heal. Keywords: hospital retrofitting, hospital design, humanising hospital, spatial design
Procedia PDF Downloads 120
11924 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement
Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao
Abstract:
Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis methods to select the appropriate features. However, this feature analysis treats each feature independently and cannot fully utilize the information in the data. Besides, blindly reducing features loses a lot of useful information, resulting in unreliable results. Therefore, the focus of this paper is on providing a redundant feature removal approach for visual roughness measurement. In this paper, statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to extract the texture features of machined images effectively. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension and maintains the integrity of the original information. Finally, the relationship between the new features and roughness is established by a support vector machine (SVM). The experimental results show that the approach can effectively solve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness. Keywords: feature analysis, machine vision, PCA, surface roughness, SVM
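A minimal Python sketch of the pipeline this abstract describes: GLCM texture features plus simple grey-level statistics, fused by PCA and mapped to roughness with a support vector machine. The parameter values, the scikit-image ≥ 0.19 function names, and the use of SVR for the regression step are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19 spelling
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def glcm_features(gray_image):
    """Compute a small set of GLCM texture descriptors from an 8-bit grey image."""
    glcm = graycomatrix(gray_image, distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def statistical_features(gray_image):
    """First-order statistics of the grey-level distribution (mean, std, 3rd moment)."""
    g = gray_image.astype(float)
    return np.array([g.mean(), g.std(), ((g - g.mean()) ** 3).mean()])

def fit_roughness_model(images, roughness_values):
    """images: list of 2-D uint8 machined-surface images; roughness_values: measured Ra."""
    X = np.vstack([np.hstack([glcm_features(im), statistical_features(im)])
                   for im in images])
    # PCA fuses the redundant texture features into a few components,
    # which the support vector machine then maps to surface roughness.
    model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf"))
    model.fit(X, roughness_values)
    return model
```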
Procedia PDF Downloads 212
11923 Object Recognition System Operating from Different Type Vehicles Using Raspberry and OpenCV
Authors: Maria Pavlova
Abstract:
Nowadays, it is possible to mount a camera on different vehicles such as a quadcopter, train, airplane, etc. The camera can also be the input sensor of many different systems. That means object recognition, as an inseparable part of monitoring and control, can be a key part of most intelligent systems. The aim of this paper is to focus on the object recognition process during vehicle movement. During the vehicle's movement, the camera takes pictures of the environment without storing them in a database. In case the camera detects a special object (for example, a human or an animal), the system saves the picture and sends it to the work station in real time. This functionality will be very useful in emergency or security situations where it is necessary to find a specific object. In another application, the camera can be mounted at a crossroad with little pedestrian traffic; if one or more persons approach the road, the traffic lights turn green so they can cross. This paper presents a system that solves the aforementioned problems. The architecture of the object recognition system is presented, including the camera, the Raspberry platform, a GPS system, a neural network, software, and a database. The camera in the system takes the pictures. The object recognition is done in real time using the OpenCV library and the Raspberry microcontroller. An additional feature of the system is the ability to record the GPS coordinates of the captured object's position. The results of this processing are sent to a remote station, so the location of the specific object is known. By means of a neural network, the module can learn to solve problems using incoming data and become part of a bigger intelligent system. The present paper focuses on the design and integration of image recognition as a part of smart systems. Keywords: camera, object recognition, OpenCV, Raspberry
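A simplified sketch of the detect-then-transmit loop outlined above, assuming a classical OpenCV HOG person detector stands in for whichever detector the authors used on the Raspberry Pi; the GPS reader and the remote-station upload are placeholders for the paper's GPS/Data Base integration.

```python
import time
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def read_gps():
    # Placeholder: on the real device this would parse NMEA sentences
    # from the GPS receiver attached to the Raspberry Pi.
    return {"lat": 0.0, "lon": 0.0}

def send_to_station(frame, gps):
    # Placeholder for the real-time upload to the remote work station.
    cv2.imwrite(f"detection_{int(time.time())}.jpg", frame)
    print("object detected at", gps)

cap = cv2.VideoCapture(0)              # camera mounted on the moving vehicle
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) > 0:                 # only save/transmit when a person is found
        send_to_station(frame, read_gps())
cap.release()
```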
Procedia PDF Downloads 218
11922 A Taxonomy of the Informational Content of Virtual Heritage Serious Games
Authors: Laurence C. Hanes, Robert J. Stone
Abstract:
Video games have reached a point of huge commercial success as well as wide familiarity with audiences both young and old. Much attention and research have also been directed towards serious games and their potential learning affordances. It is little surprise that the field of virtual heritage has taken a keen interest in using serious games to present cultural heritage information to users, with applications ranging from museums and cultural heritage institutions, to academia and research, to schools and education. Many researchers have already documented their efforts to develop and distribute virtual heritage serious games. Although attempts have been made to create classifications of the different types of virtual heritage games (somewhat akin to the idea of game genres), no formal taxonomy has yet been produced to define the different types of cultural heritage and historical information that can be presented through these games at a content level, and how the information can be manifested within the game. This study proposes such a taxonomy. First the informational content is categorized as heritage or historical, then further divided into tangible, intangible, natural, and analytical. Next, the characteristics of the manifestation within the game are covered. The means of manifestation, level of demonstration, tone, and focus are all defined and explained. Finally, the potential learning outcomes of the content are discussed. A demonstration of the taxonomy is then given by describing the informational content and corresponding manifestations within several examples of virtual heritage serious games as well as commercial games. It is anticipated that this taxonomy will help designers of virtual heritage serious games to think about and clearly define the information they are presenting through their games, and how they are presenting it. Another result of the taxonomy is that it will enable us to frame cultural heritage and historical information presented in commercial games with a critical lens, especially where there may not be explicit learning objectives. Finally, the results will also enable us to identify shared informational content and learning objectives between any virtual heritage serious and/or commercial games.Keywords: informational content, serious games, taxonomy, virtual heritage
Procedia PDF Downloads 367
11921 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using Sara Algorithm
Authors: Muhammad Bilal, Zhongfeng Qiu
Abstract:
Aerosols, suspended particles in the atmosphere, play an important role in the earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties such as aerosol optical depth (AOD) at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the geostationary ocean color imager (GOCI) using the simplified aerosol retrieval algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-the-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the aerosol robotic network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15AOD)). Results showed that the high spatiotemporal GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, RMSE of 0.07, and RPME of 5%, and 90% of the observations were within the EE. The results suggest that the SARA is robust and has the ability to retrieve high-resolution spatiotemporal AOD observations over urban areas using a geostationary satellite. Keywords: AERONET, AOD, SARA, GOCI, Beijing
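A small sketch of the validation statistics quoted above (R, RMSE, the fraction of retrievals inside the expected error envelope EE = ±(0.05 + 0.15·AOD), and a percent-mean-error measure), computed over collocated satellite and AERONET AODs. The arrays are illustrative inputs, and the exact RPME definition is an assumption.

```python
import numpy as np

def validate_aod(aod_satellite, aod_aeronet):
    """Compare collocated retrieved and ground-truth AOD values."""
    aod_satellite = np.asarray(aod_satellite, dtype=float)
    aod_aeronet = np.asarray(aod_aeronet, dtype=float)

    r = np.corrcoef(aod_satellite, aod_aeronet)[0, 1]
    rmse = np.sqrt(np.mean((aod_satellite - aod_aeronet) ** 2))
    ee = 0.05 + 0.15 * aod_aeronet                      # expected error envelope
    within_ee = np.mean(np.abs(aod_satellite - aod_aeronet) <= ee) * 100.0
    rpme = np.mean((aod_satellite - aod_aeronet) / aod_aeronet) * 100.0  # assumed definition
    return {"R": r, "RMSE": rmse, "percent_within_EE": within_ee, "RPME_percent": rpme}
```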
Procedia PDF Downloads 171
11920 Methods of Variance Estimation in Two-Phase Sampling
Authors: Raghunath Arnab
Abstract:
Two-phase sampling, which is also known as double sampling, was introduced in 1938. In two-phase sampling, samples are selected in phases. In the first phase, a relatively large sample is selected by some suitable sampling design, and only information on the auxiliary variable is collected. During the second phase, a sample is selected either from the sample selected in the first phase or from the entire population by using a suitable sampling design, and information regarding the study and auxiliary variables is collected. Evidently, two-phase sampling is useful if the auxiliary information is relatively easy and cheaper to collect than the study variable, and if the strength of the relationship between the study and auxiliary variables is high. If the sample is selected in more than two phases, the resulting sampling design is called multi-phase sampling. In this article, we consider how one can use data collected at the first phase at the stages of estimation of the parameter, stratification, selection of the sample, and their combinations in the second phase, in a unified setup applicable to any sampling design and wider classes of estimators. The problem of the estimation of variance will also be considered. The variance of an estimator is essential for estimating the precision of survey estimates, calculation of confidence intervals, determination of optimal sample sizes, and testing of hypotheses, amongst others. Although the variance is a non-negative quantity, its estimators may not be non-negative. If the estimator of variance is negative, then it cannot be used for estimation of confidence intervals, testing of hypotheses, or as a measure of sampling error. The non-negativity properties of the variance estimators will also be studied in detail. Keywords: auxiliary information, two-phase sampling, varying probability sampling, unbiased estimators
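A numerical sketch of one familiar special case of the setup discussed above: double sampling for ratio estimation under simple random sampling at both phases, with a common textbook approximation to the variance (finite-population corrections ignored). It is only an illustration of the kind of estimator the article treats in a unified setup, not the article's own derivation; the truncation at zero also illustrates the non-negativity issue the abstract raises.

```python
import numpy as np

def double_sampling_ratio(x_phase1, x_phase2, y_phase2):
    """x_phase1: auxiliary values on the large first-phase sample (size n1).
       x_phase2, y_phase2: auxiliary and study values on the second-phase subsample (size n)."""
    n1, n = len(x_phase1), len(y_phase2)
    xbar1 = np.mean(x_phase1)
    xbar, ybar = np.mean(x_phase2), np.mean(y_phase2)
    R = ybar / xbar
    y_est = R * xbar1                      # double-sampling ratio estimate of the population mean

    s_y2 = np.var(y_phase2, ddof=1)
    s_x2 = np.var(x_phase2, ddof=1)
    s_xy = np.cov(x_phase2, y_phase2, ddof=1)[0, 1]
    # Approximate variance estimator (subsample drawn from the first-phase sample):
    v_est = s_y2 / n1 + (1.0 / n - 1.0 / n1) * (s_y2 + R**2 * s_x2 - 2.0 * R * s_xy)
    # The plug-in estimator can turn out negative; truncating is one crude remedy.
    return y_est, max(v_est, 0.0)
```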
Procedia PDF Downloads 588
11919 Requirements for a Shared Management of State-Owned Building in the Archaeological Park of Pompeii
Authors: Maria Giovanna Pacifico
Abstract:
Maintenance, in Italy, is not yet a consolidated practice despite the benefits that could come from it. Among the main reasons are the lack of financial resources and personnel in the public administration and a general lack of knowledge about how to activate and manage preventive and planned maintenance. The experimentation suggests that users and tourists could be involved in the maintenance process, from the knowledge phase to the monitoring phase, by using mobile devices. The goal is to increase the quality of facility management for cultural heritage, prioritizing usage needs and limiting interference between the key stakeholders. The method simplifies the consolidated procedures for information systems, avoiding a loss in terms of quality and amount of information, by focusing on the users' requirements: management economy, user safety, accessibility, and by receiving feedback information to define a framework that will lead to predictive maintenance. This proposal was designed to be tested in the Archaeological Park of Pompeii on the state property asset. Keywords: asset maintenance, key stakeholders, Pompeii, user requirement
Procedia PDF Downloads 125
11918 Comparative Fragility Analysis of Shallow Tunnels Subjected to Seismic and Blast Loads
Authors: Siti Khadijah Che Osmi, Mohammed Ahmad Syed
Abstract:
Underground structures are crucial components which require detailed analysis and design. Tunnels, for instance, are massively constructed as transportation infrastructure and utility networks, especially in urban environments. Considering their prime importance to the economy and public safety, which cannot be compromised, any instability in these tunnels will be highly detrimental to their performance. Recent experience suggests that tunnels become vulnerable during earthquake and blast scenarios. However, a very limited number of studies have been carried out to understand the dynamic response and performance of underground tunnels under those unpredictable extreme hazards. In view of the importance of enhancing the resilience of these structures, the overall aim of the study is to evaluate the probabilistic future performance of shallow tunnels subjected to seismic and blast loads by developing a detailed fragility analysis. Critical non-linear time history numerical analyses using the sophisticated finite element software Midas GTS NX are presented alongside the current methods of analysis, taking into consideration structural typology, ground motion and explosive characteristics, the effect of soil conditions, and other associated uncertainties on the tunnel integrity which may ultimately lead to catastrophic failure of the structures. The proposed fragility curves for both extreme loadings are discussed and compared, providing significant information on the performance of the tunnel under extreme hazards which may be beneficial for future risk assessment and loss estimation. Keywords: fragility analysis, seismic loads, shallow tunnels, blast loads
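A generic sketch of how fragility curves of the kind discussed above are commonly parameterised: a lognormal CDF, P(damage | IM) = Φ(ln(IM/θ)/β), with median θ and dispersion β fitted by maximum likelihood to the binary exceed/not-exceed outcomes of the nonlinear time-history runs. The input arrays and the choice of intensity measure are illustrative; the paper's Midas GTS NX results would supply them, and the authors' fitting procedure may differ.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_fragility(im, exceeded):
    """im: intensity measure of each analysis (e.g. PGA or peak overpressure);
       exceeded: 1 if the damage state was reached in that run, else 0."""
    im = np.asarray(im, float)
    exceeded = np.asarray(exceeded, int)

    def neg_log_like(params):
        theta, beta = params
        p = norm.cdf(np.log(im / theta) / beta)
        p = np.clip(p, 1e-9, 1 - 1e-9)          # guard against log(0)
        return -np.sum(exceeded * np.log(p) + (1 - exceeded) * np.log(1 - p))

    res = minimize(neg_log_like, x0=[np.median(im), 0.4],
                   bounds=[(1e-6, None), (1e-3, 5.0)])
    return res.x                                 # (theta, beta)

def fragility(im, theta, beta):
    """Probability of reaching the damage state at intensity im."""
    return norm.cdf(np.log(np.asarray(im, float) / theta) / beta)
```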
Procedia PDF Downloads 343
11917 The Voluntary Audit of Semi-Annual Consolidated Financial Statements Decision and Accounting Conservatism
Authors: Shuofen Hsu, Ya-Yi Chao, Chao-Wei Li
Abstract:
This paper investigates the relationship between the voluntary audit (hereafter, VA) of semi-annual consolidated financial statements decision and accounting conservatism. In general, there are four kinds of auditors' assurance services, which include audit, review, agreed-upon procedure, and compliance engagements, based on the degree of assurance. VA work by auditors may not only deliver higher audit quality but also serve as an important signal of more reliable information than review work. In Taiwan, listed companies had to prepare semi-annual consolidated financial statements with auditors' review before 2012, but some of the listed companies voluntarily upgraded the assurance work from a review to an audit. Due to the adoption of International Financial Reporting Standards, listed companies have been required to prepare second-quarter consolidated financial statements reviewed by auditors since 2013. This rule changed some of the assurance work from audit to review, and information asymmetry may have increased. To control for selection bias, we use a two-stage model to test the relationship between the VA decision and accounting conservatism. Our empirical results indicate that the VA decision and accounting conservatism have a significant positive relationship in family-controlled firms. That is, family-controlled firms are more likely to undertake VA and to prepare more conservative consolidated financial statements to reduce information asymmetry, meaning that there is a complementary effect between VA and accounting conservatism for firms with more information asymmetry. On the contrary, we find that the VA decision and accounting conservatism have a significant negative relationship in firms controlled by professional managers, meaning that there is a substitution effect between VA and accounting conservatism for firms with less information asymmetry. Finally, the accounting conservatism of consolidated financial statements decreased after the adoption of IFRS (International Financial Reporting Standards) in Taiwan, meaning that the disclosure and transparency of consolidated financial statements have improved. Keywords: voluntary audit, accounting conservatism, audit quality, information asymmetry
Procedia PDF Downloads 226
11916 Co-Seismic Deformation Using InSAR Sentinel-1A: Case Study of the 6.5 Mw Pidie Jaya, Aceh, Earthquake
Authors: Jefriza, Habibah Lateh, Saumi Syahreza
Abstract:
The 2016 Mw 6.5 Pidie Jaya earthquake is one of the biggest disasters that have occurred in Aceh within the last five years. This earthquake caused severe damage to many infrastructures such as schools, hospitals, mosques, and houses in the district of Pidie Jaya and surrounding areas. Earthquakes commonly occur in Aceh Province because the Aceh–Sumatra region is located at the convergent boundary where the Sunda Plate is subducted beneath the Indo-Australian Plate. This convergence is responsible for the intensification of seismicity in this region. The plates converge obliquely at a rate of 63 mm per year, and the right-lateral component is accommodated by strike-slip faulting within Sumatra, mainly along the great Sumatran fault. This paper presents preliminary findings of an InSAR study aimed at investigating the co-seismic surface deformation pattern in Pidie Jaya, Aceh, Indonesia. Co-seismic surface deformation is the rapid displacement that occurs at the time of an earthquake, and co-seismic displacement mapping is required to study the behavior of seismic faults. InSAR is a powerful tool for measuring Earth surface deformation to a precision of a few centimetres. In this study, two radar images of the same area acquired at two different times are required to detect changes in the Earth's surface. The ascending and descending Sentinel-1A (S1A) synthetic aperture radar (SAR) data and the Sentinels Application Platform (SNAP) toolbox were used to generate the SAR interferogram. In order to visualize the InSAR interferometry, the S1A master (26 Nov 2016) and slave (26 Dec 2016) data sets were utilized as the main data source for mapping the co-seismic surface deformation. The results show that fringes of phase difference appear in the border region as a result of the movement detected with the interferometric technique. On the other hand, a dominant fringe pattern also appears near the coastal area; this is consistent with the field investigations conducted two days after the earthquake. However, the study also has limitations related to resolution and atmospheric artefacts in the SAR interferograms. The atmospheric artefacts are caused by changes in the atmospheric refractive index of the medium and, as a result, limit the ability to produce a coherent image; low coherence affects the ability to create fringes (movement is detected through fringes). The spatial resolution of the Sentinel satellite has not been sufficient for studying land surface deformation in this area. Further studies will be carried out using both ALOS and TerraSAR-X, which offer improved spatial resolution. Keywords: earthquake, InSAR, interferometric, Sentinel-1A
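A conceptual sketch of the arithmetic behind interferogram formation and the coherence issue mentioned above: the interferometric phase is the angle of the complex product of one co-registered single-look-complex image with the conjugate of the other, and a boxcar coherence estimate indicates how reliable the resulting fringes are. The authors perform these steps with the ESA SNAP toolbox; this is only the underlying computation, with illustrative array inputs.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def interferogram(master, slave):
    """master, slave: co-registered complex SLC arrays of equal shape."""
    ifg = master * np.conj(slave)
    return np.angle(ifg)            # wrapped phase fringes in (-pi, pi]

def coherence(master, slave, win=5):
    """Boxcar coherence estimate; low coherence means unreliable fringes."""
    cross = master * np.conj(slave)
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(master) ** 2, win) *
                  uniform_filter(np.abs(slave) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)
```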
Procedia PDF Downloads 196
11915 A Research on Inference from Multiple Distance Variables in Hedonic Regression Focus on Three Variables
Authors: Yan Wang, Yasushi Asami, Yukio Sadahiro
Abstract:
In an urban context, urban nodes such as amenities or hazards will certainly affect house prices, and classic hedonic analysis employs distance variables measured from each urban node. However, the estimated effects of distances to facilities on house prices generally do not represent their true contribution to the price of the property. Distance variables measured on the same surface suffer from a problem called multicollinearity, which usually manifests in regression as inflated variances and unstable coefficient estimates, i.e., errors caused by instability. In this paper, we provide a theoretical framework to identify and gather data with less bias, and we also provide a specific sampling method for locating the sample region so as to avoid the spatial multicollinearity problem in the three-distance-variable case. Keywords: hedonic regression, urban node, distance variables, multicollinearity, collinearity
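A short sketch of how the multicollinearity the paper warns about can be diagnosed before fitting a hedonic model: variance inflation factors for several distance variables measured on the same surface. The column names are illustrative placeholders for distances to three urban nodes, not the paper's variables.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def check_distance_collinearity(distances: pd.DataFrame):
    """distances: DataFrame with columns such as dist_station, dist_park, dist_hazard."""
    X = sm.add_constant(distances)
    vifs = {col: variance_inflation_factor(X.values, i)
            for i, col in enumerate(X.columns) if col != "const"}
    return vifs   # VIFs well above ~10 signal a problematic set of distance variables

def hedonic_fit(distances: pd.DataFrame, log_price):
    """Simple hedonic regression of (log) price on the distance variables."""
    X = sm.add_constant(distances)
    return sm.OLS(log_price, X).fit()
```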
Procedia PDF Downloads 465
11914 How Virtualization, Decentralization, and Network-Building Change the Manufacturing Landscape: An Industry 4.0 Perspective
Authors: Malte Brettel, Niklas Friederichsen, Michael Keller, Marius Rosenberg
Abstract:
The German manufacturing industry has to withstand an increasing global competition on product quality and production costs. As labor costs are high, several industries have suffered severely under the relocation of production facilities towards aspiring countries, which have managed to close the productivity and quality gap substantially. Established manufacturing companies have recognized that customers are not willing to pay large price premiums for incremental quality improvements. As a consequence, many companies from the German manufacturing industry adjust their production focusing on customized products and fast time to market. Leveraging the advantages of novel production strategies such as Agile Manufacturing and Mass Customization, manufacturing companies transform into integrated networks, in which companies unite their core competencies. Hereby, virtualization of the process- and supply-chain ensures smooth inter-company operations providing real-time access to relevant product and production information for all participating entities. Boundaries of companies deteriorate, as autonomous systems exchange data, gained by embedded systems throughout the entire value chain. By including Cyber-Physical-Systems, advanced communication between machines is tantamount to their dialogue with humans. The increasing utilization of information and communication technology allows digital engineering of products and production processes alike. Modular simulation and modeling techniques allow decentralized units to flexibly alter products and thereby enable rapid product innovation. The present article describes the developments of Industry 4.0 within the literature and reviews the associated research streams. Hereby, we analyze eight scientific journals with regards to the following research fields: Individualized production, end-to-end engineering in a virtual process chain and production networks. We employ cluster analysis to assign sub-topics into the respective research field. To assess the practical implications, we conducted face-to-face interviews with managers from the industry as well as from the consulting business using a structured interview guideline. The results reveal reasons for the adaption and refusal of Industry 4.0 practices from a managerial point of view. Our findings contribute to the upcoming research stream of Industry 4.0 and support decision-makers to assess their need for transformation towards Industry 4.0 practices.Keywords: Industry 4.0., mass customization, production networks, virtual process-chain
Procedia PDF Downloads 277
11913 How Manufacturing Firm Manages Information Security: Need Pull and Technology Push Perspective
Authors: Geuna Kim, Sanghyun Kim
Abstract:
This study investigates various factors that may influence the information security management (ISM) process, including the organization's internal needs and external pressure, and examines the role of regulatory pressure in ISM development and performance. The 105 sets of data collected in a survey were tested against the research model using structural equation modeling (SEM). The results indicate that need pull (NP) and technology push (TP) had positive effects on the ISM process, except for perceived benefits. Regulatory pressure had a positive effect on the relationship between ISM awareness and ISM development and performance. Keywords: information security management, need pull, technology push, regulatory pressure
Procedia PDF Downloads 297
11912 Monitoring of Forest Cover Dynamics in the High Atlas of Morocco (Zaouit Ahansal) Using Remote Sensing Techniques and GIS
Authors: Abdelaziz Moujane, Abedelali Boulli, Abdellah Ouigmane
Abstract:
The present work focuses on the assessment of forest landscape changes in the region of Zaouit Ahansal, using multitemporal satellite images at high spatial resolution. Several remote sensing methods were applied, namely a supervised classification algorithm and NDVI, which were combined in a GIS environment to quantify the extent and change in density of forest stands (holm oak, juniper, thuya, Aleppo pine, crops, and others). The results obtained showed that the forest of Zaouit Ahansal has undergone significant degradation, resulting in a decrease in the area of juniper, cedar, and zeen oak, as well as an increase in the area of bare soil and agricultural land. The remote sensing data provided satisfactory results for identifying and quantifying changes in forest cover. In addition, this study could serve as a reference for the development of management strategies and restoration programs. Keywords: remote sensing, GIS, satellite image, NDVI, deforestation, Zaouit Ahansal
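A minimal sketch of the NDVI step mentioned above, computed from the red and near-infrared bands of a scene; the thresholds used to separate dense forest, sparse cover, and bare soil are illustrative and would have to be calibrated against the Zaouit Ahansal imagery and the supervised classification results.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index from red and NIR band arrays."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def crude_cover_classes(ndvi_img):
    """Very rough density classes from NDVI alone (thresholds are assumptions)."""
    classes = np.zeros_like(ndvi_img, dtype=np.uint8)   # 0 = bare soil / built-up
    classes[(ndvi_img > 0.2) & (ndvi_img <= 0.5)] = 1   # sparse cover / crops
    classes[ndvi_img > 0.5] = 2                         # dense forest stands
    return classes
```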
Procedia PDF Downloads 153
11911 Stock Price Informativeness and Profit Warnings: Empirical Analysis
Authors: Adel Almasarwah
Abstract:
This study investigates the nature of the association between profit warnings and stock price informativeness in the context of Jordan as an emerging country. The analysis is based on the response of stock price synchronicity to the profit warning percentages published by Jordanian firms on the Amman Stock Exchange throughout the period spanning 2005–2016. The profit warning indicators relate negatively to stock price synchronicity in Jordanian firms, meaning that firms with a high proportion of profit warnings incorporate more firm-specific information into their stock prices. Robust regression was used rather than OLS as a parametric test to overcome the variance inflation factor (VIF) and heteroscedasticity issues recognised as having occurred when running the OLS regression; this enabled us to obtain stronger results that fall in line with our prediction that higher profit warnings encourage firm investors to collect and process more firm-specific information than common market information. Keywords: profit warnings, Jordanian firms, stock price informativeness, synchronicity
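A sketch of the two steps implied above: (i) stock price synchronicity for each firm-year from the R² of a market-model regression, SYNCH = ln(R²/(1−R²)), and (ii) a robust regression of synchronicity on the profit-warning measure and controls. "Robust regression" is taken here to mean an M-estimator (Huber); the authors may instead have used robust standard errors, and all variable names are illustrative.

```python
import numpy as np
import statsmodels.api as sm

def synchronicity(firm_returns, market_returns):
    """Market-model R^2 transformed into the usual synchronicity measure."""
    X = sm.add_constant(np.asarray(market_returns, float))
    r2 = sm.OLS(np.asarray(firm_returns, float), X).fit().rsquared
    r2 = np.clip(r2, 1e-6, 1 - 1e-6)
    return np.log(r2 / (1.0 - r2))

def robust_stage(synch, profit_warning, controls):
    """synch: firm-year synchronicity; profit_warning: warning percentage;
       controls: 2-D array of control variables."""
    X = sm.add_constant(np.column_stack([profit_warning, controls]))
    return sm.RLM(synch, X, M=sm.robust.norms.HuberT()).fit()
```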
Procedia PDF Downloads 142
11910 Survey on Data Security Issues Through Cloud Computing Amongst SMEs in Nairobi County, Kenya
Authors: Masese Chuma Benard, Martin Onsiro Ronald
Abstract:
Businesses have been using cloud computing more frequently recently because they wish to take advantage of its benefits. However, employing cloud computing also introduces new security concerns, particularly with regard to data security, potential risks and weaknesses that could be exploited by attackers, and the various tactics and strategies that could be used to lessen these risks. This study examines data security issues in cloud computing among SMEs in Nairobi County, Kenya. The study used a sample size of 48, and the research approach was mixed methods. The findings show that the data owner has no control over the cloud merchant's data management procedures, and there is no way to ensure that data is handled legally. This implies that owners lose control over the data stored in the cloud. Data and information stored in the cloud may face a range of availability issues due to internet outages; this can represent a significant risk to data kept in shared clouds. Concerns about integrity, availability, and secrecy were all reported. Keywords: data security, cloud computing, information, information security, small and medium-sized firms (SMEs)
Procedia PDF Downloads 84
11909 Content-Based Color Image Retrieval Based on the 2-D Histogram and Statistical Moments
Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed
Abstract:
In this paper, we are interested in the problem of finding similar images in a large database. For this purpose, we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window and not only on the intensity of the pixel. This approach can overcome the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the effects of the discretisation of the color space which is intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages: it is fast, consumes little memory, and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases. Keywords: 2-D histogram, statistical moments, indexing, similarity distance, histogram intersection
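A condensed sketch of this kind of retrieval pipeline: a 2-D hue-saturation histogram compared by histogram intersection, combined with the first three colour moments per channel. The paper's histogram additionally incorporates a 3x3 neighbourhood around each pixel, which this simplified version omits, and the weighting of the two measures is an assumption.

```python
import cv2
import numpy as np

def describe(image_bgr):
    """Compute a 2-D H-S histogram and colour moments for one image."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
    hist = cv2.normalize(hist, hist).flatten().astype(np.float32)
    moments = []
    for channel in cv2.split(hsv):                 # mean, std, skewness per channel
        c = channel.astype(float)
        moments += [c.mean(), c.std(), np.cbrt(((c - c.mean()) ** 3).mean())]
    return hist, np.array(moments)

def similarity(query_desc, candidate_desc, w=0.5):
    """Higher is more similar; w balances histogram vs. moment evidence (assumed)."""
    (h1, m1), (h2, m2) = query_desc, candidate_desc
    inter = cv2.compareHist(h1, h2, cv2.HISTCMP_INTERSECT)
    moment_dist = np.linalg.norm(m1 - m2)
    return w * inter - (1 - w) * moment_dist
```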
Procedia PDF Downloads 457
11908 Light-Entropy Continuum Theory
Authors: Christopher Restall
Abstract:
field causing attraction between mixed charges of matter during charge exchanges with antimatter. This asymmetry is caused by non-trinary quark amount variation in matter and anti-matter during entropy progression. This document explains how a circularity critique exercise assessed scientific knowledge and developed a unified theory from the information collected. The circularity critique creates greater intuition leaps than an individual would make naturally, and the information collected can be integrated and assessed thoroughly for correctness. Keywords: unified theory of everything, gravity, quantum gravity, standard model
Procedia PDF Downloads 41
11907 Data and Biological Sharing Platforms in Community Health Programs: Partnership with Rural Clinical School, University of New South Wales and Public Health Foundation of India
Authors: Vivian Isaac, A. T. Joteeshwaran, Craig McLachlan
Abstract:
The University of New South Wales (UNSW) Rural Clinical School has a strategic collaborative focus on chronic disease and public health. Our objectives are to understand rural environmental and biological interactions in vulnerable community populations. The UNSW Rural Clinical School translational model is a spoke-and-hub network. This spoke-and-hub model connects rural data and biological specimens with city-based collaborative public health research networks. Similar spoke-and-hub models are prevalent across research centers in India. The Australia-India Council grant was awarded so we could establish sustainable public health and community research collaborations. As part of the collaborative network, we are developing strategies around data and biological sharing platforms between the Indian Institute of Public Health, Public Health Foundation of India (PHFI), Hyderabad, and the Rural Clinical School UNSW. The key objective is to understand how research collaborations are conducted in India and how data can be shared and tracked with external collaborators such as ourselves. A framework to improve data sharing for research collaborations, including DNA, was proposed as a project outcome. The complexities of sharing biological data have been investigated via a visit to India. A flagship sustainable project between the Rural Clinical School UNSW and PHFI would illustrate a model of data sharing platforms. Keywords: data sharing, collaboration, public health research, chronic disease
Procedia PDF Downloads 450
11906 Ocean Planner: A Web-Based Decision Aid to Design Measures to Best Mitigate Underwater Noise
Authors: Thomas Folegot, Arnaud Levaufre, Léna Bourven, Nicolas Kermagoret, Alexis Caillard, Roger Gallou
Abstract:
Concern about the negative impacts of anthropogenic noise on the ocean's ecosystems has increased over recent decades. This concern has led to a similarly increased willingness to regulate noise-generating activities, of which shipping is one of the most significant. Dealing with ship noise requires not only knowledge about the noise from individual ships, but also about how ship noise is distributed in time and space within the habitats of concern. Marine mammals, but also fish, sea turtles, larvae, and invertebrates, are mostly dependent on the sounds they use to hunt, feed, avoid predators, socialize and communicate during reproduction, or defend a territory. In the marine environment, sight is only useful up to a few tens of meters, whereas sound can propagate over hundreds or even thousands of kilometers. Directive 2008/56/EC of the European Parliament and of the Council of June 17, 2008, called the Marine Strategy Framework Directive (MSFD), requires the Member States of the European Union to take the necessary measures to reduce the impacts of maritime activities in order to achieve and maintain a good environmental status of the marine environment. Ocean-Planner is a web-based platform that provides regulators, managers of protected or sensitive areas, and other stakeholders with a decision support tool that enables them to anticipate and quantify the effectiveness of management measures in terms of reduction or modification of the distribution of underwater noise, in response to Descriptor 11 of the MSFD and to the Marine Spatial Planning Directive. Based on the operational sound modelling tool Quonops Online Service, Ocean-Planner allows the user, via an intuitive geographical interface, to define management measures at local (marine protected area, Natura 2000 sites, harbors, etc.) or global (particularly sensitive sea area) scales, seasonal (regulation over a period of time) or permanent, partial (focused on some maritime activities) or complete (all maritime activities), etc. Speed limits, exclusion areas, traffic separation schemes (TSS), and vessel sound level limitation are among the measures supported by the tool. Ocean-Planner helps to decide on the most effective measure to apply to maintain or restore the biodiversity and functioning of the ecosystems of the coastal seabed, maintain a good state of conservation of sensitive areas, and maintain or restore the populations of marine species. Keywords: underwater noise, marine biodiversity, marine spatial planning, mitigation measures, prediction
Procedia PDF Downloads 122
11905 Application of IoTs Based Multi-Level Air Quality Sensing for Advancing Environmental Monitoring in Pingtung County
Authors: Men An Pan, Hong Ren Chen, Chih Heng Shih, Hsing Yuan Yen
Abstract:
Pingtung County is located in the southernmost region of Taiwan. During the winter season, pollutants that cannot disperse because of the downwash of the northeast monsoon lead to poor air quality in the County. Various control methods have been implemented, including air pollution permits, air pollution fees, control of oil fumes from the catering sector, smoke detection for diesel vehicles, regular inspection of locomotives, and subsidies for low-polluting vehicles. Moreover, to further mitigate air pollution, additional control strategies are also carried out, such as construction site controls, prohibition of open-air agricultural waste burning, improvement of river dust, and strengthening of road cleaning operations. The combined efforts have significantly reduced air pollutants in the County. However, in order to monitor the ambient air quality effectively and promptly, the County subsequently deployed micro-sensors: a total of 400 IoT (Internet of Things) micro-sensors for PM2.5 and VOC detection and 3 air quality monitoring stations of the Environmental Protection Agency (EPA), covering the 33 townships of the County. The covered area has more than 1,300 listed factories and 5 major industrial parks, thus forming an IoT-based multi-level air quality monitoring system. The IoT multi-level air quality sensors were combined with other strategies such as "sand and gravel dredging area technology monitoring", "banning open burning", "intelligent management of construction sites", "real-time notification of activation response", "nighthawk early bird plan with micro-sensors", "unmanned aircraft (UAV) combined with land and air to monitor abnormal emissions", and "animal husbandry odour detection service". The satisfaction rate for air quality control, according to a 2021 public survey, reached 81%, an increase of 46% compared to 2018. The total number of air pollution complaints for 2021 was 4,213, in contrast to 7,088 in 2020, a reduction of almost 41%. Because of the spatial-temporal coverage provided by the micro-sensor-based IoT air quality monitoring system, the system assists and strengthens the effectiveness of the existing EPA air quality monitoring network and provides real-time control of air quality. Therefore, hot spots and potential pollution locations can be determined in a timely manner for law enforcement. Hence, remarkable results were obtained over the two years; that is, both a reduction in public complaints and better air quality were successfully achieved through the implementation of the present IoT system for real-time air quality monitoring throughout Pingtung County. Keywords: IoT, PM, air quality sensor, air pollution, environmental monitoring
Procedia PDF Downloads 73
11904 Identification and Understanding of Colloidal Destabilization Mechanisms in Geothermal Processes
Authors: Ines Raies, Eric Kohler, Marc Fleury, Béatrice Ledésert
Abstract:
In this work, the impact of clay minerals on the formation damage of sandstone reservoirs is studied to provide a better understanding of the problem of deep geothermal reservoir permeability reduction due to fine particle dispersion and migration. In some situations, despite the presence of filters in the geothermal loop at the surface, particles smaller than the filter size (<1 µm) may surprisingly generate significant permeability reduction affecting in the long term the overall performance of the geothermal system. Our study is carried out on cores from a Triassic reservoir in the Paris Basin (Feigneux, 60 km Northeast of Paris). Our goal is to first identify the clays responsible for clogging, a mineralogical characterization of these natural samples was carried out by coupling X-Ray Diffraction (XRD), Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Spectroscopy (EDS). The results show that the studied stratigraphic interval contains mostly illite and chlorite particles. Moreover, the spatial arrangement of the clays in the rocks as well as the morphology and size of the particles, suggest that illite is more easily mobilized than chlorite by the flow in the pore network. Thus, based on these results, illite particles were prepared and used in core flooding in order to better understand the factors leading to the aggregation and deposition of this type of clay particles in geothermal reservoirs under various physicochemical and hydrodynamic conditions. First, the stability of illite suspensions under geothermal conditions has been investigated using different characterization techniques, including Dynamic Light Scattering (DLS) and Scanning Transmission Electron Microscopy (STEM). Various parameters such as the hydrodynamic radius (around 100 nm), the morphology and surface area of aggregates were measured. Then, core-flooding experiments were carried out using sand columns to mimic the permeability decline due to the injection of illite-containing fluids in sandstone reservoirs. In particular, the effects of ionic strength, temperature, particle concentration and flow rate of the injected fluid were investigated. When the ionic strength increases, a permeability decline of more than a factor of 2 could be observed for pore velocities representative of in-situ conditions. Further details of the retention of particles in the columns were obtained from Magnetic Resonance Imaging and X-ray Tomography techniques, showing that the particle deposition is nonuniform along the column. It is clearly shown that very fine particles as small as 100 nm can generate significant permeability reduction under specific conditions in high permeability porous media representative of the Triassic reservoirs of the Paris basin. These retention mechanisms are explained in the general framework of the DLVO theoryKeywords: geothermal energy, reinjection, clays, colloids, retention, porosity, permeability decline, clogging, characterization, XRD, SEM-EDS, STEM, DLS, NMR, core flooding experiments
Procedia PDF Downloads 176
11903 The Design of Information Technology System for Traceability of Thailand’s Tubtimjun Roseapple
Authors: Pimploi Tirastittam, Phutthiwat Waiyawuththanapoom, Sawanath Treesathon
Abstract:
As several countries import agricultural products from Thailand, those countries require Thailand to establish a traceability system. The traceability system is a tool to reduce risk in the supply chain in a very effective way, as it helps stakeholders in the supply chain to identify defect points, which reduces the cost of operation in the supply chain. This research aims to design a traceability system for the Tubtimjun roseapple for export to China, and it is qualitative research. The data were collected from experts in the Tubtimjun roseapple and fruit exporting industry, and the data were used to design the traceability system. The design of the Tubtimjun roseapple traceability system followed supply chain theory, from the upstream to the downstream of the supply chain, to support the processes and conditions of exporting, and covered database design, system architecture, user interface design, and the information technology of the traceability system. Keywords: design information, technology system, traceability, tubtimjun roseapple
Procedia PDF Downloads 170
11902 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform
Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry
Abstract:
The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing based on the R-tree and the domain-range entropy is proposed to improve the fault tolerance and load balancing algorithm by improving connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods. Keywords: grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems
Procedia PDF Downloads 490
11901 An Informative Marketing Platform: Methodology and Architecture
Authors: Martina Marinelli, Samanta Vellante, Francesco Pilotti, Daniele Di Valerio, Gaetanino Paolone
Abstract:
Any development in web marketing technology requires changes in information engineering to identify instruments and techniques suitable for the production of software applications for informative marketing. Moreover, for large web solutions, designing an interface that enables human interactions is a complex process that must bridge between informative marketing requirements and the developed solution. A user-friendly interface in web marketing applications is crucial for a successful business. The paper introduces mkInfo - a software platform that implements informative marketing. Informative marketing is a new interpretation of marketing which places the information at the center of every marketing action. The creative team includes software engineering researchers who have recently authored an article on automatic code generation. The authors have created the mkInfo software platform to generate informative marketing web applications. For each web application, it is possible to automatically implement an opt in page, a landing page, a sales page, and a thank you page: one only needs to insert the content. mkInfo implements an autoresponder to send mail according to a predetermined schedule. The mkInfo platform also includes e-commerce for a product or service. The stakeholder can access any opt-in page and get basic information about a product or service. If he wants to know more, he will need to provide an e-mail address to access a landing page that will generate an e-mail sequence. It will provide him with complete information about the product or the service. From this point on, the stakeholder becomes a user and is now able to purchase the product or related services through the mkInfo platform. This paper suggests a possible definition for Informative Marketing, illustrates its basic principles, and finally details the mkInfo platform that implements it. This paper also offers some Informative Marketing models, which are implemented in the mkInfo platform. Informative marketing can be applied to products or services. It is necessary to realize a web application for each product or service. The mkInfo platform enables the product or the service producer to send information concerning a specific product or service to all stakeholders. In conclusion, the technical contributions of this paper are: a different interpretation of marketing based on information; a modular architecture for web applications, particularly for one with standard features such as information storage, exchange, and delivery; multiple models to implement informative marketing; a software platform enabling the implementation of such models in a web application. Future research aims to enable stakeholders to provide information about a product or a service so that the information gathered about a product or a service includes both the producer’s and the stakeholders' point of view. The purpose is to create an all-inclusive management system of the knowledge regarding a specific product or service: a system that includes everything about the product or service and is able to address even unexpected questions.Keywords: informative marketing, opt in page, software platform, web application
Procedia PDF Downloads 127
11900 Evaluation of Symptoms, Laboratory Findings, and Natural History of IgE Mediated Wheat Allergy
Authors: Soudeh Tabashi, Soudabeh Fazeli Dehkordy, Masood Movahedi, Nasrin Behniafard
Abstract:
Introduction: Food allergy has increased in the last three decades. Since wheat is one of the major constituents of daily meals in many regions throughout the world, wheat allergy is one of the most important allergies, ranking among the 8 most common types of food allergy. Our information about the epidemiology and etiology of food allergies is limited. Therefore, in this study we sought to evaluate the symptoms and laboratory findings in children with wheat allergy. Materials and methods: Twenty-three patients aged up to 18 with a diagnosis of IgE-mediated wheat allergy were enrolled in this study. Using a questionnaire, we collected their information and organized it into 4 categories: demographic data, signs and symptoms, comorbidities, and laboratory data. Patients were then followed up for 6 months and their lab data were compared. Results: Most of the patients (82%) presented the symptoms of wheat allergy in the first year of their life. The skin and the respiratory system were the most commonly involved organs, with incidences of 86% and 78%, respectively. Most of the patients with wheat allergy were also sensitive to other types of food, and sensitivity to egg was the most common (47%). In 57% of patients, IgE levels decreased during the 6-month follow-up period. Conclusion: We do not have enough data on the epidemiology and response to therapy of wheat allergy, and to the best of our knowledge no study has addressed this issue in Iran so far. This study is the first source of information about IgE-mediated wheat allergy in Iran, and it can provide an opening for future studies about wheat allergy and its treatments. Keywords: wheat allergy, food allergy, IgE
Procedia PDF Downloads 194
11899 A Case Study Approach on Co-Constructing the Idea of 'Safety' with Children
Authors: Beng Zhen Yeow
Abstract:
In most work that involves children, the voice of the children is often not heard. This is ironic since many discussions involve their welfare and safety. It might seem natural that professionals should hear from them about what they wish for instead of deciding what is best for them. However, this is unfortunately more the exception than the norm, and hence in many instances children are merely 'subjects' in conversations about safety instead of active participants in the construction or creation of safety in the family. There might be many reasons why this does not happen in our work. Firstly, professionals have learnt to 'socialise' into their professional roles and hence, in the process, become 'un-childlike'. Secondly, there is a lack of professional training on how to talk with children. Finally, there might also be a lack of concrete tools and techniques developed to facilitate the process. In this paper, the case study method is used to show how the idea of safety can be concretised and discussed with children and their family members, hence making them active participants and co-creators of their own safety. Specific skills and techniques are highlighted through the case study. In this case, there was improvement in outcomes, such as no repeated offence or abuse. In addition, children were able to advocate for their own safety after six months of intervention, and family members were able to explicitly say what they could do to improve safety. The professionals in the safety network reported significant improvements. On top of that, the abused child who had been removed due to child protection concerns verbalized observations of change in her mother's parenting abilities and requested for home leave to begin, owing to her ownership of safety planning and her confidence in co-creating safety for her siblings and herself together with the professionals in the safety network. Children becoming active participants in the co-creation of safety not only allows them to own a 'voice' but at the same time gives them greater confidence to protect themselves at home and in other contexts outside of home. Keywords: partnering for safety, collaborative social work, family and systemic psychotherapy, child protection
Procedia PDF Downloads 120
11898 Forecasting Model to Predict Dengue Incidence in Malaysia
Authors: W. H. Wan Zakiyatussariroh, A. A. Nasuhar, W. Y. Wan Fairos, Z. A. Nazatul Shahreen
Abstract:
Forecasting dengue incidence in a population can provide useful information to facilitate the planning of public health interventions. Many studies on dengue cases in Malaysia have been conducted but are limited in modeling the outbreak and forecasting incidence. This article attempts to propose the most appropriate time series model to explain the behavior of dengue incidence in Malaysia for the purpose of forecasting future dengue outbreaks. Several seasonal auto-regressive integrated moving average (SARIMA) models were developed to model Malaysia's dengue incidence using weekly data collected from January 2001 to December 2011. The SARIMA (2,1,1)(1,1,1)52 model was found to be the most suitable model for Malaysia's dengue incidence, with the lowest values of the Akaike information criterion (AIC) and Bayesian information criterion (BIC) for in-sample fitting. The models were further evaluated for out-of-sample forecast accuracy using four different accuracy measures. The results indicate that SARIMA (2,1,1)(1,1,1)52 performed well for both in-sample fitting and out-of-sample evaluation. Keywords: time series modeling, Box-Jenkins, SARIMA, forecasting
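A brief sketch of fitting the seasonal model the article selects, SARIMA(2,1,1)(1,1,1)52, to weekly dengue counts with statsmodels and reading off the AIC/BIC used for in-sample selection. `weekly_cases` is a placeholder for the 2001–2011 weekly incidence series, and the estimation options are assumptions.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_sarima(weekly_cases: pd.Series):
    """Fit SARIMA(2,1,1)(1,1,1)_52 to a weekly incidence series."""
    model = SARIMAX(weekly_cases,
                    order=(2, 1, 1),
                    seasonal_order=(1, 1, 1, 52),
                    enforce_stationarity=False,
                    enforce_invertibility=False)
    result = model.fit(disp=False)
    print("AIC:", result.aic, "BIC:", result.bic)   # in-sample selection criteria
    return result

# Out-of-sample check: hold back the final year and forecast it, e.g.
# forecast = fit_sarima(weekly_cases[:-52]).forecast(steps=52)
```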
Procedia PDF Downloads 485
11897 Lean Comic GAN (LC-GAN): a Light-Weight GAN Architecture Leveraging Factorized Convolution and Teacher Forcing Distillation Style Loss Aimed to Capture Two Dimensional Animated Filtered Still Shots Using Mobile Phone Camera and Edge Devices
Authors: Kaustav Mukherjee
Abstract:
In this paper, we propose a neural style transfer solution: a lightweight separable-convolution-kernel-based GAN architecture (SC-GAN) that is very useful for designing filters for mobile phone cameras and edge devices, converting any image into the style of 2D animated comic movies such as He-Man, Superman, or The Jungle Book. This will help 2D animation artists by relieving them of creating new characters from images of real-life people, without endless hours of manually drawing each and every pose of a cartoon. It can even be used to create scenes from real-life images. This will greatly reduce the turnaround time for making 2D animated movies and decrease cost in terms of manpower and time. In addition, being extremely lightweight, it can be used as a camera filter capable of taking comic-style shots with a mobile phone camera or edge-device cameras such as the Raspberry Pi 4 or NVIDIA Jetson Nano. Existing methods like CartoonGAN, with a model size close to 170 MB, are too heavyweight for mobile phones and edge devices because of their scarce resources. In comparison with the current state of the art, our proposed method has a total model size of 31 MB, which clearly makes it ideal and ultra-efficient for designing camera filters on low-resource devices such as mobile phones, tablets, and edge devices running an OS or RTOS. Owing to the use of high-resolution input and a larger convolution kernel size, it produces richer-resolution comic-style pictures with 6 times fewer parameters and just 25 extra epochs of training on a dataset of fewer than 1,000 images, which breaks the myth that all GANs need a mammoth amount of data. Our network reduces the density of the GAN architecture by using depthwise separable convolution, which performs the convolution operation on each of the RGB channels separately; we then use a pointwise convolution to bring the network back to the required channel number using a 1x1 kernel. This reduces the number of parameters substantially and makes the network extremely lightweight and suitable for mobile phones and edge devices. The architecture presented in this paper makes use of parameterised batch normalization (Goodfellow et al., Deep Learning, "Optimization for Training Deep Models", p. 320), which allows the network to exploit the advantages of batch norm for easier training while maintaining non-linear feature capture through the learnable parameters. Keywords: comic stylisation from camera image using GAN, creating 2D animated movie style custom stickers from images, depth-wise separable convolutional neural network for light-weight GAN architecture for EDGE devices, GAN architecture for 2D animated cartoonizing neural style, neural style transfer for edge, model distillation, perceptual loss
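A minimal PyTorch sketch of the depthwise-separable building block described above: a depthwise 3x3 convolution (groups = channels) followed by a pointwise 1x1 convolution, which is what keeps the generator's parameter count small. The channel sizes and the use of InstanceNorm are illustrative choices, not the paper's exact generator or its parameterised batch normalization.

```python
import torch
import torch.nn as nn

class SeparableConv(nn.Module):
    """Depthwise 3x3 convolution followed by a pointwise 1x1 convolution."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size, stride=stride,
                                   padding=kernel_size // 2, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.norm = nn.InstanceNorm2d(out_ch, affine=True)   # learnable scale/shift
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.norm(self.pointwise(self.depthwise(x))))

# Parameter comparison against a standard convolution of the same shape:
standard = nn.Conv2d(128, 256, 3, padding=1, bias=False)
separable = SeparableConv(128, 256)
count = lambda m: sum(p.numel() for p in m.parameters())
print(count(standard), "vs", count(separable))   # separable version is roughly 8-9x smaller
```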
Procedia PDF Downloads 132
11896 Automated Distribution System Management: Substation Remote Diagnostic and Operation Solution for Obafemi Awolowo University
Authors: Aderonke Oluseun Akinwumi, Olusola A. Komolaf
Abstract:
This paper describes the wide array of challenges facing both electric utilities and consumers in the distribution systems of developing countries, using Obafemi Awolowo University, Ile-Ife, Nigeria as a case study. It also proffers a cost-effective solution through remote monitoring, diagnostics, and operation of distribution networks without compromising system reliability. As utilities move from manned and unintelligent networks to completely unmanned smart grids, switching activities at substations and feeders will be managed and controlled remotely by dedicated systems; hence this design. The Substation Remote Diagnostic and Operation Solution (sRDOS) remotely monitors the load on Medium Voltage (MV) and Low Voltage (LV) feeders as well as distribution transformers, and allows the utility to disconnect non-paying customers with absolutely no extra resource deployment and without interrupting supply to paying customers. The implementation of this design improved the lifetime of key distribution infrastructure by automatically isolating feeders during overload conditions and, more importantly, erring consumers. This increased the ratio of revenue generated on electricity bills to total network load. Keywords: electric utility, consumers, remote monitoring, diagnostic, system reliability, manned and unintelligent networks, unmanned smart grids, switching activities, medium voltage, low voltage, distribution transformer
Procedia PDF Downloads 130