Search results for: real-world data
22224 Simulation of Forest Fire Using Wireless Sensor Network
Authors: Mohammad F. Fauzi, Nurul H. Shahba M. Shahrun, Nurul W. Hamzah, Mohd Noah A. Rahman, Afzaal H. Seyal
Abstract:
In this paper, we propose a simulation system using a Wireless Sensor Network (WSN) distributed around the forest for early forest fire detection and for locating the affected areas. In Brunei Darussalam, approximately 78% of the nation is covered by forest. Since the forest is Brunei’s most precious natural asset, it is very important to protect and conserve it. The hot climate in Brunei Darussalam can lead to forest fires, which are a fatal threat to the preservation of the forest. The process consists of collecting data from the sensors, analyzing the data and producing an alert. The key factors analyzed are the surrounding temperature, wind speed and wind direction, and the humidity of the air and soil. Keywords: forest fire monitor, humidity, wind direction, wireless sensor network
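As a rough illustration of the sense-analyze-alert loop described in this abstract, the sketch below applies simple threshold rules to the measured factors. The threshold values, field names and the example reading are assumptions for illustration only, not values from the proposed system.

```python
# Minimal sketch of a threshold-based fire-risk alert (all thresholds are assumed values).

def fire_risk_alert(reading: dict) -> str:
    """Classify one sensor reading as 'ALERT', 'WARNING' or 'NORMAL'."""
    score = 0
    if reading["temperature_c"] > 35:      # hot surrounding air
        score += 2
    if reading["air_humidity_pct"] < 30:   # dry air
        score += 1
    if reading["soil_humidity_pct"] < 20:  # dry soil and litter
        score += 1
    if reading["wind_speed_ms"] > 8:       # strong wind spreads fire faster
        score += 1
    if score >= 4:
        return "ALERT"
    return "WARNING" if score >= 2 else "NORMAL"

# Example reading from one hypothetical node
print(fire_risk_alert({"temperature_c": 38, "air_humidity_pct": 22,
                       "soil_humidity_pct": 15, "wind_speed_ms": 9}))  # -> ALERT
```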
Procedia PDF Downloads 454
22223 Evaluation of Polyurethane-Bonded Particleboard Manufactured with Eucalyptus Sp. and Bi-Oriented Polypropylene Wastes
Authors: Laurenn Borges de Macedo, Fabiane Salles Ferro, Tiago Hendrigo de Almeida, Gérson Moreira de Lima, André Luiz Christoforo, Francisco Antonio Rocco Lahr
Abstract:
The growth of the furniture manufacturing industry is one of the fundamental factors contributing to the growth of the particleboard industry. Incorporating recycled products into particleboards can contribute to forest conservation while yielding a high-quality, sustainable product with low-cost production. This work investigates the effect of bi-oriented polypropylene (BOPP) waste particles and a sealing product on the physical and mechanical properties of Eucalyptus sp. particleboards fabricated with a castor-oil-based polyurethane resin. Among the factors, only the seal coating was statistically significant. The wood panels of Treatment 2 were classified as H1, based on the internal bond strength and elastic modulus data required by ANSI A208.1:1999. The bending strength data did not reach the minimum values recommended by NBR 14810:2006 and ANSI A208.1:1999. The thickness swelling data after 2 h of immersion in water met the standard requirement levels. High-density panels were achieved, revealing their potential use in a variety of particleboard applications. Keywords: BOPP, mechanical properties, particleboards, physical properties
Procedia PDF Downloads 372
22222 Specification of Requirements to Ensure Proper Implementation of Security Policies in Cloud-Based Multi-Tenant Systems
Authors: Rebecca Zahra, Joseph G. Vella, Ernest Cachia
Abstract:
The notion of cloud computing is rapidly gaining ground in the IT industry and is appealing mostly because it makes computing more adaptable and expedient whilst diminishing the total cost of ownership. This paper focuses on the software as a service (SaaS) architecture of cloud computing, which is used for the outsourcing of databases with their associated business processes. One approach for offering SaaS is basing the system’s architecture on multi-tenancy. Multi-tenancy allows multiple tenants (users) to make use of the same single application instance. Their requests and configurations might then differ according to specific requirements, met through tenant customisation of the software. Despite the known advantages, companies still feel uneasy about opting for multi-tenancy, with data security being a principal concern. The fact that multiple tenants, possibly competitors, would have their data located on the same server process and share the same database tables heightens the fear of unauthorised access. Security is a vital aspect which needs to be considered by application developers, database administrators, data owners and end users. This is further complicated in cloud-based multi-tenant systems, where boundaries must be established between tenants and additional access control models must be in place to prevent unauthorised cross-tenant access to data. Moreover, when altering the database state, the transactions need to strictly adhere to the tenant’s known business processes. This paper argues that security in cloud databases should not be considered an isolated issue; rather, it should be included in the initial phases of the database design and monitored continuously throughout the whole development process. This paper aims to identify a number of the most common security risks and threats specifically in the area of multi-tenant cloud systems. Issues and bottlenecks relating to security risks in cloud databases are surveyed. Some techniques which might be utilised to overcome them are then listed and evaluated. After a description and evaluation of the main security threats, this paper produces a list of software requirements to ensure that proper security policies are implemented by a software development team when designing and implementing a multi-tenant SaaS. This would then assist the cloud service providers to define, implement, and manage security policies as per tenant customisation requirements whilst assuring security for the customers’ data. Keywords: cloud computing, data management, multi-tenancy, requirements, security
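To make the cross-tenant access-control point concrete, the following sketch shows one common safeguard: scoping every query by a tenant identifier so one tenant cannot read another tenant's rows. It is a minimal illustration using an in-memory SQLite database; the table layout and function names are assumed for this example and are not taken from the paper.

```python
import sqlite3

# Minimal sketch: every data access is forced through a tenant-scoped query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                 [("tenantA", "Acme", 120.0), ("tenantB", "Globex", 99.5)])

def invoices_for(tenant_id: str):
    """Return only the rows belonging to the authenticated tenant."""
    # The tenant_id predicate is always added by the data-access layer,
    # never supplied as free-form SQL by the tenant itself.
    cur = conn.execute(
        "SELECT customer, amount FROM invoices WHERE tenant_id = ?", (tenant_id,))
    return cur.fetchall()

print(invoices_for("tenantA"))  # [('Acme', 120.0)]; tenantB's data is never exposed
```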
Procedia PDF Downloads 156
22221 The Environmental Impact of Wireless Technologies in Nigeria: An Overview of the IoT and 5G Network
Authors: Powei Happiness Kerry
Abstract:
The introduction of wireless technologies in Nigeria has improved the quality of life of Nigerians; however, not everyone sees it in that light. This paper summarizes the scholarly views on the impact of wireless technologies on the environment, focusing on 5G and the Internet of Things in Nigeria while also exploring the Technology Acceptance Model (TAM). The study used a qualitative research method to gather data from relevant sources and contextually draws inferences from the derived data. The study concludes that the Federal Government of Nigeria, before agreeing to any new development in wireless technologies, should weigh the implications and deliberate extensively with all stakeholders, taking into consideration the confirmation it will receive from the National Assembly. Keywords: Internet of Things, radiofrequency, electromagnetic radiation, information and communications technology, ICT, 5G
Procedia PDF Downloads 134
22220 Analysis of Global Social Responsibilities of Social Studies Pre-Service Teachers Based on Several Variables
Authors: Zafer Cakmak, Birol Bulut, Cengiz Taskiran
Abstract:
Technological advances, a shrinking world and a growing world population increase our interdependence with individuals we may never meet face to face. It is impossible for modern individuals to escape global developments and their impact, and it is very unlikely that global societies will turn back from the path they are on. These effects of globalization in fact burden humankind to a certain extent. We accept these responsibilities because we desire a better future, a habitable world and a more peaceful life. In the present study, the global responsibility levels of the participants were measured, and the significance of the reactions that individuals have to develop on global issues was reinterpreted in the light of the existing literature. The study was conducted with the general survey model, one of the survey methodologies. General survey models are surveys conducted on the whole universe, or on a group, sample or sampling taken from the universe, to arrive at a conclusion about a universe that includes a high number of elements. The study used data obtained from 350 pre-service teachers attending the 2016 spring semester to determine the 'Global Social Responsibility' levels of social studies pre-service teachers based on several variables. Collected data were analyzed using SPSS 21.0 software; t-tests and ANOVA were utilized in the data analysis. Keywords: social studies, globalization, global social responsibility, education
Procedia PDF Downloads 390
22219 Design and Evaluation of Production Performance Dashboard for Achieving Oil and Gas Production Target
Authors: Ivan Ramos Sampe Immanuel, Linung Kresno Adikusumo, Liston Sitanggang
Abstract:
Achieving the production targets of oil and gas in an upstream oil and gas company represents a complex undertaking necessitating collaborative engagement from a multidisciplinary team. In addition to conducting exploration activities and executing well intervention programs, an upstream oil and gas enterprise must assess the feasibility of attaining predetermined production goals. The monitoring of production performance serves as a critical activity to ensure organizational progress towards the established oil and gas performance targets. Subsequently, decisions within the upstream oil and gas management team are informed by the received information pertaining to the respective production performance. To augment the decision-making process, the implementation of a production performance dashboard emerges as a viable solution, providing an integrated and centralized tool. The deployment of a production performance dashboard manifests as an instrumental mechanism fostering a user-friendly interface for monitoring production performance, while concurrently preserving the intrinsic characteristics of granular data. The integration of diverse data sources into a unified production performance dashboard establishes a singular veritable source, thereby enhancing the organization's capacity to uphold a consolidated and authoritative foundation for its business requisites. Additionally, the heightened accessibility of the production performance dashboard to business users constitutes a compelling substantiation of its consequential impact on facilitating the monitoring of organizational targets.Keywords: production, performance, dashboard, data analytics
Procedia PDF Downloads 71
22218 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents, following the AVPE (Agent Virtual Processing Element) model, in order to provide the processing resources needed for big data image segmentation. In this work, we focus on applying this algorithm to a large MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated in the mobile agent team leader and split into (m x n) pixels, one per AVPE. Each AVPE performs and exchanges the segmentation results and maintains asynchronous communication with its team leader until the algorithm converges. Some interesting experimental results are obtained in terms of accuracy and efficiency of the proposed implementation, thanks to the several useful skills of mobile agents introduced in this distributed computational model. Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing
Procedia PDF Downloads 429
22217 European Food Safety Authority (EFSA) Safety Assessment of Food Additives: Data and Methodology Used for the Assessment of Dietary Exposure for Different European Countries and Population Groups
Authors: Petra Gergelova, Sofia Ioannidou, Davide Arcella, Alexandra Tard, Polly E. Boon, Oliver Lindtner, Christina Tlustos, Jean-Charles Leblanc
Abstract:
Objectives: To assess chronic dietary exposure to food additives in different European countries and population groups. Method and Design: The European Food Safety Authority’s (EFSA) Panel on Food Additives and Nutrient Sources added to Food (ANS) estimates chronic dietary exposure to food additives with the purpose of re-evaluating food additives that were previously authorized in Europe. For this, EFSA uses concentration values (usage and/or analytical occurrence data) reported through regular public calls for data by food industry and European countries. These are combined, at individual level, with national food consumption data from the EFSA Comprehensive European Food Consumption Database including data from 33 dietary surveys from 19 European countries and considering six different population groups (infants, toddlers, children, adolescents, adults and the elderly). EFSA ANS Panel estimates dietary exposure for each individual in the EFSA Comprehensive Database by combining the occurrence levels per food group with their corresponding consumption amount per kg body weight. An individual average exposure per day is calculated, resulting in distributions of individual exposures per survey and population group. Based on these distributions, the average and 95th percentile of exposure is calculated per survey and per population group. Dietary exposure is assessed based on two different sets of data: (a) Maximum permitted levels (MPLs) of use set down in the EU legislation (defined as regulatory maximum level exposure assessment scenario) and (b) usage levels and/or analytical occurrence data (defined as refined exposure assessment scenario). The refined exposure assessment scenario is sub-divided into the brand-loyal consumer scenario and the non-brand-loyal consumer scenario. For the brand-loyal consumer scenario, the consumer is considered to be exposed on long-term basis to the highest reported usage/analytical level for one food group, and at the mean level for the remaining food groups. For the non-brand-loyal consumer scenario, the consumer is considered to be exposed on long-term basis to the mean reported usage/analytical level for all food groups. An additional exposure from sources other than direct addition of food additives (i.e. natural presence, contaminants, and carriers of food additives) is also estimated, as appropriate. Results: Since 2014, this methodology has been applied in about 30 food additive exposure assessments conducted as part of scientific opinions of the EFSA ANS Panel. For example, under the non-brand-loyal scenario, the highest 95th percentile of exposure to α-tocopherol (E 307) and ammonium phosphatides (E 442) was estimated in toddlers up to 5.9 and 8.7 mg/kg body weight/day, respectively. The same estimates under the brand-loyal scenario in toddlers resulted in exposures of 8.1 and 20.7 mg/kg body weight/day, respectively. For the regulatory maximum level exposure assessment scenario, the highest 95th percentile of exposure to α-tocopherol (E 307) and ammonium phosphatides (E 442) was estimated in toddlers up to 11.9 and 30.3 mg/kg body weight/day, respectively. Conclusions: Detailed and up-to-date information on food additive concentration values (usage and/or analytical occurrence data) and food consumption data enable the assessment of chronic dietary exposure to food additives to more realistic levels.Keywords: α-tocopherol, ammonium phosphatides, dietary exposure assessment, European Food Safety Authority, food additives, food consumption data
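The exposure calculation described above, combining occurrence levels per food group with individual consumption per kg body weight and then summarizing the mean and 95th percentile per survey and population group, can be sketched in a few lines of pandas. The table below is a toy example with assumed food groups and values, not EFSA data.

```python
import pandas as pd

# Toy individual consumption records (g of food per kg body weight per day) and
# assumed additive occurrence levels (mg additive per kg food) per food group.
consumption = pd.DataFrame({
    "individual": [1, 1, 2, 2, 3],
    "group":      ["toddlers", "toddlers", "adults", "adults", "adults"],
    "food_group": ["beverages", "desserts", "beverages", "sauces", "desserts"],
    "g_per_kg_bw_day": [40.0, 10.0, 25.0, 5.0, 8.0],
})
occurrence_mg_per_kg_food = {"beverages": 50.0, "desserts": 200.0, "sauces": 120.0}

# Exposure per record: occurrence (mg/kg food) x consumption (kg food per kg bw per day).
consumption["exposure_mg_per_kg_bw_day"] = (
    consumption["food_group"].map(occurrence_mg_per_kg_food)
    * consumption["g_per_kg_bw_day"] / 1000.0
)

# Total per individual, then mean and 95th percentile per population group.
per_individual = consumption.groupby(["group", "individual"])["exposure_mg_per_kg_bw_day"].sum()
summary = per_individual.groupby("group").agg(mean="mean", p95=lambda s: s.quantile(0.95))
print(summary)
```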
Procedia PDF Downloads 326
22216 Recommendations as a Key Aspect for Online Learning Personalization: Perceptions of Teachers and Students
Authors: N. Ipiña, R. Basagoiti, O. Jimenez, I. Arriaran
Abstract:
Higher education students are increasingly enrolling in online courses and are, at the same time, generating data about their learning process in those courses. Data collected in these technology-enhanced learning spaces can be used to identify patterns and, therefore, to offer recommendations and personalized courses to future online students. Moreover, recommendations are considered a key aspect of personalization in online learning. In this context, the aim of this paper is to explore the perceptions of higher education students and teachers towards receiving recommendations in online courses. The study was carried out with 322 students and 10 teachers from two different faculties (Engineering and Education) at Mondragon University. Online questionnaires and face-to-face interviews were used to gather data from the participants. Results from the questionnaires show that most of the students would like to receive recommendations in their online courses as a guide in their learning process. Findings from the interviews also show that teachers see recommendations as useful for their students’ learning process; however, teachers believe that specific pedagogical training is required. Conclusions can also be drawn regarding the importance of personalization in technology-enhanced learning. These findings have significant implications for those who train online teachers, since pedagogy should be the driving force and further training on the topic may be required. Therefore, further research is needed to better understand the impact of recommendations on online students’ learning process and to draw conclusions on pedagogical concerns. Keywords: higher education, perceptions, recommendations, online courses
Procedia PDF Downloads 267
22215 The Effect of Vertical Integration on Operational Performance: Evaluating Physician Employment in Hospitals
Authors: Gary Young, David Zepeda, Gilbert Nyaga
Abstract:
This study investigated whether vertical integration of hospitals and physicians is associated with better care for patients with cardiac conditions. A dramatic change in the U.S. hospital industry is the integration of hospitals and physicians through hospital acquisition of physician practices. Yet, there is little evidence regarding whether this form of vertical integration leads to better operational performance of hospitals. The study was conducted as an observational investigation based on a pooled, cross-sectional database. The study sample comprised hospitals in the State of California, and the time frame for the study was 2010 to 2012. The key performance measure was hospitals’ degree of compliance with performance criteria set out by the federal government for managing patients with cardiac conditions. These criteria relate to the types of clinical tests and medications that hospitals should follow for cardiac patients, but hospital compliance requires the cooperation of a hospital’s physicians. Data for this measure were obtained from a federal website that presents performance scores for U.S. hospitals. The key independent variable was the percentage of cardiologists that a hospital employs (versus cardiologists who are affiliated with but not employed by the hospital). Data for this measure were obtained from the State of California, which requires hospitals to report financial and operational data each year, including the number of employed physicians. Other characteristics of hospitals (e.g., information technology for cardiac care, volume of cardiac patients) were also evaluated as possible complements or substitutes for physician employment by hospitals. Additional sources of data included the American Hospital Association and the U.S. Census. Empirical models were estimated with generalized estimating equations (GEE). Findings suggest that physician employment is positively associated with better hospital performance for cardiac care. However, findings also suggest that information technology is a substitute for physician employment. Keywords: physician employment, hospitals, vertical integration, cardiac care
Procedia PDF Downloads 395
22214 Urbanization and Income Inequality in Thailand
Authors: Acumsiri Tantikarnpanit
Abstract:
This paper examines the relationship between urbanization and income inequality in Thailand during the period 2002–2020. It uses a panel of data for 76 provinces collected from Thailand’s National Statistical Office (Labor Force Survey: LFS), as well as geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night band (VIIRS-DNB) satellite, for nineteen selected years. The paper employs two different definitions to identify urban areas: 1) urban areas defined by Thailand's National Statistical Office (Labor Force Survey: LFS), and 2) urban areas estimated using nighttime light data from the DMSP and VIIRS-DNB satellites. The second method includes two sub-categories: 2.1) determining urban areas by calculating nighttime light density corresponding to a population density of 300 people per square kilometer, and 2.2) calculating urban areas based on nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis, based on Ordinary Least Squares (OLS), fixed effects, and random effects models, reveals a consistent U-shaped relationship between income inequality and urbanization. The findings demonstrate that urbanization, or population density, has a significant and negative impact on income inequality, whereas the square of urbanization shows a statistically significant positive impact. Additionally, there is a negative association between logarithmically transformed income and income inequality. The paper also proposes the inclusion of satellite imagery, geospatial data, and spatial econometric techniques in future studies to conduct quantitative analysis of spatial relationships. Keywords: income inequality, nighttime light, population density, Thailand, urbanization
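The U-shaped relationship reported above corresponds to a regression that includes both urbanization and its square. The sketch below shows one way such a pooled OLS specification could be written with statsmodels; the variable names and the toy data frame are assumptions for illustration, not the authors' actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy panel with assumed column names (province-year observations).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "gini": rng.uniform(0.3, 0.5, 200),     # income inequality
    "urban": rng.uniform(0.0, 1.0, 200),    # urbanization share
    "income": rng.uniform(5e4, 5e5, 200),   # provincial income
})

# Pooled OLS with a quadratic urbanization term: a negative coefficient on
# `urban` and a positive one on `urban**2` is consistent with a U shape.
model = smf.ols("gini ~ urban + I(urban**2) + np.log(income)", data=df).fit()
print(model.params)
```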
Procedia PDF Downloads 76
22213 Work Related Musculoskeletal Disorder: A Case Study of Office Computer Users in Nigerian Content Development and Monitoring Board, Yenagoa, Bayelsa State, Nigeria
Authors: Tamadu Perry Egedegu
Abstract:
Rapid growth in the use of electronic data has affected both the employee and the workplace. Our experience shows that jobs with multiple risk factors have a greater likelihood of causing Work-Related Musculoskeletal Disorders (WRMSDs), depending on the duration, frequency and/or magnitude of exposure to each. The study investigated musculoskeletal disorders among office workers; it is therefore important that ergonomic risk factors be considered in light of their combined effect in causing or contributing to WRMSDs. Fast technological growth in the use of electronic systems has affected both workers and the work environment. Awkward posture and long hours in front of visual display terminals can result in work-related musculoskeletal disorders (WRMSD). The study contributes to raising awareness of the causes and consequences of WRMSDs due to lack of ergonomics training. The study was conducted using an observational cross-sectional design. A sample of 109 respondents was drawn from the target population through the purposive sampling method. The sources of data were both primary and secondary: primary data were collected through questionnaires, and secondary data were sourced from journals, textbooks, and internet materials. Questionnaires were the main instrument for data collection and were designed in a YES or NO format according to the study objectives. Content validity approval was used to ensure that the variables were adequately covered. The reliability of the instrument was established through the test-retest method, yielding a reliability index of 0.84. The data collected from the field were analyzed with descriptive statistics (charts, percentages and means). The study found that the most affected body regions were the upper back, followed by the lower back, neck, wrist, shoulder and eyes, while the least affected body parts were the knee, calf and ankle. Furthermore, the prevalence of work-related musculoskeletal disorders was linked with long working hours (6-8 hrs per day), lack of back support on seats, glare on the monitor, inadequate regular breaks, and repetitive motion of the upper limbs and wrist when using the computer. Finally, based on these findings, some recommendations were made to reduce the prevalence of WRMSDs among office workers. Keywords: work related musculoskeletal disorder, Nigeria, office computer users, ergonomic risk factor
Procedia PDF Downloads 241
22212 Bank Loans and the Business Cycle: The Case of the Czech Republic
Authors: Libena Cernohorska, Jan Cernohorsky
Abstract:
This article aims to evaluate the impact of loans provided within the Czech banking sector on the growth of the Czech economy. The article is based on a review of current scientific findings with respect to bank loans and economic development, and on data from the Czech Statistical Office on the development of gross domestic product and from the Czech National Bank on the development of loans for the period 2004-2015. Links between the selected variables are tested using Granger causality tests. The calculated results confirm the hypothesis that loans affect economic growth, with a six-month delay, and thus correspond to standard economic findings and the results of most previous studies. Keywords: bank, business cycle, economic growth, loans
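For readers unfamiliar with the test used here, the sketch below shows how a Granger causality test between a loans series and a GDP series could be run with statsmodels; the synthetic quarterly data and the lag choice are assumptions for illustration, not the authors' dataset.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic quarterly growth rates standing in for a 2004-2015 style sample.
rng = np.random.default_rng(1)
loans_growth = rng.normal(0.02, 0.01, 48)
gdp_growth = 0.5 * np.roll(loans_growth, 2) + rng.normal(0.01, 0.005, 48)

# Column order matters: the test asks whether the *second* column (loans)
# helps predict the *first* column (GDP) beyond GDP's own lags.
data = pd.DataFrame({"gdp": gdp_growth, "loans": loans_growth})
grangercausalitytests(data[["gdp", "loans"]], maxlag=2)
```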
Procedia PDF Downloads 126
22211 Automatic Differential Diagnosis of Melanocytic Skin Tumours Using Ultrasound and Spectrophotometric Data
Authors: Kristina Sakalauskiene, Renaldas Raisutis, Gintare Linkeviciute, Skaidra Valiukeviciene
Abstract:
Cutaneous melanoma is a melanocytic skin tumour which has a very poor prognosis, as it is highly resistant to treatment and tends to metastasize. The thickness of a melanoma is one of the most important biomarkers for the stage of the disease, prognosis and surgery planning. In this study, we hypothesized that automatic analysis of spectrophotometric images and high-frequency ultrasonic 2D data can improve the differential diagnosis of cutaneous melanoma and provide additional information about tumour penetration depth. This paper presents a novel complex automatic system for non-invasive melanocytic skin tumour differential diagnosis and penetration depth evaluation. The system is composed of region-of-interest segmentation in spectrophotometric images and high-frequency ultrasound data, quantitative parameter evaluation, informative feature extraction and classification with a linear regression classifier. The segmentation of the melanocytic skin tumour region in the ultrasound image is based on the parametric integrated backscattering coefficient, while the segmentation of the optical image is based on Otsu thresholding. In total, 29 quantitative tissue characterization parameters were evaluated using ultrasound data (11 acoustical, 4 shape and 15 textural parameters), together with 55 quantitative features of dermatoscopic and spectrophotometric images (total melanin, dermal melanin, blood and collagen SIAgraphs acquired using the spectrophotometric imaging device SIAscope). In total, 102 melanocytic skin lesions (including 43 cutaneous melanomas) were examined using the SIAscope and an ultrasound system with a 22 MHz centre-frequency single-element transducer. The diagnosis and Breslow thickness (pT) of each melanocytic skin tumour were evaluated during routine histological examination after excision and used as a reference. The results of this study show that automatic analysis of spectrophotometric and high-frequency ultrasound data can improve the non-invasive classification accuracy of early-stage cutaneous melanoma and provide supplementary information about tumour penetration depth. Keywords: cutaneous melanoma, differential diagnosis, high-frequency ultrasound, melanocytic skin tumours, spectrophotometric imaging
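As a point of reference for the Otsu step mentioned above, the sketch below segments a synthetic grayscale image with scikit-image's Otsu threshold; the synthetic image stands in for a spectrophotometric channel and is an assumption for illustration only.

```python
import numpy as np
from skimage.filters import threshold_otsu

# Synthetic 2-D "optical" image: a darker lesion-like disc on a brighter background.
yy, xx = np.mgrid[0:128, 0:128]
image = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2, 60, 180).astype(float)
image += np.random.default_rng(0).normal(0, 10, image.shape)  # add noise

# Otsu picks the threshold that best separates the two intensity classes.
t = threshold_otsu(image)
lesion_mask = image < t          # lesion pixels are the darker class
print(f"threshold = {t:.1f}, lesion area = {lesion_mask.sum()} px")
```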
Procedia PDF Downloads 270
22210 Effect of Particles Size and Volume Fraction Concentration on the Thermal Conductivity and Thermal Diffusivity of Al2O3 Nanofluids Measured Using Transient Hot–Wire Laser Beam Deflection Technique
Authors: W. Mahmood Mat Yunus, Faris Mohammed Ali, Zainal Abidin Talib
Abstract:
In this study, we present new data for the thermal conductivity enhancement in four nanofluids containing 11, 25, 50 and 63 nm diameter aluminum oxide (Al2O3) nanoparticles in distilled water. The nanofluids were prepared using the single-step method (i.e. by dispersing the nanoparticles directly in the base fluid) and were sonicated in an ultrasonic device for approximately 7 hours. The transient hot-wire laser beam deflection technique was used to measure the thermal conductivity and thermal diffusivity of the prepared nanofluids, which were obtained by fitting the experimental data to numerical data simulated for aluminum oxide in distilled water. The results show that the thermal conductivity and thermal diffusivity of the nanofluids increase non-linearly as the particle size increases, while they increase linearly as the volume fraction concentration increases. We believe that the interfacial layer between solid and fluid is the main factor in the enhancement of thermal conductivity and thermal diffusivity of Al2O3 nanofluids in the present work. Keywords: transient hot wire-laser beam technique, Al2O3 nanofluid, particle size, volume fraction concentration
Procedia PDF Downloads 553
22209 Hydrology and Hydraulics Analysis of Beko Abo Dam and Appurtenant Structure Design, Ethiopia
Authors: Azazhu Wassie
Abstract:
This study evaluated the maximum design flood for appurtenant structure design using climatological and hydrological data analysis for the referenced study area. The maximum design flood is determined using flood frequency analysis; with this method, the peak discharge is 32,583.67 m3/s. Because the dam site does not coincide with the gauged station, the flow data are transferred to the dam site, and the peak discharge becomes 38,115 m3/s. The study was conducted in June 2023. The dam is built across a river to create a reservoir on its upstream side for impounding water, which is used for various purposes such as irrigation, hydropower, navigation and fishing. The total average volume of annual runoff is estimated to be 115.1 billion m3, and the total potential of the land for irrigation development can exceed 3 million ha. Keywords: dam design, flow duration curve, peak flood, rainfall, reservoir capacity, risk and reliability
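Flood frequency analysis of the kind described here typically fits an extreme-value distribution to annual peak flows and reads off the quantile for a chosen return period. The sketch below does this with a Gumbel fit in SciPy; the synthetic annual-maximum series and the 100-year return period are assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy.stats import gumbel_r

# Synthetic annual maximum discharges (m3/s) standing in for gauged records.
rng = np.random.default_rng(2)
annual_peaks = gumbel_r.rvs(loc=15000, scale=4000, size=40, random_state=rng)

# Fit the Gumbel (EV1) distribution and evaluate the 100-year design flood.
loc, scale = gumbel_r.fit(annual_peaks)
return_period = 100
q_design = gumbel_r.ppf(1 - 1 / return_period, loc=loc, scale=scale)
print(f"Estimated {return_period}-year peak discharge: {q_design:.0f} m3/s")
```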
Procedia PDF Downloads 29
22208 Feasibility Study of Wind Energy Potential in Turkey: Case Study of Catalca District in Istanbul
Authors: Mohammed Wadi, Bedri Kekezoglu, Mustafa Baysal, Mehmet Rida Tur, Abdulfetah Shobole
Abstract:
This paper presents a technical evaluation of the wind potential for present and future investments in Turkey, taking into account the feasibility of sites, installation, operation, and maintenance. The evaluation is based on hourly wind speed data measured at 30 m height for the Çatalca district over the three years 2008–2010. These data, obtained from the national meteorology station in Istanbul, Republic of Turkey, are analyzed in order to evaluate the feasibility of the wind power potential and to ensure an optimal selection of wind turbines for the area of interest. Furthermore, the data are extrapolated and analyzed at 60 m and 80 m, accounting for the variability of the roughness factor. The Weibull bi-parameter probability function is used to approximate the monthly and annual wind potential and power density based on three calculation methods, namely the approximated, the graphical and the energy pattern factor methods. The annual mean wind power densities were found to be 400.31, 540.08 and 611.02 W/m² for 30, 60, and 80 m heights, respectively. Simulation results show that the analyzed area is an appropriate place for constructing large-scale wind farms. Keywords: wind potential in Turkey, Weibull bi-parameter probability function, the approximated method, the graphical method, the energy pattern factor method, capacity factor
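One of the three calculation routes named above, the energy pattern factor method, can be written in a few lines: the factor is the ratio of the mean cubed speed to the cube of the mean speed, from which the Weibull shape k, scale c, and power density follow. The sketch below is a generic implementation on synthetic hourly wind speeds; the sample data and air density are assumptions for illustration, not the Çatalca measurements.

```python
import numpy as np
from math import gamma

# Synthetic hourly wind speeds (m/s) standing in for one year of 30 m measurements.
rng = np.random.default_rng(3)
v = rng.weibull(2.0, 8760) * 7.0

# Energy pattern factor method for the Weibull parameters.
epf = np.mean(v**3) / np.mean(v) ** 3          # energy pattern factor
k = 1.0 + 3.69 / epf**2                        # shape parameter
c = np.mean(v) / gamma(1.0 + 1.0 / k)          # scale parameter (m/s)

rho = 1.225                                    # assumed air density (kg/m^3)
power_density = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)   # W/m^2
print(f"k = {k:.2f}, c = {c:.2f} m/s, mean power density = {power_density:.1f} W/m^2")
```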
Procedia PDF Downloads 259
22207 Using Geographic Information Systems in the Desertification Risk’s Cartography: Case South of the Aurès Region, Algeria
Authors: Benmessaoud Hassen
Abstract:
The desertification sensitivity map of the southern Aurès region has been produced by crossing four thematic layers that can influence the desertification process. The approach is inspired by MEDALUS (Mediterranean Desertification and Land Use), which uses qualitative indices to delimit environmental zones sensitive to desertification. The cartographic information on vegetation, climate, soil and socioeconomic state is derived from cartographic data converted to numerical data, then captured, structured and managed by an algorithm dedicated to a geographic information system. For each type of information, the corresponding layer is divided into 3 or 4 classes, and the geometric mean of the four layers yields the sensitivity classes (ISD) of the different mapped environments. Keywords: information systems, thematic layers, the sensitivity to the desertification map, concept MEDALUS, South of Aurès
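In the MEDALUS approach referenced here, the per-pixel sensitivity index is the geometric mean of the quality indices of the thematic layers. The sketch below computes that combination for small raster layers with NumPy; the layer values and the class thresholds are synthetic, and the four-layer geometric mean is a generic MEDALUS-style formula rather than the exact weighting used in the study.

```python
import numpy as np

# Synthetic 3x3 quality-index rasters (values near 1 = good quality, near 2 = degraded).
vegetation = np.array([[1.2, 1.5, 1.8], [1.3, 1.6, 1.9], [1.1, 1.4, 2.0]])
climate    = np.array([[1.4, 1.4, 1.7], [1.5, 1.6, 1.8], [1.3, 1.5, 1.9]])
soil       = np.array([[1.1, 1.3, 1.6], [1.2, 1.5, 1.7], [1.0, 1.4, 1.8]])
socio      = np.array([[1.3, 1.5, 1.9], [1.4, 1.6, 2.0], [1.2, 1.5, 1.9]])

# MEDALUS-style sensitivity index: geometric mean of the four layers per pixel.
isd = (vegetation * climate * soil * socio) ** 0.25

# Reclassify into qualitative sensitivity classes (thresholds assumed for illustration).
classes = np.digitize(isd, bins=[1.3, 1.5, 1.7])   # 0 = low ... 3 = very high sensitivity
print(isd.round(2))
print(classes)
```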
Procedia PDF Downloads 423
22206 Electrical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption towards energy savings and energy efficiency. Non Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of the whole residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection, feature extraction, then general appliance modeling and identification at the final stage. The event detection stage is a core component of NILM process since event detection techniques lead to the extraction of appliance features. Appliance features are required for the accurate identification of the household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. Time-domain features extracted are used for tuning general appliance models for appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting areas of operation of each residential appliance based on the power demand. Then, detecting the time at which each selected appliance changes its states. In order to fit with practical existing smart meters capabilities, we work on low sampling data with a frequency of (1/60) Hz. The data is simulated on Load Profile Generator software (LPG), which was not previously taken into consideration for NILM purposes in the literature. LPG is a numerical software that uses behaviour simulation of people inside the house to generate residential energy consumption data. The proposed event detection method targets low consumption loads that are difficult to detect. Also, it facilitates the extraction of specific features used for general appliance modeling. In addition to this, the identification process includes unsupervised techniques such as DTW. To our best knowledge, there exist few unsupervised techniques employed with low sampling data in comparison to the many supervised techniques used for such cases. We extract a power interval at which falls the operation of the selected appliance along with a time vector for the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from extracted power, geometrical and statistical features. Afterwards, those formed signatures are used to tune general model types for appliances identification using unsupervised algorithms. This method is evaluated using both simulated data on LPG and real-time Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics using confusion matrix based metrics, considering accuracy, precision, recall and error-rate. The performance analysis of our methodology is then compared with other detection techniques previously used in the literature review, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).Keywords: electrical disaggregation, DTW, general appliance modeling, event detection
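Dynamic Time Warping, used above to match extracted load signatures to general appliance models, can be illustrated with a compact distance function. The sketch below is a plain NumPy implementation applied to two short synthetic power profiles; the example signatures are assumptions for illustration, not data from LPG or REDD.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

# Two synthetic 1/60 Hz power profiles (W): an observed event vs. a stored signature.
observed  = np.array([0, 5, 80, 85, 82, 80, 5, 0], dtype=float)
signature = np.array([0, 78, 84, 83, 79, 2, 0], dtype=float)
print(f"DTW distance = {dtw_distance(observed, signature):.1f}")
```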
Procedia PDF Downloads 78
22205 Re-Engineering Management Process in IRAN’s Smart Schools
Authors: M. R. Babaei, S. M. Hosseini, S. Rahmani, L. Moradi
Abstract:
Today, the quality and effectiveness of education and training systems are of great concern to the stakeholders and decision-makers of development in every country. In Iran, this concern is doubled for numerous reasons, so much so that over the past decade governments have hardly even covered the running costs of education. ICT is claimed to have the power to change the structure of education programmes, reduce costs, increase quality, make education systems and products consistent with the needs of the community, and move education towards practice. One of the areas that the introduction of information technology has fundamentally changed is education. The aim of this research is the re-engineering of the management process in smart schools; field studies were used to collect data in the form of interviews and a questionnaire survey. The statistical population of this research is Iran and the smart schools under its education system, and sampling was purposive. The data collection tool was a questionnaire composed of two parts, with 36 questions, each designating one of the factors affecting the management of smart schools. Each question also consists of two parts: the first designates the position of the factor in the management process, i.e. the management function it belongs to (planning, organizing, leading, controlling) according to Dabryn's classification, and the second examines the factors affecting the management of smart schools, classified on a Likert scale. The validity of the questions was approved by a group of experts and prominent university professors in the fields of information technology, management and re-engineering, and reliability was evaluated and approved using Cronbach's alpha. To analyse the data, descriptive statistics (frequency tables, mean, median, mode) were used for the factors rated on the Likert scale, and analysis of variance, non-parametric tests and the Friedman test were used to evaluate the assumptions. The research conclusions show that school performance is affected by the factors influencing the re-engineering of the management process in smart schools. Keywords: re-engineering, management process, smart school, Iran's school
Procedia PDF Downloads 244
22204 The Methodology of Out-Migration in Georgia
Authors: Shorena Tsiklauri
Abstract:
Out-migration is an important issue for Georgia, which since independence has lost one fifth of its population to emigration. During the Soviet period, out-migration from the USSR was almost impossible, and one of the most important instruments for regulating population movement within the Soviet Union was the system of compulsory residential registration, the so-called “propiska”. Since independence, there has been no such regulation of migration from Georgia. The majority of Georgian migrants go abroad on tourist visas and then overstay, becoming irregular labor migrants. The official migration statistics published for this period, based on the administrative system of population registration, were insignificant in terms of numbers and did not represent the real scope of these migration movements. This paper discusses the data quality and methodology of migration statistics in Georgia and answers the question: what is the real reason for the increase in immigration flows in the official numbers since the 2000s? Keywords: data quality, Georgia, methodology, migration
Procedia PDF Downloads 417
22203 Evaluation of Low Power Wi-Fi Modules in Simulated Ocean Environments
Authors: Gabriel Chenevert, Abhilash Arora, Zeljko Pantic
Abstract:
The major problem underwater acoustic communication faces is the low data rate due to low signal frequency. By contrast, the Wi-Fi communication protocol offers high throughput but limited operating range due to the attenuation effect of the sea and ocean medium. However, short-range near-field underwater wireless power transfer systems offer an environment where Wi-Fi communication can be effectively integrated to collect data and deliver instructions to sensors in underwater sensor networks. In this paper, low-power, low-cost off-the-shelf Wi-Fi modules are explored experimentally across four selected parameters, for different distances between units and different water salinities. The results reveal a shorter operating range and a stronger dependence on water salinity than reported so far for high-end Wi-Fi modules. Keywords: Wi-Fi, wireless power transfer, underwater communications, ESP
Procedia PDF Downloads 116
22202 Numerical Modelling of Wind Dispersal Seeds of Bromeliad Tillandsia recurvata L. (L.) Attached to Electric Power Lines
Authors: Bruna P. De Souza, Ricardo C. De Almeida
Abstract:
In some cities in the State of Parana, Brazil, and in other countries, atmospheric bromeliads (Tillandsia spp., Bromeliaceae) are considered weeds on trees, electric power lines, satellite dishes and other artificial supports. In this study, a numerical model was developed to simulate the wind dispersal of seeds of the species Tillandsia recurvata, with the objective of evaluating seed displacement in the city of Ponta Grossa, PR, Brazil, since the region is considered to be already infested. The model simulates the dispersal of each individual seed, integrating parameters from the atmospheric boundary layer (ABL) and the local wind simulated by the Weather Research and Forecasting (WRF) mesoscale atmospheric model for the 2012 to 2015 period. The dispersal model also incorporates the approximate number of bromeliads and source height data collected from the most infested electric power lines. The seeds' terminal velocity, an important input parameter that was not available in the literature, was measured in an experiment with fifty-one seeds of Tillandsia recurvata. Wind is the main dispersal agent acting on plumed seeds, whereas atmospheric turbulence is the determinant factor in transporting seeds to distances beyond 200 meters as well as in introducing random variability into the seed dispersal process. This variability was added to the model by applying an Inverse Fast Fourier Transform to the energy spectra of the wind velocity components, which are based on boundary-layer meteorology theory and estimated from micrometeorological parameters produced by the WRF model. Seasonal and annual wind means were obtained from the surface wind data simulated by WRF for Ponta Grossa, and the mean wind direction is assumed to be the most probable direction of the bromeliad seed trajectory. Moreover, the atmospheric turbulence effect and the dispersal distances were analyzed in order to identify likely regions of infestation around the Ponta Grossa urban area. It is important to mention that this model could be applied to any species and location as long as the seeds' biological data and meteorological data for the region of interest are available. Keywords: atmospheric turbulence, bromeliad, numerical model, seed dispersal, terminal velocity, wind
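The random variability described above is introduced by applying an inverse FFT to a prescribed energy spectrum of the wind velocity fluctuations with random phases. The sketch below shows that generic procedure in NumPy; the spectrum shape, mean wind speed and time step are placeholder assumptions, not quantities derived from the WRF output in the study.

```python
import numpy as np

# Generate a synthetic turbulent fluctuation time series from a target spectrum.
rng = np.random.default_rng(4)
n, dt = 1024, 1.0                       # samples and time step (s), assumed values
freqs = np.fft.rfftfreq(n, d=dt)

# Simple placeholder spectrum S(f) ~ f^(-5/3) in the inertial subrange.
spectrum = np.zeros_like(freqs)
spectrum[1:] = freqs[1:] ** (-5.0 / 3.0)

# Random phases give a different realization each run; amplitudes follow the spectrum.
amplitudes = np.sqrt(spectrum * n / (2 * dt))
phases = rng.uniform(0, 2 * np.pi, len(freqs))
u_prime = np.fft.irfft(amplitudes * np.exp(1j * phases), n=n)

u_mean = 5.0                                    # assumed mean wind speed (m/s)
u = u_mean + u_prime / np.std(u_prime)          # fluctuations rescaled to unit std
print(u[:5])
```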
Procedia PDF Downloads 141
22201 Analysis of Impact of Air Pollution over Megacity Delhi Due to Agricultural Biomass Burning in the Neighbouring States
Authors: Ankur P. Sati, Manju Mohan
Abstract:
The hazardous combination of smoke and pollutant gases, smog, is harmful to health. There is strong evidence that agricultural waste burning (AWB) in northern India leads to adverse air quality in Delhi and its surrounding regions. A severe smog episode was observed over Delhi, India, during November 2012, which resulted in very low visibility and various respiratory problems. Very high values of pollutants (PM10 as high as 989 µg m-3, PM2.5 as high as 585 µg m-3 and NO2 as high as 540 µg m-3) were measured all over Delhi during the smog episode. The Ultra Violet Aerosol Index (UVAI) from the Aura satellite and the Aerosol Optical Depth (AOD) are used in the present study, along with output trajectories from the HYSPLIT model and in-situ data. Satellite data also reveal that AOD and UVAI are always at their highest during the farm-fire period in the Punjab region of India, and that the extent of these farm fires may be increasing. It is observed that during the smog episode the AOD, UVAI, PM2.5 and PM10 values all surpassed those of the Diwali period (one of the most polluted events in the city) by a considerable amount at all stations across Delhi. The parameters derived from the remote sensing data and the ground-based observations at various stations across Delhi agree closely on the intensity of the smog episode. The analysis clearly shows that, under adverse meteorological conditions, regional pollution can contribute more to deteriorating air quality than local sources. Keywords: smog, farmfires, AOD, remote sensing
Procedia PDF Downloads 245
22200 Algorithmic Obligations: Proactive Liability for AI-Generated Content and Copyright Compliance
Authors: Aleksandra Czubek
Abstract:
As AI systems increasingly shape content creation, existing copyright frameworks face significant challenges in determining liability for AI-generated outputs. Current legal discussions largely focus on who bears responsibility for infringing works, be it developers, users, or entities benefiting from AI outputs. This paper introduces a novel concept of algorithmic obligations, proposing that AI developers be subject to proactive duties that ensure their models prevent copyright infringement before it occurs. Building on principles of obligations law traditionally applied to human actors, the paper suggests a shift from reactive enforcement to proactive legal requirements. AI developers would be legally mandated to incorporate copyright-aware mechanisms within their systems, turning optional safeguards into enforceable standards. These obligations could vary in implementation across international, EU, UK, and U.S. legal frameworks, creating a multi-jurisdictional approach to copyright compliance. This paper explores how the EU’s existing copyright framework, exemplified by the Copyright Directive (2019/790), could evolve to impose a duty of foresight on AI developers, compelling them to embed mechanisms that prevent infringing outputs. By drawing parallels to GDPR’s “data protection by design,” a similar principle could be applied to copyright law, where AI models are designed to minimize copyright risks. In the UK, post-Brexit text and data mining exemptions are seen as pro-innovation but pose risks to copyright protections. This paper proposes a balanced approach, introducing algorithmic obligations to complement these exemptions. AI systems benefiting from text and data mining provisions should integrate safeguards that flag potential copyright violations in real time, ensuring both innovation and protection. In the U.S., where copyright law focuses on human-centric works, this paper suggests an evolution toward algorithmic due diligence. AI developers would have a duty similar to product liability, ensuring that their systems do not produce infringing outputs, even if the outputs themselves cannot be copyrighted. This framework introduces a shift from post-infringement remedies to preventive legal structures, where developers actively mitigate risks. The paper also breaks new ground by addressing obligations surrounding the training data of large language models (LLMs). Currently, training data is often treated under exceptions such as the EU’s text and data mining provisions or U.S. fair use. However, this paper proposes a proactive framework where developers are obligated to verify and document the legal status of their training data, ensuring it is licensed or otherwise cleared for use. In conclusion, this paper advocates for an obligations-centered model that shifts AI-related copyright law from reactive litigation to proactive design. By holding AI developers to a heightened standard of care, this approach aims to prevent infringement at its source, addressing both the outputs of AI systems and the training processes that underlie them.Keywords: ip, technology, copyright, data, infringement, comparative analysis
Procedia PDF Downloads 19
22199 Descriptive Analysis of Community-Based Needs among Asylum Seekers in New England before and after COVID-19
Authors: Viknesh Kasthuri, Victoria Angenent-Mari, Jade Wexler
Abstract:
The COVID-19 pandemic dramatically altered the landscape of asylum medicine. Brown Human Rights Asylum Clinic (BHRAC) is a medical-student-run asylum clinic that provides pro-bono medical evaluations and forensic affidavits for individuals seeking asylum in New England. After the outbreak of COVID-19 in March 2020, BHRAC experienced numerous changes both in the number of clients requesting services and in the resource needs of these clients. Uniquely, BHRAC assesses the needs of clients during their affidavit interview and seeks to address these needs by connecting clients to local community organizations and resources. Data regarding the specific needs of clients span 2019 to the present day. Analysis of BHRAC's internal data suggested a small increase in requests for assistance with light and gas (from 5% of total resource requests pre-COVID to 11%), as well as a decrease in requests for mental health services (from 20% of resources pre-COVID to 13% post-COVID). Furthermore, BHRAC witnessed a decline in clinic volume during the second half of 2020. In short, our data suggest that the pandemic affected asylum seekers' access to medico-legal services and the resources they need. Future research with larger sample sizes and in other geographic locations is required to determine the holistic impact of the COVID-19 pandemic on asylum seekers. Keywords: asylum clinic, asylum medicine, COVID, social determinants of health
Procedia PDF Downloads 103
22198 Entropy Analysis in a Bubble Column Based on Ultrafast X-Ray Tomography Data
Authors: Stoyan Nedeltchev, Markus Schubert
Abstract:
By means of the ultrafast X-ray tomography facility, data were obtained at different superficial gas velocities UG in a bubble column (0.1 m in ID) operated with an air-deionized water system at ambient conditions. Raw reconstructed images were treated by both the information entropy (IE) and the reconstruction entropy (RE) algorithms in order to identify the main transition velocities in a bubble column. The IE values exhibited two well-pronounced minima at UG=0.025 m/s and UG=0.085 m/s identifying the boundaries of the homogeneous, transition and heterogeneous regimes. The RE extracted from the central region of the column’s cross-section exhibited only one characteristic peak at UG=0.03 m/s, which was attributed to the transition from the homogeneous to the heterogeneous flow regime. This result implies that the transition regime is non-existent in the core of the column.Keywords: bubble column, ultrafast X-ray tomography, information entropy, reconstruction entropy
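The information entropy used to flag the regime transitions above is, in essence, the Shannon entropy of the gray-level distribution of each reconstructed frame. The sketch below computes that quantity for a synthetic image stack with NumPy; the image data and the binning are assumptions for illustration, not the tomography data themselves.

```python
import numpy as np

def information_entropy(frame: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (in bits) of the gray-level distribution of one frame."""
    hist, _ = np.histogram(frame, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]                       # ignore empty bins
    return float(-np.sum(p * np.log2(p)))

# Synthetic stack of reconstructed cross-sections (values scaled to [0, 1]).
rng = np.random.default_rng(5)
stack = rng.beta(2.0, 5.0, size=(10, 64, 64))

entropies = [information_entropy(f) for f in stack]
print(np.round(entropies, 3))   # a minimum in such a series would mark a regime change
```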
Procedia PDF Downloads 391
22197 Beyond Personal Evidence: Using Learning Analytics and Student Feedback to Improve Learning Experiences
Authors: Shawndra Bowers, Allie Brandriet, Betsy Gilbertson
Abstract:
This paper highlights how Auburn Online's instructional designers leveraged student and faculty data to update and improve online course design and instructional materials. When designing and revising online courses, it can be difficult for faculty to know which strategies are most likely to engage learners and improve educational outcomes in a specific discipline. It can also be difficult to identify which metrics are most useful for understanding and improving teaching, learning, and course design. At Auburn Online, the instructional designers use a suite of data based on students' performance, participation, satisfaction, and engagement, as well as faculty perceptions, to inform sound learning and design principles that guide growth-mindset consultations with faculty. The consultations allow the instructional designer, along with the faculty member, to co-create an actionable course improvement plan. Auburn Online gathers learning analytics from a variety of sources that any instructor or instructional design team may have access to at their own institution. Participation and performance data, such as page views, assignment submissions, and aggregate grade distributions, are collected from the learning management system. Engagement data are pulled from the video hosting platform, including unique viewers, views and downloads, minutes delivered, and the average duration each video is viewed. Student satisfaction is obtained through a short survey embedded at the end of each instructional module; this survey is included in each course every time it is taught. The survey data are then analyzed by an instructional designer for trends and pain points in order to identify areas, such as course content and instructional strategies, that can be modified to better support student learning. This analysis, along with the instructional designer's recommendations, is presented in a comprehensive report to instructors in an hour-long consultation where instructional designers collaborate with the faculty member on how and when to implement improvements. Auburn Online has developed a triage strategy of priority 1 or 2 level changes that will be implemented in future course iterations. This data-informed decision-making process helps instructors focus on what will work best in their teaching environment while addressing the areas that need additional attention. As a student-centered process, it has created improved learning environments for students and has been well received by faculty. It has also been shown to be effective in addressing the need for improvement while removing the feeling that the faculty member's teaching is being personally attacked. The process that Auburn Online uses is laid out, along with the three-tier maintenance and revision guide that will be used over a three-year implementation plan. This information can help others determine which components of the maintenance and revision plan they want to utilize, and guide them in creating a similar approach. The data will be used to analyze, revise, and improve courses by determining and disseminating best practices that demonstrate an impact on student success. Keywords: data-driven, improvement, online courses, faculty development, analytics, course design
Procedia PDF Downloads 61
22196 The Effectiveness of Teaching Emotional Intelligence on Reducing Marital Conflicts and Marital Adjustment in Married Students of Tehran University
Authors: Elham Jafari
Abstract:
The aim of this study was to evaluate the effectiveness of emotional intelligence training on reducing marital conflict and improving marital adjustment in married students of the University of Tehran. The research is applied in terms of purpose and, in terms of data collection, uses a semi-experimental pre-test/post-test design with a control group and a follow-up test. The statistical population of the study consisted of all married students of the University of Tehran. A sample of 30 married students was selected by the convenience sampling method, of whom 15 were randomly assigned to the experimental group and 15 to the control group. Data were collected through field and library methods; the field instruments were two questionnaires on marital conflict and marital adjustment. To analyze the collected data, the demographic characteristics of the sample were first described at the descriptive level using statistical indicators in SPSS. For inferential statistics, analysis of covariance was used. The results showed that the effect of the independent variable, emotional intelligence, on the reduction of marital conflicts is statistically significant, and it can be inferred that emotional intelligence training reduced the marital conflicts of the experimental group compared to the control group. The effect of emotional intelligence on marital adjustment was also statistically significant, and it can be inferred that emotional intelligence training improved the marital adjustment of the experimental group compared to the control group. Keywords: emotional intelligence, marital conflicts, marital compatibility, married students
Procedia PDF Downloads 251
22195 Secure Network Coding-Based Named Data Network Mutual Anonymity Transfer Protocol
Authors: Tao Feng, Fei Xing, Ye Lu, Jun Li Fang
Abstract:
NDN is a candidate future Internet architecture. Because the NDN design introduces four privacy challenges, many research institutions have begun to study the privacy issues of named data networking (NDN). In this paper, we examine the major privacy issues of NDN in order to investigate privacy protection, and then put forward a more effective anonymous transfer policy for NDN. Firstly, based on mutual anonymity communication in MP2P networks, we propose an NDN mutual anonymity protocol. Secondly, we add an interest packet authentication mechanism to the protocol and encrypt the coding coefficients, which improves the security of the protocol. Finally, we prove the security and anonymity of the proposed anonymous transfer protocol. Keywords: NDN, mutual anonymity, anonymous routing, network coding, authentication mechanism
Procedia PDF Downloads 451