Search results for: data source
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28350

27120 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection to enhance the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data that contribute almost nothing to reducing the confidence interval of the estimated parameters and could thus be neglected. Based on sensitivity analysis, we solve the problem of optimal data space selection and derive specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, we prove a theorem establishing that the integrated data approach is less precise than the full data case; i.e., we show that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
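The full-data versus recovery-curve comparison can be illustrated with a toy Fisher-information calculation; the exponential model, noise level, and grids below are illustrative assumptions, not the paper's actual FRAP physics.

```python
import numpy as np

# Toy FRAP-like model: fluorescence y(r, t) = 1 - exp(-D * t) * g(r),
# with a Gaussian bleach profile g(r). D is the parameter to identify.

def model(D, r, t):
    g = np.exp(-r[None, :] ** 2)               # bleach profile over radius
    return 1.0 - np.exp(-D * t[:, None]) * g

def fisher_info_full(D, r, t, sigma=0.05, h=1e-6):
    # Sensitivities dy/dD at every spatio-temporal point (central differences)
    s = (model(D + h, r, t) - model(D - h, r, t)) / (2 * h)
    return np.sum(s ** 2) / sigma ** 2

def fisher_info_integrated(D, r, t, sigma=0.05, h=1e-6):
    # Recovery curve = spatial average; averaging N pixels shrinks the noise
    # variance to sigma^2 / N, but Cauchy-Schwarz still caps the information.
    N = len(r)
    s = (model(D + h, r, t) - model(D - h, r, t)) / (2 * h)
    S = s.mean(axis=1)                          # sensitivity of the averaged signal
    return np.sum(S ** 2) / (sigma ** 2 / N)

r = np.linspace(0.0, 2.0, 50)
t = np.linspace(0.1, 5.0, 40)
fi_full = fisher_info_full(0.8, r, t)
fi_int = fisher_info_integrated(0.8, r, t)

# Confidence-interval half-width scales as 1 / sqrt(FI): full data is never worse
ci_full, ci_int = 1.96 / np.sqrt(fi_full), 1.96 / np.sqrt(fi_int)
print(ci_full <= ci_int)
```

The inequality holds for any model, since (Σ s)² ≤ N Σ s² term by term in time.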

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 278
27119 Density Determination of Liquid Niobium by Means of Ohmic Pulse-Heating for Critical Point Estimation

Authors: Matthias Leitner, Gernot Pottlacher

Abstract:

Experimental determination of critical point data like critical temperature, critical pressure, critical volume and critical compressibility of high-melting metals such as niobium is very rare due to the outstanding experimental difficulties in reaching the necessary extreme temperature and pressure regimes. Experimental techniques to achieve such extreme conditions could be diamond anvil devices, two-stage gas guns or metal samples hit by explosively accelerated flyers. Electrical pulse-heating under increased pressures would be another choice. This technique heats thin wire samples of 0.5 mm diameter and 40 mm length from room temperature to melting and then further to the end of the stable phase, the spinodal line, within several microseconds. When crossing the spinodal line, the sample explodes and reaches the gaseous phase. In our laboratory, pulse-heating experiments can be performed under variation of the ambient pressure from 1 to 5000 bar and allow a direct determination of critical point data for low-melting, but not for high-melting metals. However, the critical point can also be estimated by extrapolating the liquid phase density according to theoretical models. A reasonable prerequisite for the extrapolation is the existence of data that cover as much of the liquid phase as possible and at the same time exhibit small uncertainties. Ohmic pulse-heating was therefore applied to determine thermal volume expansion, and from that the density of niobium over the entire liquid phase. As a first step, experiments under ambient pressure were performed; the second step will be to perform experiments under high-pressure conditions. During the heating process, shadow images of the expanding sample wire were captured at a frame rate of 4 × 10⁵ fps to monitor the radial expansion as a function of time. Simultaneously, the sample radiance was measured with a pyrometer operating at a mean effective wavelength of 652 nm. To increase the accuracy of the temperature deduction, the spectral emittance in the liquid phase is also taken into account. Due to the high heating rates of about 2 × 10⁸ K/s, longitudinal expansion of the wire is inhibited, which implies an increased radial expansion. As a consequence, measuring the temperature-dependent radial expansion is sufficient to deduce density as a function of temperature. This is accomplished by evaluating the full widths at half maximum of the cup-shaped intensity profiles calculated from each shadow image of the expanding wire. Relating these diameters to the diameter obtained before the start of pulse-heating, the temperature-dependent volume expansion is calculated. With the help of the known room-temperature density, the volume expansion is then converted into density data. The so-obtained liquid density behavior is compared to existing literature data and provides another independent source of experimental data. In this work, the newly determined off-critical liquid phase density was, in a second step, utilized as input data for the estimation of niobium’s critical point. The approach used heuristically takes into account the crossover from mean field to Ising behavior, as well as the non-linearity of the phase diagram’s diameter.
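The FWHM-to-density step can be sketched numerically as follows; the Gaussian shadow profiles and wire widths are invented for illustration (only the room-temperature density of niobium, 8570 kg/m³, is a real value), and since longitudinal expansion is inhibited, density scales with the inverse square of the diameter ratio.

```python
import numpy as np

def fwhm(x, profile):
    # Full width at half maximum of a cup-shaped (dip) intensity profile
    dip, top = profile.min(), profile.max()
    half = (top + dip) / 2.0
    below = x[profile < half]
    return below[-1] - below[0]

x = np.linspace(-2.0, 2.0, 2001)                    # position across wire, mm
d_room = fwhm(x, 1.0 - np.exp(-(x / 0.25) ** 2))    # cold-wire shadow (assumed width)
d_hot = fwhm(x, 1.0 - np.exp(-(x / 0.30) ** 2))     # expanded-wire shadow (assumed)

# Radial expansion only: volume ~ d^2, so rho(T) = rho_room * (d_room / d(T))^2
rho_room = 8570.0                                   # kg/m^3, niobium at room temperature
rho_hot = rho_room * (d_room / d_hot) ** 2
print(f"{rho_hot:.0f} kg/m^3")
```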

Keywords: critical point data, density, liquid metals, niobium, ohmic pulse-heating, volume expansion

Procedia PDF Downloads 219
27118 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management

Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang

Abstract:

Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the magnitude and complexity of this data. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.

Keywords: construction supply chain management, BIM, data exchange, artificial intelligence

Procedia PDF Downloads 26
27117 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is widely regarded as an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper reviews the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertain data condition while minimizing the loss of compression properties.

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 428
27116 Municipal Solid Waste Management Using Life Cycle Assessment Approach: Case Study of Maku City, Iran

Authors: L. Heidari, M. Jalili Ghazizade

Abstract:

This paper aims to determine the best environmental and economic scenario for municipal solid waste (MSW) management in Maku city by using the Life Cycle Assessment (LCA) approach. The functional elements of this study are the collection, transportation, and disposal of MSW in Maku city. Waste composition and density, two key parameters of MSW, were determined by field sampling, and then other important specifications of the MSW, such as its chemical formula, thermal energy and water content, were calculated. These data, alongside other information related to collection and disposal facilities, are used as a reliable source for assessing the environmental impacts of different waste management options, including landfill, composting, recycling and energy recovery. The environmental impact of the MSW management options was investigated for 15 different scenarios with the Integrated Waste Management (IWM) software. The photochemical smog, greenhouse gases, acid gases, toxic emissions, and energy consumption of each scenario were measured, and the environmental indices of each scenario were then specified by weighting these parameters. The economic costs of the scenarios were also compared with each other based on the literature. As a final result, since organic materials make up more than 80% of the waste, composting is a suitable method. Although the major part of the remaining 20% of the waste can be recycled, due to the high cost of the necessary equipment, the landfill option is suggested for it. Therefore, the scenario with 80% composting and 20% landfilling is selected as the superior environmental and economic scenario. This study shows that, to select a scenario with practical applications, the environmental and economic aspects of the different scenarios must be considered simultaneously.
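The scenario-weighting step can be sketched as below; the burden figures, the three scenarios, and the weights are invented placeholders, not the IWM outputs for Maku.

```python
# Hypothetical scenario scoring in the spirit of the IWM comparison: each
# scenario carries emission/energy burdens that are normalized against the
# worst scenario per category, weighted, and summed into one index (lower = better).
scenarios = {
    "100% landfill":              {"ghg": 950, "acid": 4.1, "smog": 2.0, "energy": 120},
    "80% compost / 20% landfill": {"ghg": 310, "acid": 1.9, "smog": 0.9, "energy": 60},
    "50% recycle / 50% landfill": {"ghg": 700, "acid": 2.4, "smog": 1.3, "energy": -40},
}
weights = {"ghg": 0.4, "acid": 0.2, "smog": 0.2, "energy": 0.2}  # assumed weights

def index(burdens):
    # Normalize each burden by the largest magnitude seen across scenarios
    worst = {k: max(abs(s[k]) for s in scenarios.values()) for k in weights}
    return sum(weights[k] * burdens[k] / worst[k] for k in weights)

best = min(scenarios, key=lambda name: index(scenarios[name]))
print(best)
```

A negative energy burden (net recovery) lowers a scenario's index, which is how energy-recovery options earn credit in this kind of comparison.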

Keywords: IWM software, life cycle assessment, Maku, municipal solid waste management

Procedia PDF Downloads 238
27115 A Method to Evaluate and Compare Web Information Extractors

Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman

Abstract:

Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data that is gathered from the Web. Sometimes, these data are relatively easy to gather, chiefly when they come from server logs. Unfortunately, there are cases in which the data to be mined is the data displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components that are typically configured by means of rules tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires evaluating and comparing the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4,200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which makes it difficult to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents. b) It provides a method that relies on statistically sound tests to support the conclusions drawn; the previous work does not provide clear guidelines or recommend statistically sound tests, but rather surveys the many features to take into account as well as related work. c) We provide a novel method to compute the performance measures for unsupervised proposals; otherwise, these would require the intervention of a user to compute them by using the annotations on the evaluation sets and the information extracted. Our contributions will help researchers in this area make sure that they have advanced the state of the art not only conceptually, but also from an empirical point of view; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas, so that we can spread them to help improve the evaluation of information extraction proposals and gather valuable feedback from other researchers.
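A minimal sketch of the kind of evaluation pipeline argued for here: per-document precision/recall/F1 for two hypothetical extractors, compared with a statistically grounded paired sign test. The toy documents and record fields are assumptions; a real comparison would use larger corpora and stronger tests such as Wilcoxon's signed-rank test.

```python
from math import comb

def prf(extracted, gold):
    # Precision, recall, F1 over sets of extracted field values for one document
    extracted, gold = set(extracted), set(gold)
    tp = len(extracted & gold)
    p = tp / len(extracted) if extracted else 0.0
    r = tp / len(gold) if gold else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

def sign_test(f1_a, f1_b):
    # Two-sided paired sign test: under H0, wins follow Binomial(n, 0.5)
    wins_a = sum(a > b for a, b in zip(f1_a, f1_b))
    n = sum(a != b for a, b in zip(f1_a, f1_b))
    k = max(wins_a, n - wins_a)
    p_value = 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(p_value, 1.0)

gold = [{"name:Acme", "price:10"}, {"name:Beta", "price:20"}]
ext_a = [{"name:Acme", "price:10"}, {"name:Beta"}]       # extractor A's output
ext_b = [{"name:Acme"}, {"price:99"}]                    # extractor B's output
f1_a = [prf(e, g)[2] for e, g in zip(ext_a, gold)]
f1_b = [prf(e, g)[2] for e, g in zip(ext_b, gold)]
print(sign_test(f1_a, f1_b))
```

With only two documents the p-value cannot be significant, which is exactly why homogeneous, sufficiently large evaluation sets matter.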

Keywords: web information extractors, information extraction evaluation method, Google Scholar, web

Procedia PDF Downloads 248
27114 Computer Self-Efficacy, Study Behaviour and Use of Electronic Information Resources in Selected Polytechnics in Ogun State, Nigeria

Authors: Fredrick Olatunji Ajegbomogun, Bello Modinat Morenikeji, Okorie Nancy Chituru

Abstract:

Electronic information resources are highly relevant to students' academic and research needs but are grossly under-utilized, despite the institutional commitment to making them available. This under-utilization could be attributed to a low level of study behaviour coupled with a low level of computer self-efficacy. This study assessed computer self-efficacy, study behaviour, and the use of electronic information resources by students in selected polytechnics in Ogun State. A simple random sampling technique using Krejcie and Morgan's (1970) table was used to select 370 respondents for the study. A structured questionnaire was used to collect data on the respondents. Data were analysed using frequency counts, percentages, means, standard deviations, Pearson Product Moment Correlation (PPMC) and multiple regression analysis. Results reveal that the internet (mean = 1.94), YouTube (mean = 1.74), and search engines (mean = 1.72) were the information resources commonly available to the students, while the internet (mean = 4.22) was the most utilized resource. The major reasons for using electronic information resources were to source materials and information (mean = 3.30), for research (mean = 3.25), and to augment class notes (mean = 2.90). The majority (91.0%) of the respondents have a high level of computer self-efficacy in the use of electronic information resources, shown through selecting from screen menus (mean = 3.12), using data files (mean = 3.10), and efficient use of computers (mean = 3.06). Good preparation for tests (mean = 3.27) and examinations (mean = 3.26), and organization of tutorials (mean = 3.11) are the common study behaviours of the respondents; overall, 93.8% have good study behaviour. Inadequate computer facilities for accessing information (mean = 3.23) and poor internet access (mean = 2.87) were the major challenges confronting students' use of electronic information resources. According to the PPMC results, study behaviour (r = 0.280) and computer self-efficacy (r = 0.304) have significant (p < 0.05) relationships with the use of electronic information resources. The regression results reveal that self-efficacy (β = 0.214) and study behaviour (β = 0.122) positively (p < 0.05) influenced students' use of electronic information resources. The study concluded that students' use of electronic information resources depends on its purpose, their computer self-efficacy, and their study behaviour. The study therefore recommended that management encourage students to improve their study habits and computer skills, as this will enhance their continuous and more effective utilization of electronic information resources.
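The PPMC statistic behind the reported r values can be computed as follows; the six paired scores are fabricated for illustration and are not the survey data.

```python
def pearson_r(xs, ys):
    # Pearson Product Moment Correlation: covariance over product of spreads
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired ratings: study-behaviour score vs. e-resource use score
study_behaviour = [3.1, 2.8, 3.4, 2.2, 3.0, 2.6]
resource_use = [4.0, 3.5, 4.3, 2.9, 3.8, 3.2]
print(round(pearson_r(study_behaviour, resource_use), 3))
```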

Keywords: computer self-efficacy, study behaviour, electronic information resources, polytechnics, Nigeria

Procedia PDF Downloads 120
27113 Data Mining as a Tool for Knowledge Management: A Review

Authors: Maram Saleh

Abstract:

Knowledge has become an essential resource in today’s economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through all of the knowledge management stages: knowledge creation, knowledge storage, knowledge sharing and knowledge use. Research on data mining has continued to grow in recent years in both the business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management, and then focus on introducing some applications of data mining techniques in knowledge management for some real-life domains.
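As one concrete instance of the knowledge-discovery step such reviews survey, a frequent-itemset pass over transaction data turns raw records into shareable knowledge; the transactions and support threshold below are toy assumptions.

```python
from itertools import combinations

# Toy usage log: which features each "session" touched
transactions = [
    {"report", "dashboard"},
    {"report", "dashboard", "export"},
    {"report", "export"},
    {"dashboard", "export"},
    {"report", "dashboard"},
]

def frequent_itemsets(transactions, min_support=0.6, max_size=2):
    # Brute-force Apriori-style pass: keep itemsets whose support meets the threshold
    items = sorted({i for t in transactions for i in t})
    result = {}
    for size in range(1, max_size + 1):
        for combo in combinations(items, size):
            support = sum(set(combo) <= t for t in transactions) / len(transactions)
            if support >= min_support:
                result[combo] = support
    return result

print(frequent_itemsets(transactions))
```

Patterns like "dashboard and report co-occur in 60% of sessions" are the kind of extracted, previously implicit knowledge that then feeds the storage and sharing stages.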

Keywords: data mining, knowledge management, knowledge discovery, knowledge creation

Procedia PDF Downloads 208
27112 The 2017 Summer Campaign for Night Sky Brightness Measurements on the Tuscan Coast

Authors: Andrea Giacomelli, Luciano Massetti, Elena Maggi, Antonio Raschi

Abstract:

The presentation will report on the activities managed during the summer of 2017 by a team composed of staff from a university department, a National Research Council institute, and an outreach NGO, collecting measurements of night sky brightness and other information on artificial lighting in order to characterize light pollution issues on portions of the Tuscan coast, in Central Italy. These activities combine measurements collected by the principal scientists, citizen science observations led by students, and outreach events targeting a broad audience. The campaign aggregates the efforts of three actors: the BuioMetria Partecipativa project, which started collecting light pollution data on a national scale in 2008 with an environmental engineering and free/open source GIS core team; the Institute of Biometeorology of the National Research Council, with ongoing studies on light and urban vegetation and a consolidated track record in environmental education and citizen science; and the Department of Biology of the University of Pisa, which started experiments to assess the impact of light pollution on coastal environments in 2015. While the core of the activities concerns in situ data, the campaign will also account for remote sensing data, thus considering heterogeneous data sources. The aim of the campaign is twofold: (1) to test actions of citizen and student engagement in monitoring sky brightness; and (2) to collect night sky brightness data and test a protocol for applications to studies on the ecological impact of light pollution, with a special focus on marine coastal ecosystems. The collaboration of an interdisciplinary team in the study of artificial lighting issues is not common in Italy, and undertaking the campaign in Tuscany has the added value of operating in one of the territories where it is possible to observe both sites with extremely high lighting levels and areas with extremely low light pollution, especially in the southern part of the region. By combining environmental monitoring and communication actions, the campaign will contribute to the promotion of good-quality night skies as an important asset for the sustainability of coastal ecosystems, as well as to increasing citizen awareness through star gazing, night photography and active participation in the field measurements.

Keywords: citizen science, light pollution, marine coastal biodiversity, environmental education

Procedia PDF Downloads 173
27111 LIZTOXD: Inclusive Lizard Toxin Database by Using MySQL Protocol

Authors: Iftikhar A. Tayubi, Tabrej Khan, Mansoor M. Alsubei, Fahad A. Alsaferi

Abstract:

LIZTOXD provides a single source of high-quality information about proteinaceous lizard toxins that will be an invaluable resource for pharmacologists, neuroscientists, toxicologists, medicinal chemists, ion channel scientists, clinicians, and structural biologists. We provide an intuitive, well-organized and user-friendly web interface that allows users to explore detailed information about lizards and their toxin proteins, including common name, scientific name, entry id, entry name, protein name, and length of the protein sequence. The utility of this database is that it offers a user-friendly interface for retrieving information about the lizards, toxins, and toxin proteins of different lizard species, satisfying the demands of the scientific community by providing in-depth knowledge about lizards and their toxins. In the next phase of the project, the database will be implemented using MySQL and the Hypertext Preprocessor (PHP), with SmartDraw used for the design. A database is a powerful tool for storing large quantities of data efficiently, and users can navigate from one section to another depending on their field of interest. The database contains a wealth of information on species, toxins, and clinical data. The combination of specific classification schemes and a rich user interface allows researchers to easily locate and view information on the sequence, structure, and biological activity of these toxins. This manually curated database will be a valuable resource both for basic researchers and for those interested in potential pharmaceutical and agricultural applications of lizard toxins.
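A sketch of the kind of relational schema the abstract describes, using Python's sqlite3 as a stand-in for MySQL; the table names, columns, and sample rows are assumptions for illustration, not the actual LIZTOXD schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE lizard (
    lizard_id       INTEGER PRIMARY KEY,
    common_name     TEXT NOT NULL,
    scientific_name TEXT NOT NULL
);
CREATE TABLE toxin_protein (
    entry_id     TEXT PRIMARY KEY,   -- a UniProt-style accession (illustrative)
    entry_name   TEXT,
    protein_name TEXT,
    seq_length   INTEGER,            -- length of the protein sequence
    lizard_id    INTEGER REFERENCES lizard(lizard_id)
);
""")
conn.execute("INSERT INTO lizard VALUES (1, 'Gila monster', 'Heloderma suspectum')")
conn.execute(
    "INSERT INTO toxin_protein VALUES ('X00001', 'EXE4_DEMO', 'Exendin-4', 39, 1)"
)

# The web interface's lookup reduces to a join like this one
row = conn.execute("""
    SELECT l.scientific_name, p.protein_name, p.seq_length
    FROM toxin_protein p JOIN lizard l USING (lizard_id)
""").fetchone()
print(row)
```

In a MySQL deployment the same schema would carry over nearly verbatim, with PHP issuing equivalent queries.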

Keywords: LIZTOXD, MySQL, PHP, smart draw

Procedia PDF Downloads 162
27110 Developing a GIS-Based Tool for the Management of Fats, Oils, and Grease (FOG): A Case Study of Thames Water Wastewater Catchment

Authors: Thomas D. Collin, Rachel Cunningham, Bruce Jefferson, Raffaella Villa

Abstract:

Fats, oils and grease (FOG) are by-products of food preparation and cooking processes. FOG enters wastewater systems through a variety of sources such as households, food service establishments, and industrial food facilities. Over time, if no source control is in place, FOG builds up on pipe walls, leading to blockages, and potentially to sewer overflows, which are a major risk to the environment and human health. UK water utilities spend millions of pounds annually trying to control FOG. Although UK legislation specifies that discharge of such material is against the law, it is often difficult for water companies to identify and prosecute offenders, which leads to uncertainty about the approach to take in terms of FOG management. Research is needed to realize the full potential of current practices. The aim of this research was to undertake a comprehensive study to document the extent of FOG problems in sewer lines and reinforce existing knowledge. Data were collected to develop a model estimating the quantities of FOG available for recovery within Thames Water wastewater catchments, and Geographical Information System (GIS) software was used to integrate the data with a geographical component. FOG was responsible for at least one third of sewer blockages in the Thames Water waste area. A waste-based approach was developed through an extensive review to estimate the potential for FOG collection and recovery. Three main sources were identified: residential, commercial and industrial, with commercial properties among the major FOG producers. The total potential FOG generated was estimated for the 354 wastewater catchments. Additionally, raw and settled sewage were sampled and analysed for FOG (as hexane extractable material) monthly at 20 sewage treatment works (STW) for three years. A good correlation was found between the sampled FOG and population equivalent (PE). On average, a difference of 43.03% was found between the estimated FOG (waste-based approach) and the sampled FOG (raw sewage sampling). It was suggested that the approach undertaken could overestimate the FOG available, that the sampling could only capture a fraction of the FOG arriving at the STW, and/or that the difference could account for FOG accumulating in sewer lines. Furthermore, it was estimated that on average FOG could contribute up to 12.99% of the primary sludge removed. The model was further used to investigate the relationship between estimated FOG and the number of blockages: the higher the FOG potential, the higher the number of FOG-related blockages. The GIS-based tool was used to identify critical areas, i.e. those with high FOG potential and a high number of FOG blockages. As reported in the literature, FOG was one of the main causes of sewer blockages. By identifying critical areas, the model further explored the potential for source control in terms of ‘sewer relief’ and waste recovery, helping to target where the benefits of implementing management strategies could be highest. However, FOG is still likely to persist throughout the networks, and further research is needed to assess downstream impacts (i.e. at the STW).
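The waste-based estimation and its comparison against sampled loads can be sketched as follows; the per-source FOG factors and catchment figures are invented placeholders, not Thames Water data.

```python
# Assumed per-source generation factors (tonnes of FOG per year)
fse_factor = 0.9      # per food-service establishment
res_factor = 0.003    # per resident

# Hypothetical catchments: counts, population, and sampled FOG load (tonnes/yr)
catchments = {
    "A": {"fse": 420, "population": 180_000, "sampled_fog": 640.0},
    "B": {"fse": 90, "population": 45_000, "sampled_fog": 150.0},
}

results = {}
for name, c in catchments.items():
    # Waste-based estimate: sum of source counts times their generation factors
    estimated = c["fse"] * fse_factor + c["population"] * res_factor
    # Gap between the estimate and what raw-sewage sampling captured
    diff_pct = 100.0 * (estimated - c["sampled_fog"]) / estimated
    results[name] = (estimated, diff_pct)
    print(name, round(estimated, 1), f"{diff_pct:.1f}%")
```

A persistent positive gap, as the paper found, is consistent with FOG accumulating in the network before it ever reaches the treatment works.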

Keywords: fat, FOG, GIS, grease, oil, sewer blockages, sewer networks

Procedia PDF Downloads 209
27109 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s, and over time a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies: it is the process of grouping data into clusters such that the members of a cluster are as similar as possible to each other while the clusters themselves are as dissimilar as possible. Many traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes; to detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-Modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of fuzzy clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets using the FKM clustering algorithm. As a significant feature of the study, the FKM clustering algorithm determines anomalies together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM clustering algorithm showed good performance in detecting anomalies in data containing both a single anomaly and more than one anomaly.
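A compact sketch of the fuzzy k-modes idea on categorical records, with abnormality scored as 1 minus the best membership degree; the toy data, fuzzifier m = 2, and deterministic initialization are illustrative choices, not the paper's simulation setup.

```python
from collections import Counter

def dissim(a, b):
    # Simple matching dissimilarity for categorical tuples
    return sum(x != y for x, y in zip(a, b))

def memberships(point, modes, m=2.0):
    d = [dissim(point, mode) for mode in modes]
    if 0 in d:                       # point coincides with a mode
        return [1.0 if di == 0 else 0.0 for di in d]
    return [1.0 / sum((d[j] / d[l]) ** (1.0 / (m - 1.0)) for l in range(len(modes)))
            for j in range(len(modes))]

def fuzzy_k_modes(data, k=2, iters=10, m=2.0):
    modes = [data[(i * len(data)) // k] for i in range(k)]  # deterministic init
    for _ in range(iters):
        U = [memberships(p, modes, m) for p in data]
        new_modes = []
        for j in range(k):
            mode = []
            for attr in range(len(data[0])):
                # Update each mode attribute-wise via membership-weighted frequencies
                weights = Counter()
                for p, u in zip(data, U):
                    weights[p[attr]] += u[j] ** m
                mode.append(weights.most_common(1)[0][0])
            new_modes.append(tuple(mode))
        modes = new_modes
    return modes, [memberships(p, modes, m) for p in data]

# Two tight categorical clusters plus one odd record at the end
data = [("a", "x", "low")] * 5 + [("b", "y", "high")] * 5 + [("c", "z", "mid")]
modes, U = fuzzy_k_modes(data, k=2)
abnormality = [1.0 - max(u) for u in U]
print(abnormality[-1] > max(abnormality[:-1]))  # the odd record scores highest
```

The graded abnormality score is the point of using the fuzzy variant: crisp k-modes would simply force the odd record into one cluster with no degree attached.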

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 53
27108 Microgravity, Hydrological and Metrological Monitoring of Shallow Ground Water Aquifer in Al-Ain, UAE

Authors: Serin Darwish, Hakim Saibi, Amir Gabr

Abstract:

The United Arab Emirates (UAE) is situated within an arid zone where the climate is arid and the recharge of the groundwater is very low. Groundwater is the primary source of water in the UAE. However, rapid expansion, population growth, agriculture, and industrial activities have negatively affected these limited water resources, and the shortage of water resources has become a serious concern due to the over-pumping of groundwater to meet demand. In addition to the groundwater deficit, the UAE has one of the highest per capita water consumption rates in the world. In this study, a combination of time-lapse measurements of microgravity and of the depth to the groundwater level in selected wells in Al-Ain city was used to estimate the variations in groundwater storage. Al-Ain is the second largest city in the Emirate of Abu Dhabi and the third largest city in the UAE, and the groundwater in this region has been overexploited. Relative gravity measurements were acquired using the Scintrex CG-6 Autograv. This latest-generation gravimeter from Scintrex Ltd provides fast, precise gravity measurements with automated corrections for temperature, tide and instrument tilt and rejection of data noise, and has a resolution of 0.1 μGal. The purpose of this study is to measure groundwater storage changes in the shallow aquifers based on the application of the microgravity method. The gravity method is a non-destructive technique that allows data collection at almost any location over the aquifer. Preliminary results indicate a possible relationship between microgravity and water levels, but more work needs to be done to confirm this. The results will help to establish the relationship between monthly microgravity changes and the hydrological and hydrogeological changes of the shallow phreatic aquifer. The study will be useful for water management considerations and additional future investigations.

Keywords: Al-Ain, arid region, groundwater, microgravity

Procedia PDF Downloads 152
27107 Reduce, Reuse and Recycle: Grand Challenges in Construction Recovery Process

Authors: Abioye A. Oyenuga, Rao Bhamidiarri

Abstract:

Running a successful construction and demolition waste (C&DW) recycling operation anywhere in the world is a challenge today, predominantly because secondary materials markets are yet to be integrated. Reducing, reusing and recycling of C&DW have been employed over the years, and various techniques have been investigated; however, the economic and environmental viability of their application seems limited. This paper discusses the costs and benefits of using secondary materials and focuses on investigating the reuse and recycling process for five major types of construction materials: concrete, metal, wood, cardboard/paper, and plasterboard. Data obtained from demolition specialists and contractors are considered and evaluated. From this data source, the research found that the construction material recovery process fully incorporates the 3R principles, and it shows how energy recovery by means of the 3Rs can be evaluated. This scrutiny leads to the identification of grand challenges in the construction material recovery process. Recommendations to deepen the material recovery process are also discussed.

Keywords: construction and demolition waste (C&DW), 3R concept, recycling, reuse, waste management, UK

Procedia PDF Downloads 428
27106 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud: computing without decrypting the encrypted data. This meets the aspiration for a computational encryption model that can enhance the security of big data with respect to privacy or confidentiality, availability and integrity of the data, and the user’s security. The cryptographic model applied for computation on the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of the high-level computational processes, based on number theory derivable from abstract algebra, that can be integrated and leveraged in the Cloud computing interface, together with a detailed mathematical treatment of fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on cryptographically secure algorithms.
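Fully homomorphic schemes are heavyweight, but the core idea of computing on ciphertexts can be shown with the additively homomorphic Paillier scheme, a common stepping stone toward FHE; the primes below are toy-sized and the whole sketch is for intuition only (a real FHE scheme additionally supports multiplication and bootstrapping, which this does not).

```python
import math
import random

p, q = 293, 433                     # toy primes; far too small for real security
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)        # Carmichael's lambda(n)
mu = pow(lam, -1, n)                # valid decryption helper because g = n + 1

_rng = random.Random(42)

def encrypt(m):
    r = _rng.randrange(1, n)
    while math.gcd(r, n) != 1:      # r must be a unit modulo n
        r = _rng.randrange(1, n)
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    # L(c^lam mod n^2) * mu mod n, with L(x) = (x - 1) // n
    return (pow(c, lam, n2) - 1) // n * mu % n

c1, c2 = encrypt(7), encrypt(35)
c_sum = c1 * c2 % n2                # multiplying ciphertexts adds the plaintexts
print(decrypt(c_sum))               # 42
```

The cloud can thus aggregate encrypted values without ever seeing them; fully homomorphic schemes extend this to arbitrary circuits at the cost of noise management via bootstrapping.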

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 480
27105 Parallel Tracking and Mapping of a Fleet of Quad-Rotor

Authors: M. Bazin, I. Bouguir, D. Combe, V. Germain, G. Lassade

Abstract:

The problem of managing a fleet of quad-rotor drones in a completely unknown environment is analyzed in the present paper. This work follows in the footsteps of other studies on how the movements of a swarm of elements that have to stay gathered throughout their activities should be managed. In this paper, we aim to demonstrate the limitations of a system in which absolutely all the calculations and physical movements of the elements are handled by a single external element. The control strategy is an adaptive approach that takes the explored environment into account. This is made possible by a set of command rules that can guide the drones through various missions with a defined goal. The result of the mission is independent of the nature of the environment and of the number of drones in the fleet. The strategy is based on the simultaneous use of different data: obstacle positions, real-time positions of all drones, and relative positions between the different drones. The present work is built with the Robot Operating System and uses several open-source projects on localization and the use of drones.
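A hypothetical sketch of this kind of strategy (not the authors' actual command rules) is the fragment below: a cohesion term computed from relative positions keeps a small fleet gathered, while a repulsion term steers each drone away from known obstacle positions. Gains, radii, and coordinates are invented for illustration.

```python
import numpy as np

def step(positions, obstacles, dt=0.1, k_coh=0.5, k_rep=2.0, r_safe=1.5):
    # cohesion: steer toward the fleet centroid (keeps the swarm gathered)
    velocity = k_coh * (positions.mean(axis=0) - positions)
    # repulsion: steer away from obstacles closer than r_safe
    for obstacle in obstacles:
        diff = positions - obstacle
        dist = np.linalg.norm(diff, axis=1, keepdims=True)
        velocity += np.where(dist < r_safe, k_rep * diff / (dist ** 2 + 1e-9), 0.0)
    return positions + dt * velocity

pos = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])   # three drones
obs = [np.array([2.0, 1.0])]                            # one known obstacle
for _ in range(50):
    pos = step(pos, obs)
```

The update rule depends only on obstacle positions and relative drone positions, so it scales with the number of drones without a central planner doing all the work.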

Keywords: cooperative guidance, distributed control, unmanned aerial vehicle, obstacle avoidance

Procedia PDF Downloads 304
27104 Comparison of Hydrogen and Electrification Perspectives in Decarbonizing the Transport Sector

Authors: Matteo Nicoli, Gianvito Colucci, Valeria Di Cosmo, Daniele Lerede, Laura Savoldi

Abstract:

The transport sector is currently responsible for approximately one-third of greenhouse gas emissions in Europe. In the wider context of achieving carbon neutrality of the global energy system, different alternatives are available to decarbonize the transport sector. In particular, while electricity is already the most consumed energy commodity in rail transport, battery electric vehicles are one of the zero-emission options on the market for road transportation. On the other hand, hydrogen-based fuel cell vehicles are available for road and non-road vehicles. The European Commission is strongly pushing toward the integration of hydrogen in the energy systems of European countries and its widespread adoption as an energy vector to achieve the Green Deal targets. Furthermore, the Italian government is defining hydrogen-related objectives in a dedicated Hydrogen Strategy. The adoption of energy system optimization models to study the possible penetration of alternative zero-emission transport technologies provides the opportunity to analyze the effects that the development of innovative technologies has on the entire energy system, including the supply side devoted to the production of energy carriers such as hydrogen and electricity. Using an open-source modeling framework such as TEMOA, this work compares the role of hydrogen and electric vehicles in the decarbonization of the transport sector. The analysis investigates the advantages and disadvantages of the two options from the economic point of view (the costs associated with each) and the environmental one (the emission reduction prospects). Moreover, the profitability of investments in hydrogen and electric vehicles is analyzed.
The study investigates the evolution of energy consumption and greenhouse gas emissions in different transportation modes (road, rail, navigation, and aviation) through a detailed analysis of the full range of vehicles included in the techno-economic database used in the TEMOA model instance adopted for this work. The transparency of the analysis is guaranteed by the accessibility of the TEMOA models, which are based on open-access source code and databases.
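A toy version of the least-cost choice an energy system optimization model makes can be sketched as a linear program. All numbers below (per-km costs, demand, and carrier supply limits) are invented for illustration and are not from the TEMOA database.

```python
from scipy.optimize import linprog

# decision variables: vehicle-km served by battery-electric (BEV)
# and fuel-cell (FCEV) vehicles; all figures are hypothetical
costs = [0.04, 0.07]        # cost per vehicle-km for BEV, FCEV
A_ub = [[1, 0], [0, 1]]     # grid and hydrogen supply caps
b_ub = [80, 30]             # max vehicle-km each carrier can serve
A_eq = [[1, 1]]             # total transport demand must be met
b_eq = [100]

res = linprog(costs, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print(res.x, res.fun)       # cheapest BEV/FCEV mix that satisfies demand
```

Here the solver fills demand with the cheaper carrier up to its cap (80 BEV vehicle-km) and covers the remainder with the other (20 FCEV vehicle-km); real models add many more technologies, time periods, and emission constraints.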

Keywords: battery electric vehicles, decarbonization, energy system optimization models, fuel cell vehicles, hydrogen, open-source modeling, TEMOA, transport

Procedia PDF Downloads 112
27103 In vitro Antioxidant Scavenging of Root Fraction of Bryonia dioica

Authors: Yamani Amal, Lazaae Jamila, Elachouri Mostafa

Abstract:

Plants and their active agents, especially polyphenols, may play a principal role in the treatment of diseases that result from defects in physiological antioxidant mechanisms. Bryonia dioica is well known in Moroccan traditional medicine for alleviating pain and treating many diseases. We have focused on plants belonging to the Cucurbitaceae family from around the world to understand their therapeutic uses and their potential antioxidant activities. Although several biological activities and the chemical composition of Bryonia dioica are well characterized, no direct in vitro study of this natural product has examined the antioxidant effect of the extract from its roots. The aim of this study was to determine the in vitro antioxidant activity of the B. dioica root, using antioxidant analysis methods based on the determination of hydroxyl radical scavenging, 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging, hydrogen peroxide scavenging, and nitric oxide scavenging. It was demonstrated that B. dioica root extract shows excellent antioxidant properties. This investigation showed that the roots of this plant contain potent natural scavengers. They may represent an interesting source of antioxidant phenolics that may favour the extension of their cultivation as a new source of natural antioxidants, in addition to containing high-quality proteins for human or animal nutrition. Therefore, all stakeholders in Morocco should strive to take advantage of the country's enormous biodiversity resources to free its people from disease, abject poverty, and stagnation.

Keywords: Morocco, Bryonia dioica, in vitro, antioxidant

Procedia PDF Downloads 384
27102 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

This research aims to approximate the amount of daily rainfall using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
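The pixel value approach can be sketched as a lookup from map colours to rainfall amounts, scored against gauge observations with RMSE. The palette and gauge values below are hypothetical; the real mapping would come from the legend of the meteorological department's rainfall maps.

```python
import numpy as np

# hypothetical colour-to-rainfall lookup (mm/day)
palette = {(0, 0, 255): 5.0, (0, 255, 0): 20.0,
           (255, 255, 0): 50.0, (255, 0, 0): 90.0}

def pixel_rainfall(pixel):
    # assign the rainfall of the nearest palette colour in RGB space
    key = min(palette, key=lambda c: sum((a - b) ** 2 for a, b in zip(c, pixel)))
    return palette[key]

estimated = np.array([pixel_rainfall(p) for p in [(10, 10, 240), (250, 5, 5)]])
observed = np.array([7.0, 85.0])              # invented gauge readings
rmse = np.sqrt(np.mean((estimated - observed) ** 2))
print(round(rmse, 3))
```

In practice the map image would be read pixel by pixel, each pixel classified against the legend, and the resulting field compared with station data exactly as the scalar RMSE above is computed.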

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 387
27101 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks built with the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions such as IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
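The cryptographic-hashing idea behind the data lineage feature can be illustrated with a minimal hash chain (a sketch, not the platform's actual ledger format): each record's digest commits to its content and to its parent's digest, so tampering anywhere upstream invalidates every downstream digest.

```python
import hashlib
import json

def lineage_hash(record, parent_hash):
    # digest commits to both the record content and the parent digest
    payload = json.dumps(record, sort_keys=True).encode() + parent_hash.encode()
    return hashlib.sha256(payload).hexdigest()

records = [{"op": "ingest", "src": "clinic-a"}, {"op": "transform", "fn": "anonymize"}]
chain, parent = [], "0" * 64            # genesis digest
for rec in records:
    parent = lineage_hash(rec, parent)
    chain.append(parent)

def verify(records, chain):
    # recompute the chain and compare digest by digest
    parent = "0" * 64
    for rec, digest in zip(records, chain):
        parent = lineage_hash(rec, parent)
        if parent != digest:
            return False
    return True

print(verify(records, chain))                                             # True
print(verify([{"op": "ingest", "src": "clinic-b"}, records[1]], chain))   # False
```

On the platform described above, the digests would be anchored on the ledger while the bulk data lives in IPFS/Filecoin, so lineage can be audited without trusting any single storage node.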

Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 27
27100 Communicating Meaning through Translanguaging: The Case of Multilingual Interactions of Algerians on Facebook

Authors: F. Abdelhamid

Abstract:

Algeria is a multilingual speech community where individuals constantly mix between codes in spoken discourse. Code is used as a cover term to refer to the existing languages and language varieties, which include, among others, the mother tongue of the majority, Algerian Arabic; the official language, Modern Standard Arabic; and the foreign languages French and English. The present study explores whether Algerians mix between these codes in online communication as well. Facebook is the platform from which data is collected because it is the preferred and most used social media site among Algerians. Adopting the notion of translanguaging, this study attempts to explain how Facebook users deploy multilingual messages to communicate meaning. Accordingly, multilingual interactions are approached not from a pejorative perspective but as a creative linguistic behavior that multilinguals use to achieve intended meanings. The study is intended as a contribution to research on multilingualism online because, although an extensive literature has investigated multilingualism in spoken discourse, limited research has investigated it online. Its aim is two-fold. First, it aims to ensure that the selected platform, Facebook, can serve as a source of multilingual data for qualitative analysis; this is done by measuring the frequency of multilingual instances. Second, once enough multilingual instances are encountered, it aims to describe and interpret a selection of them. 120 posts and 16,335 comments were collected from two Facebook pages. Analysis revealed that a third of the collected data are multilingual messages. Facebook users mixed between the four codes mentioned when writing their messages; the most frequent cases are mixing between Algerian Arabic and French and between Algerian Arabic and Modern Standard Arabic.
A focused qualitative analysis followed, in which selected examples are interpreted and explained. It seems that Algerians mix between codes when communicating online even though written online communication is a conscious type of communication. This suggests that such behavior is not a random and corrupted way of communicating but rather an intentional and natural one.

Keywords: Algerian speech community, computer mediated communication, languages in contact, multilingualism, translanguaging

Procedia PDF Downloads 131
27099 Study on Surface Morphology and Reflectance of Solar Cells Applied in Pyramid Structures

Authors: Zong-Sheng Chen

Abstract:

With the advancement of technology, human activities have increased greenhouse gas emissions and fossil fuel energy consumption, leading to increasingly severe global warming. To mitigate global warming, energy conservation and carbon reduction have become global goals. Solar energy, a renewable energy source, not only supports energy conservation and carbon reduction but also serves as an efficient means of energy generation. Derived from sunlight, it is an inexhaustible and promising energy source capable of meeting high energy demands sustainably. In recent years, many countries around the world have been developing the solar energy industry, and Taiwan is no exception. Positioned in the subtropical region, Taiwan possesses geographical advantages conducive to solar energy utilization. Furthermore, Taiwan's well-developed semiconductor technology and sophisticated equipment make it highly suitable for the development of high-efficiency solar cells. This study investigates the anti-reflection properties of solar cells. Through metal-assisted chemical etching, pyramid structures are formed on the surface so that incident sunlight undergoes secondary or higher-order reflections between the structures. Trapping light within the substrate in this way reduces the reflectance and increases the conversion efficiency.

Keywords: solar cell, reflectance, pyramidal structure, potassium hydroxide

Procedia PDF Downloads 67
27098 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from dynamical systems. We investigate the different distributions of output measurements from several dynamical systems. By variance processing of the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, the effect of the spread of the measurements, such as the variance, on identification, and the limitations of this approach, are explained.
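One way to realize the variance-processing step is a blocked-window scan: windows whose output variance greatly exceeds the typical window variance mark the candidate nonlinearity region. The signal and threshold below are synthetic illustrations, not the paper's experimental data.

```python
import numpy as np

def high_variance_region(y, window=20, factor=3.0):
    # flag blocked windows whose variance exceeds `factor` times the
    # median window variance -- a crude marker of the nonlinear region
    variances = np.array([y[i:i + window].var()
                          for i in range(0, len(y) - window, window)])
    return np.where(variances > factor * np.median(variances))[0]

# synthetic output: linear response that turns strongly oscillatory
x = np.linspace(0, 10, 400)
y = 0.5 * x + np.where(x > 7, 3.0 * np.sin(5 * x), 0.0)
print(high_variance_region(y))   # windows covering the x > 7 region
```

Once the high-variance region is isolated, a separate (e.g. nonlinear or particle-filter-based) identification can be applied to it, while the remaining data keeps a simpler parametric model.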

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 516
27097 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premises or in the cloud, users can leverage the power of R at scale without moving their data.

Keywords: predictive maintenance, machine learning, big data, cloud-based, on-premises solution, R

Procedia PDF Downloads 379
27096 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. This paper examines different aspects of the concept of trust and describes the information asymmetry among the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors relating to trust, problematic aspects of the current approach are verified through interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and reveal potential research areas.

Keywords: trust, data mining, CRISP DM, stakeholder management

Procedia PDF Downloads 94
27095 Occurrence and Geological Setting of the Black Shales Outcrops in Malaysia

Authors: Hassan M. Baioumy, Yuniarti Ulfa

Abstract:

Paleozoic, Mesozoic, and Cenozoic black shales, which can be a potential source of energy and precious metals, are widely distributed in Peninsular Malaysia, Sarawak, and Sabah. Two Paleozoic black shale outcrops were reported in Langkawi Island, belonging to the Cambrian fluvial Machinchang Formation and the Silurian glaciomarine Singa Formation. More than seventeen occurrences of Paleozoic black shale outcrops have been found in Peninsular Malaysia, ranging in age from Devonian to Carboniferous and Permian, in the Terengganu, Perlis, Pahang, and Perak states. Mesozoic black shale outcrops occur in several places in both Peninsular Malaysia and Sarawak. In Peninsular Malaysia, Triassic black shales occur in the Nami area, northern Kedah, and in the Pahang area. In Sarawak, Triassic black shales have been reported in the Bau area. Cenozoic black shale outcrops were reported in both Sarawak, at the Miri area, and Sabah, at the Ranau and Tenom areas. Preliminary mineralogical and geochemical investigations of some of these outcrops showed distinct compositional variations among them, probably due to variations in source area composition and/or in the depositional and diagenetic settings of the shales. Some of these shales were also subjected to post-depositional hydrothermal mineralization that enriched them with Au-bearing minerals such as pyrite, chalcopyrite, and arsenopyrite. Many of the studied black shale outcrops appear rich in organic matter, which increases the possibility of using them as an unconventional energy resource.

Keywords: black shales, energy, mineralization, Malaysia

Procedia PDF Downloads 428
27094 Intelligent Software Architecture and Automatic Re-Architecting Based on Machine Learning

Authors: Gebremeskel Hagos Gebremedhin, Feng Chong, Heyan Huang

Abstract:

A software system is the combination of an architecture and organized components that accomplish a specific function or set of functions. A good software architecture facilitates application system development, promotes achievement of functional requirements, and supports system reconfiguration. We describe three studies demonstrating the utility of our architecture in the subdomain of mobile office robots and identify software engineering principles embodied in the architecture. The main aim of this paper is to analyze proven architecture design and automatic re-architecting using machine learning. Intelligent software architecture and the automatic re-architecting process reorganize the software organizational structure into a more suitable one, using the user-access dataset to create relationships among the components of the system. A three-step data mining approach was used to analyze effective recovery, transformation, and implantation with the use of a clustering algorithm. Therefore, automatic re-architecting without changing the source code makes it possible to address the software complexity problem and promote software reuse.
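The clustering step over user-access data can be sketched with a minimal k-means, a stand-in for whatever clustering algorithm the study used; the components and access counts below are invented for illustration.

```python
import numpy as np

# rows: components, columns: hypothetical user-access features
# (e.g. counts of UI events vs. data-layer calls per component)
access = np.array([[40.0, 2.0], [38.0, 3.0], [42.0, 1.0],
                   [2.0, 35.0], [1.0, 40.0], [3.0, 38.0]])

def kmeans(X, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each component to its nearest center, then re-center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(access, 2)
print(labels)   # two groups of components emerge from access patterns
```

Components that end up in the same cluster are candidates for the same architectural module, which is the relationship the re-architecting step exploits without touching the source code.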

Keywords: intelligence, software architecture, re-architecting, software reuse, High level design

Procedia PDF Downloads 119
27093 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format that needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the period that the processing takes place. This decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on performance in terms of CPU, storage, and processing time.
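A minimal sketch of the three-step pattern (with a toy schema and transformation, not the original system's code): pull raw rows into an in-memory list, process them off-database, then push results back in a single batch so the database stays locked only briefly.

```python
import sqlite3

def pull(conn):
    # step 1: read raw rows into an in-memory list, then release the DB
    return conn.execute("SELECT id, raw FROM telemetry").fetchall()

def process(rows):
    # step 2: decode outside the database (toy transformation here)
    return [(raw.upper(), rid) for rid, raw in rows]

def push(conn, processed):
    # step 3: write all results back in one short batch
    conn.executemany("UPDATE telemetry SET decoded = ? WHERE id = ?", processed)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE telemetry (id INTEGER PRIMARY KEY, raw TEXT, decoded TEXT)")
conn.executemany("INSERT INTO telemetry (raw) VALUES (?)", [("gps:a",), ("gps:b",)])

push(conn, process(pull(conn)))
print(conn.execute("SELECT decoded FROM telemetry ORDER BY id").fetchall())
# → [('GPS:A',), ('GPS:B',)]
```

In the single-step variant, the decoding in `process` would run inside the database round-trip, holding locks for the whole duration; splitting the steps confines lock time to `pull` and `push`.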

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 244
27092 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw data and simulated data) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. In the stationary model, the return values are constant over time for the raw data, while for the simulated data they show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is never exceeded. These results are vital for agricultural and environmental research.
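The block-maxima workflow can be sketched with SciPy on synthetic annual maxima rather than the CDC records. Note that SciPy's shape parameter `c` has the opposite sign to the usual GEV ξ, so `c > 0` corresponds to the upper-bounded tail discussed above.

```python
import numpy as np
from scipy.stats import genextreme

# synthetic annual maximum temperatures (°C); with SciPy's convention a
# positive shape parameter c gives an upper-bounded (Weibull-type) tail
rng = np.random.default_rng(42)
annual_max = genextreme.rvs(0.1, loc=33.0, scale=1.2, size=60, random_state=rng)

shape, loc, scale = genextreme.fit(annual_max)

# T-year return level: the quantile exceeded on average once every T years
for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} °C")
```

With a fitted `c > 0` the return levels increase with T but approach the finite bound `loc + scale / c`, mirroring the "increasing trend with an upper bound" reported in the abstract.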

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 478
27091 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years, because stack and non-stack data have different properties that matter for cache utilization, and the two may interfere with each other's locality in the data cache. An important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into a stack cache and a non-stack cache in order to keep stack data and non-stack data separate. We observe that the overall hit rate of the non-unified design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, more than a 99% stack cache hit rate is achieved with 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified design with a 1 KB stack cache added improves on average by approximately 3.9% for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
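The kind of measurement reported here can be reproduced in miniature with a direct-mapped (1-way) cache model. The trace below is a synthetic stack-like access pattern, not the Rijndael benchmark, and the 2 KB / 32-byte-line geometry is just one plausible configuration.

```python
def hit_rate(addresses, cache_bytes=2048, line_bytes=32):
    # direct-mapped (1-way) cache: one line per set, indexed by line number
    n_sets = cache_bytes // line_bytes
    tags = [None] * n_sets
    hits = 0
    for addr in addresses:
        line = addr // line_bytes
        if tags[line % n_sets] == line:
            hits += 1
        else:
            tags[line % n_sets] = line   # miss: fill the set with this line
    return hits / len(addresses)

# synthetic stack-like trace: tight, repeated accesses near the stack top
stack_trace = [0x7000 - 4 * (i % 64) for i in range(10_000)]
print(f"stack-trace hit rate: {hit_rate(stack_trace):.4f}")
```

Because the working set of a stack (a few hundred bytes near the stack pointer) fits comfortably in 2 KB, almost every access after the first pass is a hit, which is the intuition behind the abstract's finding that a small 1-way stack cache suffices.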

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 303