Search results for: data analyses
25369 Power HEMTs Transistors for Radar Applications
Authors: A. boursali, A. Guen Bouazza, M. Khaouani, Z. Kourdi, B. Bouazza
Abstract:
This paper presents the design, development, and characterization, through device simulation, of High Electron Mobility Transistor (HEMT) devices for X-band radar applications, examining the effect of an InAlN/GaN structure on the RF performance of the device. Systematic investigations of the small-signal and power performance as functions of drain bias are presented, and the devices were improved for X-band applications. A Power Added Efficiency (PAE) of over 23% was achieved in the X-band. A developed device combining two InAlN/GaN HEMTs of 30 nm gate periphery exhibited an output power of over 50 W, and an InAlN/GaN HEMT with 30 nm gate periphery was developed that exhibited an output power of over 120 W.
Keywords: InAlN/GaN, HEMT, RF analyses, PAE, X-Band, radar
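For reference, the Power Added Efficiency figure quoted in this abstract follows the standard definition PAE = (P_out − P_in) / P_DC. A minimal Python sketch is given below; the input-drive and DC supply values in the example are illustrative assumptions, not figures from the paper.

```python
# Power Added Efficiency (PAE) of an RF power stage such as a HEMT amplifier.
# Input and DC power values below are illustrative assumptions only.
def power_added_efficiency(p_out_w: float, p_in_w: float, p_dc_w: float) -> float:
    """PAE = (P_out - P_in) / P_DC, expressed in percent."""
    return 100.0 * (p_out_w - p_in_w) / p_dc_w

# Example: 50 W delivered from 2 W of drive and 200 W of DC supply
print(f"PAE = {power_added_efficiency(50.0, 2.0, 200.0):.1f}%")  # -> PAE = 24.0%
```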
Procedia PDF Downloads 560
25368 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa
Authors: Samy A. Khalil, U. Ali Rahoma
Abstract:
Alongside ground measurements of solar radiation, satellite data have routinely been utilized to estimate solar energy. However, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observation data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the ERA-5 reanalysis dataset for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa across ten years, from 2011 to 2020. To investigate seasonal change in dataset performance, a seasonal statistical analysis was conducted, which showed a considerable difference in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset: monthly mean values indicate better performance, but data accuracy is degraded. Solar resource assessment and power estimation using the ERA-5 solar radiation data are discussed. The average values of the mean bias error (MBE), root mean square error (RMSE) and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period. The correlation coefficient (R2) varies from 0.93 to 0.99 over the study period. This research's objective is to provide a reliable representation of the world's solar radiation to aid in the use of solar energy in all sectors.
Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa
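The MBE, RMSE, MAE and R² statistics reported above can be computed as sketched below; this is a minimal example assuming NumPy arrays of co-located ground observations and ERA-5 estimates, with purely illustrative values.

```python
import numpy as np

def validation_stats(observed: np.ndarray, estimated: np.ndarray):
    """MBE, RMSE, MAE and R^2 between ground observations and reanalysis estimates."""
    err = estimated - observed
    mbe = err.mean()                                   # mean bias error
    rmse = np.sqrt((err ** 2).mean())                  # root mean square error
    mae = np.abs(err).mean()                           # mean absolute error
    r2 = 1.0 - (err ** 2).sum() / ((observed - observed.mean()) ** 2).sum()
    return mbe, rmse, mae, r2

# Illustrative daily GSR values (kWh/m^2), not data from the study
obs = np.array([5.1, 6.3, 7.0, 6.8, 5.9])
est = np.array([5.3, 6.1, 7.2, 6.6, 6.0])
print(validation_stats(obs, est))
```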
Procedia PDF Downloads 98
25367 Exploring Communities of Practice through Public Health Walks for Nurse Education
Authors: Jacqueline P. Davies
Abstract:
Introduction: Student nurses must develop skills in observation, communication and reflection, as well as public health knowledge, from their first year of training. This paper will explain a method developed for students to collect their own findings about public health in urban areas. These areas are rich in the history of the old public health that informs the content of many traditional public health walks, but are also locations where new public health concerns about chronic disease are concentrated. The learning method explained in this paper enables students to collect their own data and write original work as first-year students. Examples of their findings will be given. Methodology: In small groups, health care students are instructed to walk in neighbourhoods near the hospitals they will soon attend as apprentice nurses. On their walks, they wander slowly, engage in conversations, and enter places open to the public. As they drift, they observe with all five senses in the real, three-dimensional world to collect data for their reflective accounts of old and new public health. They are encouraged to stop for refreshments and taste, as well as look, hear, smell, and touch while on their walk. They reflect as a group and later develop individual reflective accounts in which they write up their deep reflections about what they observed on their walk. In preparation for their walk, they are encouraged to look at studies of quality of life and other neighbourhood statistics, as well as undertaking a risk assessment for their walk. Findings: Reflecting on their walks, students apply theoretical concepts around social determinants of health and health inequalities to develop their understanding of communities in the neighbourhoods visited. They write about the treasured historical architecture made of stone, bronze and marble which has outlived those who built it, but also how the streets are used now. The students develop their observations into thematic analyses such as: what we drink, as illustrated by the empty coke can tossed into a now disused drinking fountain; the shift in home-life balance, illustrated by streets where families once lived over the shop which are now walked by commuters weaving around each other as they talk on their mobile phones; and security on the street, with CCTV cameras placed at regular intervals, signs warning trespassers and barbed wire, but little evidence of local people watching the street. Conclusion: In evaluations of their first year, students have reported the health walk as one of their best experiences. The innovative approach was commended by the UK governing body of nurse education, and it received a quality award from the nurse education funding body. This approach to education allows students to develop skills in the real world and write original work.
Keywords: education, innovation, nursing, urban
Procedia PDF Downloads 287
25366 A Regression Model for Residual-State Creep Failure
Authors: Deepak Raj Bhat, Ryuichi Yatabe
Abstract:
In this study, a residual-state creep failure model was developed based on residual-state creep test results for clayey soils. To develop the proposed model, regression analyses were performed in R. The model results for the failure time (tf) and critical displacement (δc) were compared with experimental results and found to be in close agreement with each other. It is expected that the proposed regression model for residual-state creep failure will be useful for predicting the displacement of different clayey soils in the future.
Keywords: regression model, residual-state creep failure, displacement prediction, clayey soils
Procedia PDF Downloads 408
25365 DNA of Hibiscus sabdariffa Damaged by Radiation from 900 MHz GSM Antenna
Authors: A. O. Oluwajobi, O. A. Falusi, N. A. Zubbair, T. Owoeye, F. Ladejobi, M. C. Dangana, A. Abubakar
Abstract:
The technology of mobile telephony has positively enhanced human life, yet reports on the biosafety of the radiation from its antennae have been contradictory, leading to serious litigation and violent protests by residents in several parts of the world. The demand for more information, as requested by the WHO in order to resolve this issue, formed the basis for this study on the effect of the radiation from a 900 MHz GSM antenna on the DNA of Hibiscus sabdariffa. Seeds of H. sabdariffa were raised in pots placed in three replicates at 100, 200, 300 and 400 metres from the GSM antennae in three selected test locations and a control where there was no GSM signal. The temperature (˚C) and relative humidity (%) of the study sites were measured for the period of study (24 weeks). Fresh young leaves were harvested from each plant at two, eight and twenty-four weeks after sowing, and the DNA extracts were subjected to RAPD-PCR analyses. There were no significant differences between the weather conditions (temperature and relative humidity) in the study locations. However, significant differences were observed in the intensities of radiation between the control (less than 0.02 V/m) and the test (0.40-1.01 V/m) locations. Data obtained showed that the DNA of samples exposed to radiation from the GSM antenna had various levels of distortion, estimated at 91.67%. Distortions occurred in 58.33% of the samples between 2-8 weeks of exposure, while 33.33% of the samples were distorted between 8-24 weeks of exposure. Approximately 8.33% of the samples did not show distortions in DNA, while 33.33% of the samples had their DNA damaged twice, both at 8 and at 24 weeks of exposure. The study showed that radiation from the 900 MHz GSM antenna is potent enough to cause distortions to the DNA of H. sabdariffa even within 2-8 weeks of exposure. DNA damage was also independent of the distance from the antenna. These observations would qualify emissions from GSM masts as an environmental hazard to plant biodiversity and all life forms in general. These results should trigger efforts to prevent further erosion of plant genetic resources, which threatens food security, and to address the risks posed to living organisms, keeping our environment safe while we continue to enjoy the benefits of GSM technology.
Keywords: damage, DNA, GSM antenna, radiation
Procedia PDF Downloads 339
25364 Algorithm Optimization to Sort in Parallel by Decreasing the Number of the Processors in SIMD (Single Instruction Multiple Data) Systems
Authors: Ali Hosseini
Abstract:
Parallelization is a mechanism for decreasing the time needed to execute programs. Sorting is one of the important operations used across systems, in that the proper functioning of many algorithms and operations depends on sorted data. The CRCW_SORT algorithm sorts ‘N’ elements in O(1) time on SIMD (Single Instruction Multiple Data) computers using n^2/2-n/2 processors. In this article, by presenting a mechanism that divides the input string at a hinge element into two smaller strings, the number of processors used to sort ‘N’ elements in O(1) time is decreased to n^2/8-n/4 in the best case; under this mechanism, the best case occurs when the hinge element is the middle one, and the worst case when it is the minimum. The findings from assessing the proposed algorithm against other methods, with respect to the data collection and the number of processors, indicate that the proposed algorithm uses fewer processors to sort during execution than other methods.
Keywords: CRCW, SIMD (Single Instruction Multiple Data) computers, parallel computers, number of the processors
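To illustrate the idea, the sketch below sequentially simulates a rank (enumeration) sort of the kind a CRCW machine performs in O(1) parallel time, preceded by a hinge partition: with the hinge near the median, each half needs roughly a quarter of the pairwise comparisons, mirroring the reduction from n²/2−n/2 to n²/8−n/4 processors. This is an assumed illustration, not the paper's implementation.

```python
def rank_sort(a):
    # One conceptual processor per (i, j) pair computes each element's rank
    # (number of smaller elements, ties broken by index); O(n^2) comparisons.
    out = [None] * len(a)
    for i, x in enumerate(a):
        r = sum(1 for j, y in enumerate(a) if y < x or (y == x and j < i))
        out[r] = x
    return out

def hinge_rank_sort(a):
    if len(a) <= 1:
        return list(a)
    hinge = sorted(a)[len(a) // 2]  # stand-in for a median hinge (best case)
    left = [x for x in a if x < hinge]
    right = [x for x in a if x >= hinge]
    # Two halves of size ~n/2 need ~2*(n/2)^2 = n^2/2 comparisons vs. n^2.
    return rank_sort(left) + rank_sort(right)

print(hinge_rank_sort([7, 3, 9, 1, 4, 8, 2, 6]))  # [1, 2, 3, 4, 6, 7, 8, 9]
```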
Procedia PDF Downloads 310
25363 Increasing the System Availability of Data Centers by Using Virtualization Technologies
Authors: Chris Ewe, Naoum Jamous, Holger Schrödl
Abstract:
Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply staying in the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement. A central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper aims to introduce a basic model that shows these connections and considers whether the identified effects are positive or negative. Thus, this work also points out possible disadvantages of the technology. In consequence, the paper shows opportunities as well as risks of data center virtualization in relation to system availability.
Keywords: availability, cloud computing IT service, quality of service, service level agreement, virtualization
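As background to the availability discussion, the conventional MTBF/MTTR arithmetic, with series and parallel (redundant) combinations of components, can be sketched as below. This is the standard textbook model with illustrative numbers, not the specific model the paper introduces.

```python
# Conventional availability arithmetic for data-center components.
def availability(mtbf_h: float, mttr_h: float) -> float:
    return mtbf_h / (mtbf_h + mttr_h)

def series(*avails):    # all components must be up
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel(*avails):  # at least one redundant component must be up
    q = 1.0
    for a in avails:
        q *= (1.0 - a)
    return 1.0 - q

host = availability(10_000, 8)       # a single physical host (illustrative)
pair = parallel(host, host)          # two hosts running replicated VMs
print(f"single host: {host:.5f}, redundant pair: {pair:.7f}")
```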
Procedia PDF Downloads 537
25362 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt
Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer
Abstract:
Crowd-sourced data refers to data collected and shared by a large number of individuals or organizations, often through the use of digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas, where the majority of road crashes occur. This study is, to the best of our knowledge, the first to develop safety performance functions using crowd-sourced data, adopting a negative binomial structure model and the Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) as well as a graphical method using GIS tools, for comparison and validation. Lastly, recommendations were suggested for policymakers to ensure safer roads.
Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening
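A safety performance function of the negative binomial form typically takes E[crashes] = exp(β₀ + β₁·ln(AADT) + β₂·ln(length)). A minimal sketch of fitting one with statsmodels follows; the data are entirely synthetic stand-ins for the crowd-sourced counts, and the dispersion parameter is an assumed value.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
aadt = rng.uniform(5_000, 60_000, n)       # traffic volume per segment
length_km = rng.uniform(0.2, 3.0, n)       # segment length
mu = np.exp(-6.0 + 0.8 * np.log(aadt) + 1.0 * np.log(length_km))
crashes = rng.poisson(mu)                  # synthetic crash counts

X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length_km)]))
spf = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(spf.params)  # fitted beta_0, beta_AADT, beta_length of the SPF
```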
Procedia PDF Downloads 52
25361 Review of Different Machine Learning Algorithms
Authors: Syed Romat Ali Shah, Bilal Shoaib, Saleem Akhtar, Munib Ahmad, Shahan Sadiqui
Abstract:
Classification is a data mining technique based on Machine Learning (ML) algorithms. It is used to classify individual items in a known collection of information into a set of predefined classes or groups. Web mining is also a branch of this family of data mining methods. The main purpose of this paper is to analyse and compare the performance of the Naïve Bayes algorithm, Decision Tree, K-Nearest Neighbor (KNN), Artificial Neural Network (ANN) and Support Vector Machine (SVM). This paper reviews these ML algorithms with their advantages and disadvantages, and also defines research issues.
Keywords: Data Mining, Web Mining, classification, ML Algorithms
Procedia PDF Downloads 303
25360 Using Genetic Algorithms and Rough Set Based Fuzzy K-Modes to Improve Centroid Model Clustering Performance on Categorical Data
Authors: Rishabh Srivastav, Divyam Sharma
Abstract:
We propose an algorithm to cluster categorical data, named ‘Genetic algorithm initialized rough set based fuzzy K-Modes for categorical data’. We propose an amalgamation of the simple K-modes algorithm, the rough and fuzzy set based K-modes, and the genetic algorithm to form a new algorithm which, we hypothesise, will provide better centroid-model clustering results than existing standard algorithms. In the proposed algorithm, the initialization and updating of modes is done using genetic algorithms, while the membership values are calculated using rough sets and fuzzy logic.
Keywords: categorical data, fuzzy logic, genetic algorithm, K modes clustering, rough sets
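A minimal sketch of the fuzzy K-modes membership update at the core of such an algorithm is shown below, using a simple mismatch distance for categorical records. The genetic-algorithm initialization and rough-set components of the proposed method are omitted, and the fuzziness exponent is an assumed value.

```python
import numpy as np

def matching_distance(x, mode):
    """Mismatch count between a categorical record and a cluster mode."""
    return sum(a != b for a, b in zip(x, mode)) + 1e-9  # avoid division by zero

def fuzzy_memberships(X, modes, m=1.5):
    """One fuzzy K-modes membership update: u_ij = 1 / sum_k (d_ij/d_ik)^(1/(m-1))."""
    U = np.zeros((len(X), len(modes)))
    for i, x in enumerate(X):
        d = np.array([matching_distance(x, z) for z in modes])
        ratios = (d[:, None] / d[None, :]) ** (1.0 / (m - 1))
        U[i] = 1.0 / ratios.sum(axis=1)
    return U

X = [("red", "small"), ("red", "large"), ("blue", "large")]
modes = [("red", "small"), ("blue", "large")]
print(fuzzy_memberships(X, modes))  # each row sums to 1 across the two modes
```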
Procedia PDF Downloads 247
25359 Forecasting Amman Stock Market Data Using a Hybrid Method
Authors: Ahmad Awajan, Sadam Al Wadi
Abstract:
In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted using the HW technique. Finally, the forecast values are aggregated to obtain the forecast of the stock market data. Empirical results showed that EMD-HW outperforms the individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy compared with eight existing forecasting methods, based on five forecast error measures.
Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series
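A minimal sketch of the EMD-HW pipeline follows, assuming the PyEMD and statsmodels packages; the synthetic series, forecast horizon, and Holt-Winters settings are illustrative, not taken from the paper.

```python
import numpy as np
from PyEMD import EMD
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def emd_hw_forecast(series: np.ndarray, horizon: int = 5) -> np.ndarray:
    imfs = EMD().emd(series)                 # Intrinsic Mode Functions
    residual = series - imfs.sum(axis=0)     # whatever EMD leaves behind
    forecast = np.zeros(horizon)
    for comp in list(imfs) + [residual]:     # Holt-Winters on each component
        model = ExponentialSmoothing(comp, trend="add").fit()
        forecast += model.forecast(horizon)  # aggregate = final forecast
    return forecast

# Synthetic price-like series standing in for Amman stock market data
prices = np.cumsum(np.random.default_rng(1).normal(0.1, 1.0, 300)) + 100
print(emd_hw_forecast(prices))
```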
Procedia PDF Downloads 129
25358 Functional Surfaces and Edges for Cutting and Forming Tools Created Using Directed Energy Deposition
Authors: Michal Brazda, Miroslav Urbanek, Martina Koukolikova
Abstract:
This work focuses on the development of functional surfaces and edges for cutting and forming tools created through Directed Energy Deposition (DED) technology. In the context of the growing challenges in modern engineering, additive technologies, especially DED, present an innovative approach to manufacturing tools for forming and cutting. One of the key features of DED is its ability to precisely and efficiently deposit fully dense metal from powder feedstock, enabling the creation of complex geometries and optimized designs. It is gradually becoming an increasingly attractive choice for tool production due to its ability to achieve high precision while minimizing waste and material costs. Tools created using DED technology gain significant durability through the utilization of high-performance materials such as nickel alloys and tool steels. For high-temperature applications, the Nimonic 80A alloy is applied, while for cold applications, M2 tool steel is used. The addition of ceramic materials, such as tungsten carbide, can significantly increase the tool's resistance. The introduction of functionally graded materials is a significant contribution, opening up new possibilities for gradual changes in the mechanical properties of the tool and optimizing its performance in different sections according to specific requirements. This work gives an overview of individual applications and their utilization in industry. Microstructural analyses have been conducted, providing detailed insights into the structure of individual components, alongside examinations of the mechanical properties and tool life. These analyses offer a deeper understanding of the efficiency and reliability of the created tools, which is a key element for successful development in the field of cutting and forming tools. Producing functional surfaces and edges using DED technology can result in financial savings, as the entire tool does not have to be manufactured from expensive special alloys: the tool can be made from common steel, onto which a functional surface of special materials is applied. Additionally, it allows for tool repair after wear, eliminating the need to produce a new part, contributing to overall cost savings, and reducing the environmental footprint. Overall, the combination of DED technology, functionally graded materials, and verified technologies sets a new standard for the innovative and efficient development of cutting and forming tools in the modern industrial environment.
Keywords: additive manufacturing, directed energy deposition, DED, laser, cutting tools, forming tools, steel, nickel alloy
Procedia PDF Downloads 50
25357 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems
Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell
Abstract:
Today’s facilities are ever more sophisticated, and the need for available and reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate and complete information to perform their day-to-day activities and to provide their senior management with accurate information for the decision-making process. Currently, there are various technology platforms, data repositories, or database systems, such as Computer-Aided Facility Management (CAFM), that are used for these purposes in different facilities. In most current practice, the data is extracted from paper construction documents and re-entered manually in one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for owners' and facility managers' use. Recently, software vendors have developed add-in applications to generate the COBie spreadsheet automatically. However, most of these add-in applications are capable of generating only a limited amount of COBie data, and considerable time is still required to enter the remaining data manually to complete the COBie spreadsheet. Some of the data which cannot be generated by these COBie add-ins is essential for facilities managers' day-to-day activities, such as the job sheet, which includes preventive maintenance schedules. To facilitate a seamless data transfer between BIM models and facilities management systems, we developed a framework that enables automated data generation, using the data extracted directly from BIM models to populate an external web database, and then enables different stakeholders to access the external web database to enter the required asset data directly, generating a rich COBie spreadsheet that contains most of the required asset data for efficient facilities management operations. The proposed framework is part of ongoing research and will be demonstrated and validated on a typical university building. Moreover, the proposed framework supplements the existing body of knowledge in the facilities management domain by providing a novel framework that facilitates seamless data transfer between BIM models and facilities management systems.
Keywords: building information modeling, BIM, facilities management systems, interoperability, information management
Procedia PDF Downloads 116
25356 Finite Element Analysis of a Glass Facades Supported by Pre-Tensioned Cable Trusses
Authors: Khair Al-Deen Bsisu, Osama Mahmoud Abuzeid
Abstract:
Significant technological advances have been achieved in the design and construction of steel and glass buildings in the last two decades. The metal glass support frame has been replaced by more sophisticated technological solutions, for example, point-fixed glazing systems. The minimization of visual mass has been taken to great lengths through the evolution of technology in glass production and a better understanding of the structural potential of glass itself, the technological development of bolted fixings, the introduction of glazing support attachments for glass suspension systems, and the use of cables for structural stabilization, which reduce the amount of metal used to a minimum. The variability of solutions for tension structures, allied to the difficulties related to geometric and material non-linear behavior, usually overrules the use of analytical solutions, leaving numerical analysis as the only general approach to the design and analysis of tension structures. With the characteristics of low stiffness, light weight and small damping, tension structures are obviously geometrically nonlinear. In fact, the analysis of a cable truss is not only one of the most difficult nonlinear analyses, because the analysis path may have rigid-body modes, but also a time-consuming procedure. Non-linear theory allowing for large deflections is used. The flexibility of the supporting members was observed to influence the stresses in the pane considerably in some cases. No other class of architectural structural system is as dependent upon the use of digital computers as tensile structures. Besides complexity, the process of design and analysis of tension structures presents a series of specificities, which usually lead to the use of special-purpose programs instead of general-purpose programs (GPPs), such as ANSYS. In a special-purpose program, part of the design know-how is embedded in program routines. It is very probable that this type of program will be the option of the final user in design offices. GPPs offer a range of types of analyses and modeling options. Besides, traditional GPPs are constantly being tested by a large number of users and are updated according to their actual demands. This work discusses the use of ANSYS for the analysis and design of tension structures, such as cable truss structures under wind and gravity loadings. A model to describe the glass panels working in coordination with the cable truss was proposed. Under the proposed model, a FEM model of the glass panels working in coordination with the cable truss was established.
Keywords: glass construction material, facades, finite element, pre-tensioned cable truss
Procedia PDF Downloads 280
25355 Investigating Cloud Forensics: Challenges, Tools, and Practical Case Studies
Authors: Noha Badkook, Maryam Alsubaie, Samaher Dawood, Enas Khairallah
Abstract:
Cloud computing has introduced transformative benefits in data storage and accessibility while posing unique forensic challenges. This paper explores cloud forensics, focusing on investigating and analyzing evidence from cloud environments to address issues such as unauthorized data access, manipulation, and breaches. The research highlights the practical use of open-source forensic tools like Autopsy and Bulk Extractor in real-world scenarios, including unauthorized data sharing via Google Drive and the misuse of personal cloud storage for sensitive information leaks. This work underscores the growing importance of robust forensic procedures and accessible tools in ensuring data security and accountability in cloud ecosystems.
Keywords: cloud forensic, tools, challenge, autopsy, bulk extractor
Procedia PDF Downloads 2
25354 National Digital Soil Mapping Initiatives in Europe: A Review and Some Examples
Authors: Dominique Arrouays, Songchao Chen, Anne C. Richer-De-Forges
Abstract:
Soils are at the crossroads of many issues, such as food and water security, sustainable energy, climate change mitigation and adaptation, biodiversity protection, and human health and well-being. They deliver many ecosystem services that are essential to life on Earth. Therefore, there is a growing demand for soil information on national and global scales. Unfortunately, many countries do not have detailed soil maps, and, where they exist, these maps are generally based on more or less complex and often non-harmonized soil classifications. An estimate of their uncertainty is also often missing. Thus, they are not easy to understand and are often not properly used by end-users. Therefore, there is an urgent need to provide end-users with spatially exhaustive grids of essential soil properties, together with an estimate of their uncertainty. One way to achieve this is digital soil mapping (DSM). The concept of DSM relies on the hypothesis that soils and their properties are not randomly distributed but depend on the main soil-forming factors: climate, organisms, relief, parent material, time (age), and position in space. All these forming factors can be approximated using several exhaustive spatial products, such as climatic grids, remote sensing products or vegetation maps, digital elevation models, geological or lithological maps, spatial coordinates of soil information, etc. Thus, DSM generally relies on models calibrated with existing observed soil data (point observations or maps) and so-called "ancillary covariates" that come from other available spatial products. The model is then generalized over grids where the soil parameters are unknown in order to predict them, and the prediction performance is validated using various methods. With the growing demand for soil information at national and global scales and the increase in available spatial covariates, national and continental DSM initiatives are continuously multiplying. This short review illustrates the main national and continental advances in Europe, the diversity of the approaches and databases that are used, the validation techniques, and the main scientific and other issues. Examples from several countries illustrate the variety of products that were delivered during the last ten years. The scientific production on this topic is continuously increasing, and new models and approaches are being developed at an incredible speed. Most digital soil mapping (DSM) products rely mainly on machine learning (ML) prediction models and/or the use of pedotransfer functions (PTFs), in which calibration data come from soil analyses performed in labs or from existing conventional maps. However, some scientific issues remain to be solved, along with political and legal ones related, for instance, to data sharing and to different laws in different countries, and other issues related to communication with end-users and education, especially on the use of uncertainty. Overall, progress is very important, and the willingness of institutes and countries to join their efforts is increasing. Harmonization issues still remain, mainly due to differences in classifications or laboratory standards between countries, but numerous initiatives are ongoing at the EU level and also at the global level. All this progress is scientifically stimulating and also promising for providing tools to improve and monitor soil quality at the country, EU and global levels.
Keywords: digital soil mapping, global soil mapping, national and European initiatives, global soil mapping products, mini-review
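As an illustration of the machine learning core of DSM, a model is calibrated on point observations joined to spatial covariates and then generalized over grids. A minimal scikit-learn sketch with synthetic covariates and a synthetic soil property follows; all variable choices and values are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 500
covariates = np.column_stack([
    rng.uniform(0, 2000, n),   # elevation (m)
    rng.uniform(0, 30, n),     # slope (degrees)
    rng.uniform(0, 1, n),      # NDVI
    rng.uniform(5, 20, n),     # mean annual temperature (deg C)
])
# Synthetic soil organic carbon observations at the sampled points
soc = 80 - 0.02 * covariates[:, 0] + 5 * covariates[:, 2] + rng.normal(0, 3, n)

model = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
model.fit(covariates, soc)                 # calibrate on observed points
print("out-of-bag R^2:", round(model.oob_score_, 3))
# model.predict(grid_covariates) would then map SOC over unsampled grid cells
```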
Procedia PDF Downloads 184
25353 Using Audio-Visual Aids and Computer-Assisted Language Instruction to Overcome Learning Difficulties of Reading in Students of Special Needs
Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari
Abstract:
Background & aims: Reading is a receptive skill whose importance lies in abilities that may vary from the linguistic standard. Several pieces of evidence support the hypothesis that the more you read, the better you write, with different impacts for speech and language therapists (SLTs) who use audio-visual aids and computer-assisted language instruction (CALI) and those who do not. Methods: Here, we made use of audio-visual aids and CALI for teaching the reading skill to a group of 40 students of special needs of both sexes (aged between 8 and 18 years) at the al-Malādh school for teaching students of special needs in Dhamar (Yemen), while another group of the same number was taught using ordinary teaching methods. Pre- and post-tests were administered at the beginning and the end of the semester (before and after teaching the reading course). The purpose was to understand the differences between the levels of the students of special needs and to see to what extent audio-visual aids and CALI are useful for them. The two groups were taught by the same instructor under the same circumstances in the same school. Both quantitative and qualitative procedures were used to analyze the data. Results: The overall findings revealed that audio-visual aids and CALI are very useful for teaching reading to students of special needs, and this can be seen in the scores of the treatment group's subjects (7.0% in the post-test vs. 2.5% in the pre-test). In comparison to the scores of the second group's subjects (where audio-visual aids and CALI were not used) (2.2% in both pre- and post-tests), the first group's subjects mastered the reading tasks, as can be observed in their performance in the post-test. Compared with males, females' performance was better (1466 scores (7.3%) vs. 1371 scores (6.8%)). Qualitative and statistical analyses showed that this improvement is due to the use of audio-visual aids and CALI and nothing else. These outcomes confirm the significance of using audio-visual aids and CALI as effective means for teaching receptive skills in general and the reading skill in particular.
Keywords: reading, receptive skills, audio-visual aids, CALI, students, special needs, SLTs
Procedia PDF Downloads 49
25352 Electrospun TiO2/Nylon-6 Nanofiber Mat: Improved Hydrophilicity Properties
Authors: Roshank Haghighat, Laleh Maleknia
Abstract:
In this study, electrospun TiO2/nylon-6 nanofiber mats were successfully prepared. The nanofiber mats were characterized by SEM, FE-SEM, TEM, XRD, WCA, and EDX analyses. The results revealed that fibers of distinct sizes (nano and sub-nano scale) were obtained by varying the electrospinning parameters. The presence of a small amount of TiO2 in the nylon-6 solution was found to improve the hydrophilicity (antifouling effect), mechanical strength, and antimicrobial and UV-protecting ability of the electrospun mats. The resultant nylon-6/TiO2 antimicrobial spider-net-like composite mat with antifouling effect may be a potential candidate for future water filter applications, and its improved UV-blocking ability also makes it a potential candidate for protective clothing.
Keywords: electrospinning, hydrophilicity, antimicrobial, nanocomposite, nylon-6/TiO2
Procedia PDF Downloads 349
25351 Data Security and Privacy Challenges in Cloud Computing
Authors: Amir Rashid
Abstract:
Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. As of now, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue, we propose the platform of a trusted cloud computing program (TCCP). TCCP empowers Infrastructure as a Service (IaaS) providers, for example, Amazon EC2, to provide a closed-box execution environment that ensures confidential execution of guest virtual machines. Also, it permits clients to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes a Trusted Cloud Computing Platform (TCCP) for guaranteeing the privacy and trustworthiness of computed data that are outsourced to IaaS service providers. The TCCP provides the abstraction of a closed-box execution environment for a client's VM, ensuring that no privileged administrator of the cloud provider can examine or tamper with its data. Furthermore, before launching the VM, the TCCP permits a client to reliably and remotely verify that the provider at the backend is running a trusted TCCP. This capacity extends verification to the whole service and hence permits a client to confirm that its data operations run in a secure mode.
Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud
Procedia PDF Downloads 299
25350 Architectural Thinking in a Time of Climate Emergency
Authors: Manoj Parmar
Abstract:
The article uses reflexivity as a research method to investigate and propose an architectural theory plan for climate change. It hypothesizes that to discuss or formulate a discourse on "Architectural Thinking in a Time of Climate Emergency," we first need to understand the modes of integration that enable architectural thinking with climate change. The study intends to examine the various integration modes that have evolved historically and situate them in time. Subsequently, it analyses the integration pattern, challenges the existing model, and finds a way towards making climate change central to architectural thinking. The study is founded on the premise that ecology and climate change scholarship has consistently outpaced architecture's asymmetrical and nonlinear knowledge, and that architecture needs approaches that are less of a burden to the climate and to people and that minimize its impact on ecology.
Keywords: climate change, architectural theory, reflexivity, modernity
Procedia PDF Downloads 285
25349 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data, the Synthea Database, stored in Tigergraph. The Synthea DB dataset is leveraged due to its closer representation of real-time data and its volume. The graph model is built from the EHR heterogeneous data using Python modules, namely pyTigerGraph to get nodes and edges from the Tigergraph database, PyTorch to tensorize the nodes and edges, and PyTorch-Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually perform the node classification using the node embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
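A minimal PyTorch-Geometric sketch of GNN node classification on a toy EHR-style graph is given below. The graph, features, and labels are illustrative stand-ins; the Tigergraph extraction (pyTigerGraph) and the autoencoder embedding steps described in the abstract are omitted.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy graph: 4 patient nodes with 8 features each, edges for shared conditions
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
y = torch.tensor([0, 0, 1, 1])             # a binary condition label per node
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 16)
        self.conv2 = GCNConv(16, 2)         # two condition classes

    def forward(self, d):
        h = F.relu(self.conv1(d.x, d.edge_index))
        return self.conv2(h, d.edge_index)

model = GCN()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(100):                        # full-batch node classification
    opt.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    opt.step()
print(model(data).argmax(dim=1))            # predicted condition per patient
```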
Procedia PDF Downloads 86
25348 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector
Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar
Abstract:
Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector to utilize. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues. Many security issues stem from distributed systems in the healthcare industry, particularly concerning information security. People's data is especially sensitive in the healthcare industry. If important information gets leaked (e.g., IC, credit card number, address, etc.), a person's identity, financial status, and safety might be compromised. This results in the responsible organization losing a lot of money in compensating these people and even more resources expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, the usage of a blockchain network is explored to store the encryption key of the patient's data. As for the actual data, it is encrypted, and the encrypted data, called ciphertext, is stored in a cloud storage platform. Furthermore, there are some issues that have to be emphasized and tackled for future improvements, such as a multi-user scheme that could be proposed, authentication issues that have to be tackled, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability
Procedia PDF Downloads 184
25347 Investigation of Overstrength of Dual System by Non-Linear Static and Dynamic Analyses
Authors: Nina Øystad-Larsen, Miran Cemalovic, Amir M. Kaynia
Abstract:
The nonlinear static and dynamic analysis procedures presented in EN 1998-1 are assessed for the structural response of a RC wall-frame building. The structure is designed according to the guidelines for high ductility (DCH) in EN 1998-1. The finite element packages SeismoStruct and OpenSees are utilized and evaluated. The structural response remains nearly in the elastic range even though the building was designed for high ductility. The overstrength is a result of oversized and heavily reinforced members, with emphasis on the lower-storey walls. Nonlinear response history analyses in the two software packages give virtually identical results for displacements.
Keywords: behaviour factor, dual system, OpenSEES, overstrength, seismostruct
Procedia PDF Downloads 407
25346 Design and Implementation of a Geodatabase and WebGIS
Authors: Sajid Ali, Dietrich Schröder
Abstract:
The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have provided easy access to and sharing of geospatial data over the internet. However, a single platform for easy, multiple access to the data is lacking for the European Caribbean Association (Europaische Karibische Gesselschaft - EKG) to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for efficient dissemination and management of spatial data, and of a Web GIS using OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics of the required design for the geodatabase have been studied, and a specific methodology is given for designing the Web GIS. Finally, validation of this web-based geodatabase was performed with two desktop GIS software packages and a web map application, and it is also discussed that the contribution has all the desired modules to expedite further research in the area as per the requirements.
Keywords: desktop GIS software, European Caribbean association, geodatabase, OpenGeo suite, postgreSQL/PostGIS, webGIS, web map application
Procedia PDF Downloads 341
25345 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis
Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi
Abstract:
The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, there still exist limitations in data and research in LMICs, which pose a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and supporting the categorization and analysis of extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, and the analysis underscored the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the findability and accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve the interoperability and reusability of MH data. Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.
Keywords: longitudinal mental health research, data sharing, fair data principles, Africa, landscape analysis
Procedia PDF Downloads 91
25344 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 68
25343 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study
Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos
Abstract:
This article presents a method to analyze the use of indoor spaces based on data analytics obtained from in-built digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze information on the human real-time use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution for analyzing building interior spaces without incorporating external data collection systems such as sensors. The methodology is applied to a real case study of coliving, a residential building of 3000m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the different devices' platforms; the first step is to curate the data and understand what insights can be provided by each device according to the objectives of the study. This generates an analysis framework that can be scaled for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed according to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
Keywords: in-place devices, IoT, human-centred data-analytics, spatial design
Procedia PDF Downloads 197
25342 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce
Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada
Abstract:
With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
Keywords: distributed algorithm, MapReduce, multi-class, support vector machine
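A minimal sketch of a recursive binary-split SVM tree using scikit-learn follows. Note two simplifications: the split heuristic below simply halves the class list, whereas the paper chooses the best split greedily, and the MapReduce distribution layer is omitted; the training data are random stand-ins.

```python
import numpy as np
from sklearn.svm import LinearSVC

class BinarySplitSVM:
    """Tree of binary SVMs over recursive splits of the class set."""
    def __init__(self, classes):
        self.classes = list(classes)
        self.svm = self.left = self.right = None

    def fit(self, X, y):
        if len(self.classes) > 1:
            half = len(self.classes) // 2
            mask = np.isin(y, self.classes[:half])     # 1 = left class group
            self.svm = LinearSVC().fit(X, mask.astype(int))
            self.left = BinarySplitSVM(self.classes[:half]).fit(X[mask], y[mask])
            self.right = BinarySplitSVM(self.classes[half:]).fit(X[~mask], y[~mask])
        return self

    def predict_one(self, x):
        node = self
        while len(node.classes) > 1:
            node = node.left if node.svm.predict([x])[0] == 1 else node.right
        return node.classes[0]

X = np.random.randn(200, 4)
y = np.random.choice([0, 1, 2, 3], 200)      # random labels, illustration only
clf = BinarySplitSVM([0, 1, 2, 3]).fit(X, y)
print(clf.predict_one(X[0]))
```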
Procedia PDF Downloads 401
25341 Information Management Approach in the Prediction of Acute Appendicitis
Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki
Abstract:
This research aims at presenting a predictive data mining model to handle accurate diagnosis of acute appendicitis in patients, for the purpose of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is a common disease that requires timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory or imaging examination accurately confirms the diagnosis of acute appendicitis in all cases; this contributes to increasing morbidity and negative appendectomy. In this study, the authors propose to generate an accurate model for the prediction of patients with acute appendicitis which is based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems of osteoporosis, diabetes and heart disease obtained from the UCI data and other data sources.
Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree
Procedia PDF Downloads 351
25340 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting the data are different and dependent on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and the identification of the statistical significance of data reporting event cases. The Grubbs test is often used as it tests one extreme value at a time for exceeding the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where Grubbs outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select those forms of genetic algorithm construction that have greater potential to extract the best solution. For freight delivery management, schemas of genetic algorithm structure are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis which evaluates collected context data sets and uses this evaluation to determine a delivery corridor for the freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents a year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
Keywords: multi-objective, analysis, data flow, freight delivery, methodology
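The Grubbs validation step can be sketched as follows: a two-sided, single-outlier test at the paper's 99% confidence level. The fuel-consumption values are illustrative, not data from the study.

```python
import numpy as np
from scipy import stats

def grubbs_outlier(x: np.ndarray, alpha: float = 0.01):
    """Two-sided Grubbs test for one outlier; returns (index, value) or None."""
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.abs(x - mean).argmax())
    g = abs(x[idx] - mean) / sd                     # Grubbs statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)     # critical t value
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return (idx, float(x[idx])) if g > g_crit else None

fuel = np.array([31.2, 29.8, 30.5, 30.9, 47.6, 30.1, 29.4])  # litres/100 km
print(grubbs_outlier(fuel))  # flags the 47.6 reporting event as an outlier
```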
Procedia PDF Downloads 180