Search results for: open data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27146

24356 Evaluation of the Environmental Risk from the Co-Deposition of Waste Rock Material and Fly Ash

Authors: A. Mavrikos, N. Petsas, E. Kaltsi, D. Kaliampakos

Abstract:

The lignite-fired power plants in the Western Macedonia Lignite Center produce more than 8×10⁶ t of fly ash per year. Approximately 90% of this quantity is used for restoration and reclamation of exhausted open-cast lignite mines and for slope stabilization of the overburden. The purpose of this work is to evaluate the environmental behavior of the mixture of waste rock and fly ash that is being used in the external deposition site of the South Field lignite mine. For this reason, a borehole was drilled within the site, and 86 samples were taken and subjected to chemical analyses and leaching tests. The results showed very limited leaching of trace elements and heavy metals from this mixture. Moreover, when compared to the limit values set for waste acceptable in inert waste landfills, only a few exceedances were observed, indicating only a minor risk of groundwater pollution. However, due to the complexity of both the leaching process and the contaminant pathway, more boreholes should be drilled and analyses performed at nearby locations, and a systematic groundwater monitoring program should be implemented both downstream of and within the external deposition site.

Keywords: co-deposition, fly ash, leaching tests, lignite, waste rock

Procedia PDF Downloads 238
24355 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. This is particularly evident in highway and railroad tunnel projects, in which a number of tunnels and different professional teams are involved. In this regard, comprehensive software needs to be designed using accepted methods in order to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. To address this necessity, an application has been developed in Microsoft Excel using its macro capabilities and the Visual Basic for Applications (VBA) programming language. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, and the penetration rate, can be calculated and reported in a standard format.
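As an illustration of the kind of calculation such software automates, the sketch below computes a basic Rock Mass Rating (RMR) as the sum of the five standard Bieniawski parameter ratings and maps it to a rock mass class. The interval ratings are hypothetical, and the authors' tool is an Excel/VBA application rather than Python.

```python
def rmr_basic(ratings):
    """Basic RMR: sum of the five Bieniawski parameter ratings (0-100 scale)."""
    keys = ["strength", "rqd", "spacing", "condition", "groundwater"]
    return sum(ratings[k] for k in keys)

def rmr_class(rmr):
    # standard RMR rock mass classes, I (very good rock) to V (very poor rock)
    for limit, label in [(80, "I"), (60, "II"), (40, "III"), (20, "IV")]:
        if rmr > limit:
            return label
    return "V"

# hypothetical ratings for one borehole interval, as the software might tabulate them
ratings = {"strength": 7, "rqd": 13, "spacing": 10, "condition": 20, "groundwater": 10}
rmr = rmr_basic(ratings)
print(rmr, rmr_class(rmr))  # → 60 III
```

A standardized report generator would apply the same lookup to every logged interval, which is exactly the kind of repetitive calculation worth templating.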

Keywords: engineering geology, rock mass classification, rock mechanics, tunnel

Procedia PDF Downloads 81
24354 Intercultural Competence in Teaching Mediation to Students of Legal English

Authors: Paulina Dwuznik

Abstract:

For students of legal English, the skill of mediation is of special importance, as it constitutes part of their everyday work. Developing the skill of mediation requires developing linguistic, communicative, textual, pragmatic, interactive, social, and intercultural competencies. A study conducted at the Open University of the University of Warsaw compared the results of a questionnaire on the mediation tasks legal professionals perform at work with an analysis of the content of different legal English handbooks, with special emphasis on the development of the intercultural competence necessary in interlinguistic mediation. The study found that legal English handbooks focus mainly on the study of terminology, but some of them extend students' intercultural competence in a way that may help them perform the tasks of mediating concepts, texts, and communication. The author of the paper will present the correlation between intercultural competence and mediation skill and give some examples of mediation tasks that may be based on the comparative intercultural content of selected academic legal English handbooks.

Keywords: intercultural competence, legal English, mediation skill, teaching

Procedia PDF Downloads 158
24353 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data

Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini

Abstract:

A considerable part of the rainfall data used in hydrological practice is available in aggregated form over constant time intervals. This can produce undesirable effects, such as the underestimation of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is affecting extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d = ta, the estimate of a single Hd value can be affected by an underestimation error of up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation errors follow an exponential probability density function; 3) every very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations in Central Italy, may overcome this issue; and 5) these relationships should make it possible to improve the Hd estimates and the associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
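A minimal sketch of the underestimation effect on synthetic data: when rainfall is pre-aggregated into fixed blocks of width ta = d, the block maxima are a subset of all sliding windows of length d, so the aggregated estimate of Hd can only fall at or below the true value. The record and durations below are hypothetical, not the Central Italy data used in the study.

```python
import random

def annual_max_depth_sliding(rain, d):
    """True Hd: maximum depth over ANY window of d steps in the continuous record."""
    return max(sum(rain[i:i + d]) for i in range(len(rain) - d + 1))

def annual_max_depth_aggregated(rain, ta):
    """Hd estimated from data pre-aggregated into fixed, non-overlapping blocks of length ta."""
    return max(sum(rain[i:i + ta]) for i in range(0, len(rain) - ta + 1, ta))

random.seed(1)
# synthetic high-resolution record: mostly dry with occasional bursts
rain = [random.random() if random.random() < 0.05 else 0.0 for _ in range(5000)]

d = 60                                               # duration of interest, in record steps
true_hd = annual_max_depth_sliding(rain, d)
coarse_hd = annual_max_depth_aggregated(rain, d)     # worst case: ta = d
ratio = coarse_hd / true_hd
print(f"true Hd = {true_hd:.2f}, aggregated Hd = {coarse_hd:.2f}, ratio = {ratio:.2f}")
```

Repeating this over many synthetic "years" would reproduce the distribution of underestimation errors the abstract describes.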

Keywords: central Italy, extreme events, rainfall data, underestimation errors

Procedia PDF Downloads 191
24352 Objective Evaluation on Medical Image Compression Using Wavelet Transformation

Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah

Abstract:

The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern image-generating techniques is vast. This data might be a problem from a storage point of view or when the data is sent over a network. This paper uses the wavelet transform technique for medical image compression. A MATLAB program was designed to evaluate the storage and transmission time problems for medical images at Sebha Medical Center, Libya. Three different computed tomography images (abdomen, brain, and chest) were selected and compressed using the wavelet transform. An objective evaluation was performed to measure the quality of the compressed images. The results show that the Peak Signal-to-Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is acceptable.
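A rough sketch of the idea, not the paper's MATLAB implementation: a one-level 2D Haar wavelet transform, hard thresholding of small coefficients (the "compression"), reconstruction, and PSNR computed against the original. The image is synthetic noise standing in for a CT slice, and the threshold is an arbitrary illustrative choice.

```python
import numpy as np

def haar2d(x):
    """One-level orthonormal 2D Haar transform (rows, then columns)."""
    def step(a):
        s = (a[..., ::2] + a[..., 1::2]) / np.sqrt(2)   # approximation
        d = (a[..., ::2] - a[..., 1::2]) / np.sqrt(2)   # detail
        return np.concatenate([s, d], axis=-1)
    return step(step(x).swapaxes(0, 1)).swapaxes(0, 1)

def ihaar2d(c):
    """Inverse of haar2d (columns undone first, then rows)."""
    def istep(a):
        n = a.shape[-1] // 2
        s, d = a[..., :n], a[..., n:]
        out = np.empty_like(a)
        out[..., ::2] = (s + d) / np.sqrt(2)
        out[..., 1::2] = (s - d) / np.sqrt(2)
        return out
    return istep(istep(c.swapaxes(0, 1)).swapaxes(0, 1))

rng = np.random.default_rng(0)
img = rng.normal(120.0, 20.0, (64, 64))          # synthetic stand-in for a CT slice

coeffs = haar2d(img)
compressed = np.where(np.abs(coeffs) >= 15.0, coeffs, 0.0)  # zero small detail coefficients
recon = ihaar2d(compressed)

mse = np.mean((img - recon) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)           # 8-bit peak value, as in the paper
print(f"PSNR = {psnr:.2f} dB")
```

Real pipelines use multi-level decompositions and entropy coding of the thresholded coefficients; the PSNR formula itself is the same objective metric the abstract reports.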

Keywords: medical image, MATLAB, image compression, wavelets, objective evaluation

Procedia PDF Downloads 286
24351 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction

Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun

Abstract:

Usability has become a basic product requirement from the consumer's perspective, and a product that fails this requirement ends up not being used. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design, yet the lack of studies on analysis methodologies for qualitative text data in the usability field inhibits the potential of these data for more useful applications. Meanwhile, the possibility of analyzing qualitative text data has grown with the rapid development of data analysis fields such as natural language processing, which helps computers understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the applicability of text-processing algorithms to the analysis of qualitative text data collected from usability activities. The research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with a text-processing algorithm, includes training comment vectors in a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The results show 'volume and music control button' as the usability feature that matches best with the clusters of comment vectors: the centroid comments of one cluster emphasized button positions, while the centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, participants experienced less confusion, and thus the comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, participants experienced interface issues such as the operating methods of functions and confusion between the functions' buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text-processing algorithms for analyzing qualitative text data from usability testing and evaluations.
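A toy sketch of the vectorize-then-cluster step, not the study's actual pipeline: comments become bag-of-words vectors (standing in for learned comment vectors) and are assigned to the nearer of two centroid comments by cosine similarity. The comments are hypothetical paraphrases echoing the two clusters described above.

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words vector; a stand-in for the paper's learned comment vectors."""
    return Counter(text.lower().split())

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u if w in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# hypothetical comments echoing the "position" vs "interface" clusters
comments = [
    "volume button position is hard to reach",
    "music control button position feels awkward",
    "single button interface confuses volume and music functions",
    "combined button interface makes functions confusing",
]
centroids = [vectorize(comments[0]), vectorize(comments[2])]
labels = [max(range(2), key=lambda k: cosine(vectorize(c), centroids[k]))
          for c in comments]
print(labels)  # → [0, 0, 1, 1]
```

The real study used a trained vector space rather than raw word counts, but the clustering-by-similarity logic is the same.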

Keywords: usability, qualitative data, text-processing algorithm, natural language processing

Procedia PDF Downloads 285
24350 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses

Authors: Matthew Baucum

Abstract:

With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g., “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word counts.
An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
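The core operation is simple enough to sketch: a voxel's vector is the normalized sum of the word vectors from studies activating it, and voxel-term association is the cosine similarity to a term vector. The tiny embeddings and "voxel" word lists below are invented for illustration; the study learns its semantic space from over 11,000 study texts.

```python
import numpy as np

# toy word embeddings, stand-ins for vectors learned from the study corpus
emb = {
    "vision": np.array([1.0, 0.1, 0.0]),
    "visual": np.array([0.9, 0.2, 0.0]),
    "reward": np.array([0.0, 1.0, 0.1]),
    "social": np.array([0.1, 0.6, 0.8]),
}

def normalize(v):
    return v / np.linalg.norm(v)

def voxel_vector(words):
    """A voxel's vector: normalized sum of word vectors from studies activating it."""
    return normalize(np.sum([emb[w] for w in words], axis=0))

def cosine(u, v):
    return float(normalize(u) @ normalize(v))

occipital = voxel_vector(["vision", "visual", "vision"])  # hypothetical visual voxel
striatum = voxel_vector(["reward", "reward", "social"])   # hypothetical reward voxel

print(cosine(occipital, emb["vision"]), cosine(striatum, emb["vision"]))
```

Mapping `cosine(voxel, term)` over every voxel in a brain mask yields exactly the continuous voxel-term association maps the abstract describes.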

Keywords: FMRI, machine learning, meta-analysis, text analysis

Procedia PDF Downloads 449
24349 Differentiation between Different Rangeland Sites Using Principal Component Analysis in Semi-Arid Areas of Sudan

Authors: Nancy Ibrahim Abdalla, Abdelaziz Karamalla Gaiballa

Abstract:

Rangelands in semi-arid areas provide a good source of feed for huge numbers of animals and serve environmental, economic, and social purposes; therefore, these areas are considered economically very important for the pastoral sector in Sudan. This paper investigates the means of differentiating between different rangeland sites according to soil type, using principal component analysis to assist in monitoring and assessment. Three rangeland sites were identified in the study area: a flat sandy site, a sand dune site, and a hard clay site. Principal component analysis (PCA) was used to reduce the number of factors needed to distinguish between rangeland sites and to produce a new data set containing the most useful spectral information for satellite image processing. It was performed on selected types of data (two vegetation indices, topographic data, and vegetation surface reflectance within three bands of MODIS data). The PCA indicated a relatively high correspondence between vegetation and soil in the total variance of the data set. The results showed that PCA with the selected variables revealed clear differences between sites, reflected in the variance and eigenvalues, and that it can be used for differentiation between different range sites.
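A compact sketch of PCA on this kind of data, with synthetic pixels (invented means and variables) standing in for the MODIS-derived layers: standardize, eigendecompose the covariance of the standardized data, and read off the variance explained by the leading components.

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic pixels: [NDVI, SAVI, elevation, red, NIR, SWIR] for three site types
# (hypothetical values, one row of means per site)
n = 60
site_means = [[0.40, 0.35, 310, 0.10, 0.40, 0.20],   # flat sandy
              [0.20, 0.18, 330, 0.20, 0.30, 0.30],   # sand dune
              [0.50, 0.45, 300, 0.08, 0.50, 0.15]]   # hard clay
noise = [0.02, 0.02, 5, 0.01, 0.02, 0.01]
X = np.vstack([rng.normal(m, noise, size=(n, 6)) for m in site_means])

# PCA: standardize, then eigendecompose the covariance (= correlation) matrix
Z = (X - X.mean(0)) / X.std(0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z.T))
order = np.argsort(eigvals)[::-1]                  # eigh returns ascending order
explained = eigvals[order] / eigvals.sum()
scores = Z @ eigvecs[:, order[:2]]                 # project onto PC1, PC2
print("variance explained by PC1, PC2:", explained[:2].round(3))
```

Because the three site types differ systematically across the correlated spectral variables, the first component captures most of the between-site variance, which is what makes PCA useful for separating the sites.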

Keywords: principal component analysis, PCA, rangeland sites, semi-arid areas, soil types

Procedia PDF Downloads 187
24348 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones

Authors: Mohamed Abdelkareem

Abstract:

Remote sensing data contribute to predicting prospective areas of water resources. An integration of microwave and multispectral data, along with climatic, hydrologic, and geological data, has been used here. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic, and structural features of Wadi Asyuti, a defunct tributary of the Nile basin in the eastern Sahara. Image transformation of the Sentinel-2 and Landsat-8 data allowed the different varieties of rock units to be characterized. Integration of microwave remotely sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones, which play a crucial role in mapping groundwater prospective zones. Fusing Landsat-8 OLI and ALOS/PALSAR data enhanced structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben that is cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps and offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), as well as revealing infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach. The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive groups depending on their probability of groundwater occurrence, namely very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
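The knowledge-driven overlay step can be sketched as a weighted sum of normalized raster layers, followed by classification into six prospectivity classes. The layers, weights, and quantile-based class breaks below are hypothetical placeholders for the study's expert-assigned weights and actual rasters.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (50, 50)
# hypothetical raster layers, each already rescaled to 0-1 suitability
layers = {
    "rainfall":          rng.random(shape),
    "slope":             rng.random(shape),
    "drainage_density":  rng.random(shape),
    "lineament_density": rng.random(shape),
}
# illustrative knowledge-driven weights (must sum to 1)
weights = {"rainfall": 0.35, "slope": 0.25,
           "drainage_density": 0.20, "lineament_density": 0.20}

gpz = sum(weights[k] * layers[k] for k in layers)   # weighted overlay

# classify into six classes: very low .. excellent (equal-count breaks here)
bins = np.quantile(gpz, [1/6, 2/6, 3/6, 4/6, 5/6])
classes = np.digitize(gpz, bins)                    # values 0..5
print(np.bincount(classes.ravel()))
```

A GIS package performs the same arithmetic per pixel; the essential modeling decisions are the weights and the class breaks, which here are assumptions.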

Keywords: GIS, remote sensing, groundwater, Egypt

Procedia PDF Downloads 98
24347 Intelligent Production Machine

Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan

Abstract:

This study aims for production machines to automatically perceive cutting data and alter cutting parameters. The two most important parameters to be checked in the machine control unit are the feed rate and the speed. These parameters are to be controlled using the sounds of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data are received and converted by MATLAB software into numerical values. According to these values, the feed and speed decrease or increase at a certain rate until the optimum sound is acquired, so that the cutting process is carried out with optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of material being cut, the cutting parameters, and the machine used all affect various process parameters. Instead of measuring quantities such as temperature, vibration, and tool wear that emerge during the cutting process, a detailed analysis of the sound emitted during cutting enables detection of various data involved in the cutting process in a much easier and more economical way. The relation between cutting parameters and sound is being identified.
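The closed-loop idea can be sketched as: extract a sound feature per acquisition window, then nudge the feed rate toward the level associated with the optimum sound. The RMS feature, the gain, the bounds, and the simulated "sound rises with feed" plant model are all assumptions for illustration, not the paper's MATLAB implementation.

```python
import numpy as np

def sound_level(window):
    """RMS amplitude of one acquisition window of the cutting sound."""
    return float(np.sqrt(np.mean(np.square(window))))

def adjust_feed(feed, level, optimum=0.5, gain=0.1, bounds=(0.2, 2.0)):
    """Nudge the feed rate so the measured sound level drifts toward the optimum."""
    feed *= 1 - gain * (level - optimum)
    return min(max(feed, bounds[0]), bounds[1])

rng = np.random.default_rng(3)
feed = 1.0
for _ in range(20):
    # simulated microphone window: sound level rises with feed (hypothetical model)
    window = rng.normal(0.0, 0.4 * feed, 1024)
    feed = adjust_feed(feed, sound_level(window))
print(f"feed rate after 20 control steps: {feed:.2f}")
```

With this toy plant the loop settles toward the feed where 0.4·feed equals the optimum level; a real controller would use spectral features and also regulate the spindle speed.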

Keywords: cutting process, sound processing, intelligent lathe, sound analysis

Procedia PDF Downloads 334
24346 The Effectiveness and Accuracy of the Schulte Holt IOL Toric Calculator Processor in Comparison to Manually Input Data into the Barrett Toric IOL Calculator

Authors: Gabrielle Holt

Abstract:

This paper seeks to demonstrate the efficacy of the Schulte Holt IOL Toric Calculator Processor (Schulte Holt ITCP). The evaluation was completed using manually inputted data in the Barrett Toric Calculator, recording the number of minutes taken to complete the toric calculations, the number of errors identified during completion, and distractions during completion. These results are then compared with the number of minutes taken by the Schulte Holt ITCP to complete the same calculations, also using the Barrett method, as well as the number of errors identified in the Schulte Holt ITCP. The data clearly demonstrate a substantial advantage for the Schulte Holt ITCP, which notably reduces the time spent on toric calculations as well as the number of errors. With the ever-growing number of cataract surgeries taking place around the world and waitlists increasing, the Schulte Holt IOL Toric Calculator Processor may well demonstrate a way to increase the availability of ophthalmologists and ophthalmic staff while maintaining patient safety.

Keywords: Toric, toric lenses, ophthalmology, cataract surgery, toric calculations, Barrett

Procedia PDF Downloads 94
24345 Change Point Detection Using Random Matrix Theory with Application to Frailty in Elderly Individuals

Authors: Malika Kharouf, Aly Chkeir, Khac Tuan Huynh

Abstract:

Detecting change points in time series data is a challenging problem, especially in scenarios where there is limited prior knowledge regarding the data’s distribution and the nature of the transitions. We present a method designed for detecting changes in the covariance structure of high-dimensional time series data, where the number of variables is comparable to the data length. Our objective is to achieve unbiased test statistic estimation under the null hypothesis. We use Random Matrix Theory to analyze the behavior of our test statistic in this high-dimensional context. Specifically, we show that our test statistic converges pointwise to a normal distribution under the null hypothesis. To assess the effectiveness of the proposed approach, we conduct evaluations on a simulated dataset. Furthermore, we employ our method to detect changes indicative of frailty in elderly individuals.

Keywords: change point detection, hypothesis tests, random matrix theory, frailty in elderly

Procedia PDF Downloads 54
24344 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use

Authors: Isaura Esther Solano Núñez, David Suarez

Abstract:

The main purpose of this research is to discover what causes death in children of the Wayuu community and to analyze those results in depth in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons producing early death in this specific population, since these children are the most vulnerable to high-risk environmental conditions. In this way, the government, through the competent authorities, may develop prevention policies and the right measures to avoid an increase in these deaths. The methodology used in this investigation is data mining, which consists of examining large amounts of data to produce new and valuable information. Through this technique, it has been determined that the child population is dying mostly from malnutrition. In short, this technique has been very useful for this study; it has allowed us to transform large amounts of information into a conclusive and important statement, making it easier to take appropriate steps to resolve a particular situation.

Keywords: malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous

Procedia PDF Downloads 159
24343 Detroit Latinx Adolescents Depend on Relationships, Recreation, and Internal Homeostasis to Live their Healthiest Lives

Authors: Jenny Clift, Rebeccah Sokol, LaTricia Mitchell, Nicholas Alexander, Karissa Rusnick

Abstract:

Aims: This study sought to identify prevalent promotive factors supporting urban adolescent health and wellbeing, per adolescent and caregiver reports. Setting: The research team conducted online surveys with adolescent (n=520) and caregiver (n=73) respondents from a predominately Latinx urban high school. Methodology: A cross-sectional, qualitative study. Analysis: Inductive thematic analysis was used to analyze responses to open-ended questions. Findings: Adolescent and caregiver respondents identified promotive factors (eight and six, respectively) that encourage adolescent health and well-being. Supportive relationships were the most frequently reported factor among adolescents (68%) and caregivers (55%). Implications: Health promotion interventions among adolescents should consider how to promote relationships to counteract negative social determinants of health (SDH) and promote optimal quality of life.

Keywords: Latinx adolescents, health and wellbeing, social determinants of health, school

Procedia PDF Downloads 88
24342 Application of the Mobile Phone for Occupational Self-Inspection Program in Small-Scale Industries

Authors: Jia-Sin Li, Ying-Fang Wang, Cheing-Tong Yan

Abstract:

In this study, an integrated approach using Google Spreadsheet and QR codes, both free internet resources, was adopted to improve the inspection procedure. A mobile phone application (app) was also designed to work with a web page to create an automatic checklist, providing a new integrated inspection management information system. Following a client-server model, the client app was developed for the Android mobile OS, and the back end is a web server. The system can set up app accounts, including authorization data, and store checklist documents on the website. A QR code is generated from the checklist document URL and then printed and pasted on the machine. The user scans the QR code with the app and fills in the checklist in the factory. The checklist data are then sent to the server, which not only saves the entered data but also executes the related functions and charts. The system also enables auditors and supervisors to facilitate hazard prevention and response, as well as immediate checks of reported data. Finally, statistics and professional analyses are performed using inspection records and other relevant data, which not only improve the reliability and integrity of inspection operations and equipment loss control but also increase plant safety and personnel performance. It is therefore suggested that the traditional paper-based inspection method could be replaced by the app, promoting industrial safety and reducing human error.
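The QR payload round-trip can be sketched with the standard library alone: build the checklist URL that gets encoded into the QR code pasted on the machine, then recover the checklist document and machine identifiers after a scan. The base URL, document ID, and machine ID are hypothetical, and the actual QR image generation is omitted.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# hypothetical endpoint serving checklist documents published from a spreadsheet
BASE = "https://example.com/checklists"

def checklist_url(doc_id, machine_id):
    """URL embedded in the QR code pasted on the machine."""
    return f"{BASE}/{doc_id}?" + urlencode({"machine": machine_id})

def parse_scanned(url):
    """What the mobile app recovers after scanning the QR code."""
    parts = urlsplit(url)
    doc_id = parts.path.rsplit("/", 1)[-1]
    machine_id = parse_qs(parts.query)["machine"][0]
    return doc_id, machine_id

url = checklist_url("press-brake-daily", "M-017")
doc, machine = parse_scanned(url)
print(doc, machine)  # → press-brake-daily M-017
```

In the described system, the app would then fetch the checklist form for `doc` and submit the filled entries, tagged with `machine`, back to the server for storage and charting.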

Keywords: checklist, Google Spreadsheet, app, self-inspection

Procedia PDF Downloads 118
24341 Industry 4.0 and Supply Chain Integration: Case of Tunisian Industrial Companies

Authors: Rym Ghariani, Ghada Soltane, Younes Boujelbene

Abstract:

Industry 4.0, a set of emerging smart and digital technologies, has been the main focus of operations management researchers and practitioners in recent years. The objective of this research paper is to study the impact of Industry 4.0 on supply chain integration (SCI) in Tunisian industrial companies. A conceptual model was designed to study the relationship between Industry 4.0 technologies and supply chain integration. This model contains three explanatory variables (big data, the Internet of Things, and robotics) and one variable to be explained (supply chain integration). In order to answer our research questions and investigate the research hypotheses, principal component analysis and discriminant analysis were performed using SPSS 26 software. The results reveal a statistically significant positive impact of Industry 4.0 (big data, the Internet of Things, and robotics) on supply chain integration. Interestingly, big data has a greater positive impact on supply chain integration than the Internet of Things and robotics.

Keywords: industry 4.0 (I4.0), big data, internet of things, robotics, supply chain integration

Procedia PDF Downloads 61
24340 A Tagging Algorithm in Augmented Reality for Mobile Device Screens

Authors: Doga Erisik, Ahmet Karaman, Gulfem Alptekin, Ozlem Durmaz Incel

Abstract:

Augmented reality (AR) is a type of virtual reality that aims to duplicate the real-world environment in a computer's video feed. The mobile application built for this project (called SARAS) enables annotating real-world points of interest (POIs) located near the mobile user. In this paper, we introduce a robust and simple algorithm for placing labels in an augmented reality system. The system places the labels of POIs, whose GPS coordinates are given, on the mobile device screen. The proposed algorithm is compared to an existing one in terms of energy consumption and accuracy. The results show that the proposed algorithm gives better results in energy consumption and accuracy while standing still, and acceptably accurate results when driving. The technique benefits AR browsers with its open-access algorithm. Going forward, the algorithm will be improved to react more rapidly to position changes while driving.
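One common way such a placement works, sketched here as an assumption rather than the paper's actual algorithm: compute the bearing from the user to the POI from GPS coordinates, and map the difference between that bearing and the device heading into a horizontal screen position within the camera's field of view.

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user to a POI, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def label_x(poi_bearing, heading, fov=60.0, width=1080):
    """Horizontal screen pixel for a POI label, or None if outside the camera FOV."""
    delta = (poi_bearing - heading + 180) % 360 - 180   # signed angle, -180..180
    if abs(delta) > fov / 2:
        return None                                     # POI not on screen
    return round(width / 2 + delta / (fov / 2) * (width / 2))

b = bearing(41.0, 29.0, 41.0, 29.001)   # POI approximately due east of the user
print(b, label_x(b, heading=90.0))      # near 90 degrees, near screen center
```

The field of view and screen width are hypothetical device parameters; a full system would also use the distance to the POI for vertical placement, scaling, and overlap resolution between labels.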

Keywords: accurate tagging algorithm, augmented reality, localization, location-based AR

Procedia PDF Downloads 374
24339 Analysing Competitive Advantage of IoT and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people’s behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives into the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by the IoT devices is analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today’s marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data to use to differentiate themselves from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company’s competitive advantage through smart city solutions. The results provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and their intelligent use, can create competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: data analytics, smart cities, competitive advantage, internet of things

Procedia PDF Downloads 134
24338 Best Season for Seismic Survey in Zaria Area, Nigeria: Data Quality and Implications

Authors: Ibe O. Stephen, Egwuonwu N. Gabriel

Abstract:

Variations in seismic P-wave velocity and depth resolution resulting from variations in subsurface water saturation were investigated in this study in order to determine the season of the year that gives the most reliable P-wave velocities and depth resolution of the subsurface in the Zaria area, Nigeria. A 2D seismic refraction tomography technique, using an ABEM Terraloc MK6 seismograph, was employed to collect data across a borehole with a standard log, with the centre of the spread situated at the borehole site. Using the same parameters, this procedure was repeated along the same spread at least once a month for at least eight months a year over four years. The timing of each survey depended on when there was a significant variation in the rainfall data. The seismic data collected were tomographically inverted. The results suggest that the average P-wave velocity ranges of the subsurface in the area are generally higher when the ground is wet than when it is dry. They also suggest that an overburden about 9.0 m thick, a weathered basement about 14.0 m thick, and a fractured basement at a depth of about 23.0 m best fit the borehole log. This best fit was consistently obtained in the months between March and May, when the average total rainfall in the area was about 44.8 mm. The results also showed that the velocity ranges in both dry and wet formations fall within the standard ranges reported in the literature. In terms of velocity, this study has not clearly distinguished the quality of the seismic data obtained when the subsurface was dry from that of the data collected when it was wet. It was concluded that, for more detailed and reliable seismic studies in the Zaria area and environs with similar climatic conditions, surveys are best conducted between March and May, when the most reliable seismic data for depth resolution are most likely obtainable.

Keywords: best season, variations in depth resolution, variations in P-wave velocity, variations in subsurface water saturation, Zaria area

Procedia PDF Downloads 290
24337 Study of the Relationship between the Roughness Configuration of Channel Bottom and the Creation of Vortices at the Rough Area: Numerical Modelling

Authors: Youb Said, Fourar Ali

Abstract:

To describe the influence of bottom roughness on free surface flows by numerical modeling, a two-dimensional model was developed. The continuity and momentum (Navier-Stokes) equations are solved by the finite volume method. We considered turbulent flow in an open channel with a rough bottom. The k-ε turbulence model was used for the simulations. After setting the initial and boundary conditions and solving the equation set, we obtained the following results: vortices form in the hollows, causing substantial energy dissipation in the obstacle areas that constitute the bottom roughness. A comparison of our results with experimental ones shows good agreement in the rough area. In other areas, however, the differences were more or less important. These differences occur in regions far from the bottom, especially in the free surface region just after the rough bottom. These disagreements are probably due to the empirical constants used in the k-ε model.

Keywords: modeling, free surface flow, turbulence, bottom roughness, finite volume, K-ε model, energy dissipation

Procedia PDF Downloads 382
24336 Cost Efficient Receiver Tube Technology for Eco-Friendly Concentrated Solar Thermal Applications

Authors: M. Shiva Prasad, S. R. Atchuta, T. Vijayaraghavan, S. Sakthivel

Abstract:

The world is in need of efficient energy conversion technologies that are affordable, accessible, sustainable and eco-friendly. Solar energy is one of the cornerstones of the world’s economic growth because of its abundance and zero carbon pollution. Among the various solar energy conversion technologies, solar thermal technology has attracted substantial renewed interest due to its diversity and compatibility with various applications. Solar thermal systems employ concentrators, tracking systems and heat engines for electricity generation, which leads to higher cost and complexity in comparison with photovoltaics; however, the technology’s distinct thermal energy storage capability and dispatchable electricity make it tremendously attractive. Moreover, employing a cost-effective solar selective receiver tube in a concentrating solar thermal (CST) system improves the energy conversion efficiency and directly reduces the cost of the technology. In addition, the development of solar receiver tubes by low-cost methods that offer high optical properties and corrosion resistance in an open-air atmosphere would be beneficial for low- and medium-temperature applications. In this regard, our work opens up an approach with the potential to achieve cost-effective energy conversion. We have developed a highly selective tandem absorber coating through a facile wet chemical route combining chemical oxidation, sol-gel, and nanoparticle coating methods. The developed tandem absorber coating has a gradient refractive index on stainless steel (SS 304) and exhibited high optical properties (α ≥ 0.95 and ε ≤ 0.14). The first absorber layer (Cr-Mn-Fe oxides) was developed by controlled oxidation of SS 304 in a chemical bath reactor. A second composite layer of ZrO2-SiO2 was applied on the chemically oxidized substrate by sol-gel dip coating to serve as an optical enhancement and corrosion resistance layer. Finally, an antireflective layer (MgF2) was deposited on the second layer to achieve > 95% absorption. The developed tandem layer exhibited good thermal stability up to 250 °C in open-air atmospheric conditions and superior corrosion resistance (it withstands > 200 h in the salt spray test (ASTM B117)). After the successful development of a coating with the targeted properties at laboratory scale, a 1 m prototype tube was demonstrated with excellent uniformity and reproducibility. Moreover, it was validated under standard laboratory test conditions as well as in field conditions, in comparison with a commercial receiver tube. The presented strategy can be widely adapted to develop highly selective coatings for a variety of CST applications ranging from hot water and solar desalination to industrial process heat and power generation. The high-performance, cost-effective medium-temperature receiver tube technology has attracted many industries, and recently the technology has been transferred to Indian industry.

Keywords: concentrated solar thermal system, solar selective coating, tandem absorber, ultralow refractive index

Procedia PDF Downloads 89
24335 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. The algorithm is applied to high-frequency coefficients for compression/encoding. It starts by converting every three coefficients into a single value, accomplished using three different keys. The decoding/decompression uses a search method, the Quick Sequential Search (QSS) decoding algorithm, presented in this research and based on sequential search, to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients, so that another algorithm, such as a conventional sequential search, could retrieve the encoded/compressed data independently of the proposed algorithm. The experimental results showed that the proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
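
As a rough illustration of the scheme described above, the sketch below collapses each triple of coefficients into one value using three keys and recovers it by sequential search. The key values, coefficient range, and function names are assumptions for demonstration only; the paper's actual key-generation method is not reproduced here.

```python
# Hedged sketch of the Matrix Minimization encoding and a QSS-style
# sequential-search decoder. Keys (1, 13, 169) form a balanced base-13
# scheme that is unambiguous for coefficients in [-6, 6]; these values
# are illustrative, not the paper's.

def encode(coeffs, keys):
    """Collapse each group of three coefficients into a single value."""
    assert len(coeffs) % 3 == 0
    return [sum(k * c for k, c in zip(keys, coeffs[i:i + 3]))
            for i in range(0, len(coeffs), 3)]

def qss_decode(value, keys, lo, hi):
    """Sequentially search the (a, b) space, solving for c exactly."""
    for a in range(lo, hi + 1):
        for b in range(lo, hi + 1):
            rem = value - keys[0] * a - keys[1] * b
            if rem % keys[2] == 0 and lo <= rem // keys[2] <= hi:
                return (a, b, rem // keys[2])
    return None  # no triple in range reproduces the value

keys = (1, 13, 169)
encoded = encode([3, -2, 5, 0, 1, -1], keys)
print(encoded)                                 # [822, -156]
print([qss_decode(v, keys, -6, 6) for v in encoded])
# [(3, -2, 5), (0, 1, -1)]
```

With keys chosen this way each triple maps to a unique value, so the search always recovers the exact coefficients, in line with the lossless decoding the abstract describes.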

Keywords: matrix minimization algorithm, decoding sequential search algorithm, image compression, DCT, DWT

Procedia PDF Downloads 150
24334 Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology

Authors: Inas S. Khayal, Weiping Zhou, Jonathan Skinner

Abstract:

Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an urgent call to action to transform the healthcare delivery system. While bioinformatics and biomedical engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care for patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems. Approaches and methods from systems science and systems engineering are utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.

Keywords: health informatics, systems thinking, systems architecture, healthcare delivery system, data analytics

Procedia PDF Downloads 348
24333 Mourning Motivations for Celebrities in Instagram: A Case Study of Mohammadreza Shajarian's Death

Authors: Zahra Afshordi

Abstract:

Instagram, as an everyday-life social network, hosts everything from ultrasound images of unborn fetuses to pictures of newly placed gravestones and funerals. It is a platform that allows its users to create a second identity, independent of and at the same time related to their real-space identity. The motives behind this identification are what this article is about. This article studies the motivations of Instagram users who mourn celebrities, with a focus on the death of MohammadReza Shajarian. Shajarian’s death resonated widely among Persian-speaking Instagram users. The purpose of this qualitative survey is to comprehend and study users’ motivations in posting mourning and memorializing content. The methodology of the essay is a hybrid one, consisting of content analysis and open-ended interviews. The results highlight that users' motives go beyond simple sympathy and include political protest, gaining cultural capital, reaching social status, and escaping from solitude.

Keywords: case study, celebrity, identity, Instagram, mourning, qualitative survey

Procedia PDF Downloads 156
24332 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e., they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Thanks to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and an initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set of highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e., it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as Web of Science or Scopus.
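
The rule-based scoring and single-linkage clustering step can be sketched roughly as follows. The specific rules, weights, and threshold below are invented for illustration (the paper's actual rule set, derived from expert knowledge, is richer), and union-find is used here as one standard way to form the connected components.

```python
# Illustrative sketch: score reference pairs with simple metadata rules
# plus string similarity, then single-linkage cluster via union-find.
from difflib import SequenceMatcher

def score_pair(a, b):
    """Combine rule scores; the rules reinforce each other as in the text."""
    s = 0.0
    if a.get("year") and a.get("year") == b.get("year"):
        s += 0.3                      # matching publication year
    if a.get("volume") and a.get("volume") == b.get("volume"):
        s += 0.2                      # matching volume number
    s += 0.5 * SequenceMatcher(None, a["title"].lower(),
                               b["title"].lower()).ratio()
    return s

def cluster(refs, threshold=0.8):
    """Connect every pair scoring >= threshold (single linkage)."""
    parent = list(range(len(refs)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i in range(len(refs)):
        for j in range(i + 1, len(refs)):
            if score_pair(refs[i], refs[j]) >= threshold:
                parent[find(i)] = find(j)
    groups = {}                             # root -> member indices
    for i in range(len(refs)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

refs = [
    {"title": "Deep learning for patents", "year": "2019", "volume": "12"},
    {"title": "Deep learning for patents.", "year": "2019", "volume": "12"},
    {"title": "Quantum chemistry basics", "year": "2001", "volume": "3"},
]
print(cluster(refs))                  # [[0, 1], [2]]
```

Because the linkage is single-linkage, two references end up in one cluster if any chain of high-scoring pairs connects them, which matches the connected-components formulation in the abstract.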

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 194
24331 Psychological Distress during the COVID-19 Pandemic in Nursing Students: A Mixed-Methods Study

Authors: Mayantoinette F. Watson

Abstract:

During the COVID-19 pandemic, an unprecedented and historically large public health crisis, nursing students are of the utmost concern with regard to their psychological and physical well-being. Questions are emerging and circulating about what will happen to nursing students and about the long-term effects of the pandemic, especially now that hospitals are overwhelmed and in significant need of nursing staff. Expectations, demands, change, and fear of the unknown during this unprecedented time only add to the many stressors that accompany nursing students through laborious clinical and didactic courses in nursing programs. The risk of psychological distress is at a maximum, and its effects can negatively impact not only nursing students but also nursing education and academia. High exposure to interpersonal, economic, and academic demands contributes to major health concerns, including a potential risk of psychological distress. Educational success among nursing students is directly affected by high exposure to anxiety and depression arising from experiences within the program. Working relationships and academic achievement are imperative to positive student outcomes within the nursing program. The purpose of this study is to identify and establish the influences and associations of multilevel factors, including the effects of the COVID-19 pandemic, on psychological distress in nursing students. Neuman’s Systems Model Theory was used to frame nursing students’ responses to internal and external stressors. The research utilized a convergent mixed-methods design. The study population comprised undergraduate nursing students from the Southeastern U.S., surveyed as a convenience sample. The quantitative survey was completed by 202 participants, and 11 participants took part in the qualitative follow-up interviews. 
Participants completed the Kessler Psychological Distress Scale (K6), the Perceived Stress Scale (PSS-4), and the Dundee Ready Education Environment Measure (DREEM-12) to measure psychological distress, perceived stress, and the perceived educational environment. Participants also answered open-ended questions regarding their experience during the COVID-19 pandemic. Statistical tests, including bivariate analyses, multiple linear regression analyses, and binary logistic regression analyses, were performed in an effort to identify and highlight the effects of the independent variables on the dependent variable, psychological distress. Coding and qualitative content analysis were performed to identify overarching themes within participants’ interviews. The quantitative data were sufficient to identify correlations between psychological distress and the multilevel factors of coping, marital status, COVID-19 stress, perceived stress, educational environment, and social support. The qualitative data were sufficient to identify common themes in students’ perceptions during COVID-19, including online learning, workload, finances, experience, breaks, time, the unknown, support, encouragement, feeling unchanged, communication, and transmission. The findings are significant, specifically regarding the factors contributing to nursing students’ psychological distress, and will help to improve learning in the academic environment.

Keywords: nursing education, nursing students, pandemic, psychological distress

Procedia PDF Downloads 87
24330 Expansion and Consolidation of Islam in Iran to the End of Qajar Period

Authors: Ashaq Hussain

Abstract:

Under Islam, for the first time since the Achaemenids, all Iranians, including those of Central Asia and on the frontiers of India, became united under one rule. Islam was rescued from a narrow Bedouin outlook and Bedouin mores primarily by the Iranians, who showed that Islam, both as a religion and, primarily, as a culture, need not be bound solely to the Arabic language and Arab norms of behavior. Instead, Islam was to become a universal religion and culture open to all people. This was a fundamental contribution of the Iranians to Islam, although all Iranians had become Muslims by the time the Saljuq Empire was created. Iran, in a sense, thus provided Islam with a history, albeit an epic one, of pre-Islamic times. After all, the Arabs conquered the entire Sasanian Empire, where they found full-scale imperial models for the management of the new Caliphate, whereas only provinces of the Byzantine Empire were overrun by the Arabs. The present paper attempts to give the reader a detailed account of the introduction, emergence, expansion and spread of Islam in Iran up to the end of the Qajar period, and it is in this context that the subject is analyzed.

Keywords: Islam, Achaemenids, Bedouin, Central Asia, Iran

Procedia PDF Downloads 427
24329 A Human Centered Design of an Exoskeleton Using Multibody Simulation

Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann

Abstract:

Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multibody simulation approach has been enhanced to evaluate the actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. To this end, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data comprise ground reaction forces, electromyography (EMG) data and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, which are compared with values admissible for human physiology. Furthermore, the platform provides feedback for sizing the exoskeleton structure in terms of the resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach to the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.

Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation

Procedia PDF Downloads 162
24328 Implementing Two Rotatable Circular Polarized Glass Made Window to Reduce the Amount of Electricity Usage by Air Condition System

Authors: Imtiaz Sarwar

Abstract:

Air conditioning in homes may account for one-third of electricity consumption during summer periods, when most energy is required in large cities. It not only consumes electricity but also has a serious impact on the environment, including the greenhouse effect. A circular polarizer filter can be used to selectively absorb or pass clockwise or counter-clockwise circularly polarized light. My research is about placing two circular polarized glass panes parallel to each other to make a circular window. When the two panes are aligned (at 0 degrees to each other), nothing unusual is noticed; the window works as a regular window through which all light and heat can pass. As one of the panes is rotated, the angle between the panes increases and the window blocks more and more light. It blocks all of the light, and a portion of the related heat, when one pane reaches 90 degrees to the other. On the other hand, the window can simply be opened when fresh air is needed. This will reduce the need for air conditioning, or consumers will use electric fans rather than air conditioning systems. Thus, we can save a significant amount of electricity and go green.

Keywords: circular polarizer, window, air condition, light, energy

Procedia PDF Downloads 608
24327 Evaluation of the Nursing Management Course in Undergraduate Nursing Programs of State Universities in Turkey

Authors: Oznur Ispir, Oya Celebi Cakiroglu, Esengul Elibol, Emine Ceribas, Gizem Acikgoz, Hande Yesilbas, Merve Tarhan

Abstract:

This study was conducted to evaluate the academic staff teaching the 'Nursing Management' course in the undergraduate nursing programs of the state universities in Turkey and to assess the current content of the course. The study design is descriptive. The population of the study consists of the seventy-eight undergraduate nursing programs at state universities in Turkey. A questionnaire prepared by the researchers was used as the data collection tool. The data were obtained by screening the content of the websites of the nursing education programs between March and May 2016. Descriptive statistics were used to analyze the data. The research indicated that 58% of the undergraduate nursing programs from which data were derived were part of a school of health, 81% of the academic staff had graduated from undergraduate nursing programs, 40% worked as lecturers, and 37% specialized in a field other than nursing. The research also showed that the above-mentioned course was included in 98% of the programs from which it was possible to obtain data. The full name of the course was 'Nursing Management' in 95% of the programs, and 98% stated that the course was compulsory. Theory and application hours averaged 3.13 and 2.91, respectively. Moreover, the content of the course was not shared in 65% of the programs reviewed. This study demonstrated that the experience and expertise of the academic staff teaching the 'Nursing Management' course were not sufficient in the management area, and the schedule and content of the course were insufficient, even though many nursing education programs provided the course. Comparison of the course curricula revealed significant differences.

Keywords: nursing, nursing management, nursing management course, undergraduate program

Procedia PDF Downloads 358