Search results for: three dimensional data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26279

22499 Influence of Free Field Vibrations Due to Vibratory Pile Driving

Authors: Shashank Mukkoti, Mainak Majumder, Srinivasan Venkatraman

Abstract:

Owing to land scarcity in the modern day, most construction activities are carried out close to existing buildings. Most high-rise buildings are constructed on pile foundations to transfer the design loads to a strong stratum below the ground surface. Due to the proximity of the new and existing structures, noise disturbances are prominent during pile installation. Installation of vibratory piles is most suitable in urban areas. The ground vibrations developed due to vibratory pile driving may cause many detrimental effects on the surrounding structures, depending on the proximity of the source and the nature of the structures. In the present study, an attempt has been made to study the severity of ground vibrations induced by vibratory pile driving. For this purpose, a three-dimensional finite element model has been developed in the ABAQUS/Explicit finite element program. The coupled finite/infinite element method has been employed to capture the waves propagating due to pile installation. The geometry of the pile foundation, the frequency of the pile driving, and the length of the pile have been considered for the parametric study. The results show that the vibrations generated due to vibratory pile installation are either very close to or exceed the threshold tolerance limits set by different guidelines.

Keywords: FE model, pile driving, free field vibrations, wave propagation

Procedia PDF Downloads 267
22498 Aerodynamic Design of Three-Dimensional Bellmouth for Low-Speed Open-Circuit Wind Tunnel

Authors: Harshavardhan Reddy, Balaji Subramanian

Abstract:

A systematic parametric study to find the optimum bellmouth profile by relating geometric and performance parameters to satisfy a set of specifications is reported. A careful aerodynamic design of the bellmouth intake is critical to direct the flow with minimal losses and maximal flow uniformity into the honeycomb located inside the settling chamber of an indraft wind tunnel, thus improving the efficiency of the entire unit. Design charts for elliptically profiled bellmouths with two different contraction ratios (9 and 18) and three different test section speeds (25 m/s, 50 m/s, and 75 m/s) are presented. A significant performance improvement - especially in the coefficient of discharge and in the flow angularity and boundary layer thickness at the honeycomb inlet - was observed when an entry corner radius (r/D = 0.08) was added to the bellmouth profile. The nonuniformity at the honeycomb inlet drops by about three times (~1% to 0.3%) when moving from a square to a regular octagonal cross-section. An octagonal cross-sectioned bellmouth intake with L/d = 0.55, D/d = 1.625, and r/D = 0.08 met all four target performance specifications and is proposed as the best choice for a low-speed wind tunnel.

Keywords: bellmouth intake, low-speed wind tunnel, coefficient of discharge, nonuniformity, flow angularity, boundary layer thickness, CFD, aerodynamics

Procedia PDF Downloads 157
22497 An Improved Image Steganography Technique Based on Least Significant Bit Insertion

Authors: Olaiya Folorunsho, Comfort Y. Daramola, Joel N. Ugwu, Lawrence B. Adewole, Olufisayo S. Ekundayo

Abstract:

In today's world, there is a tremendous rise in the usage of the internet due to the fact that almost all communication and information sharing is done over the web. Conversely, there is continuous growth in unauthorized access to confidential data. This has posed a challenge to information security experts, whose major goal is to curtail the menace. One of the approaches to securing the safe delivery of data/information to the rightful destination without any modification is steganography. Steganography is the art of hiding information inside other, innocuous-looking information. This research paper aimed at designing a secured algorithm using an image steganographic technique that makes use of the Least Significant Bit (LSB) algorithm for embedding data into a bitmap image (bmp) in order to enhance security and reliability. In the LSB approach, the basic idea is to replace the LSBs of the pixels of the cover image with the bits of the message to be hidden, without significantly destroying the properties of the cover image. The system was implemented using the C# programming language on the Microsoft .NET framework. The performance of the proposed system was evaluated by conducting a benchmarking test analyzing parameters such as Mean Squared Error (MSE) and Peak Signal to Noise Ratio (PSNR). The result showed that image steganography performed considerably well in securing data hiding and information transmission over networks.
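The bit-replacement idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' C# implementation: it treats the cover image as a raw pixel byte buffer, and the function names are hypothetical.

```python
def embed(pixels: bytes, message: bytes) -> bytearray:
    """Replace the LSB of each pixel byte with one message bit (MSB first).
    Requires len(pixels) >= 8 * len(message)."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = bytearray(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # clear the LSB, set the message bit
    return stego

def extract(stego: bytes, n_bytes: int) -> bytes:
    """Reassemble n_bytes message bytes from the pixel LSBs."""
    out = bytearray()
    for b in range(n_bytes):
        value = 0
        for i in range(8):
            value = (value << 1) | (stego[8 * b + i] & 1)
        out.append(value)
    return bytes(out)
```

Because each pixel byte changes by at most 1, the MSE between cover and stego image is at most 1 per pixel, which is why LSB insertion keeps PSNR high.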

Keywords: steganography, image steganography, least significant bits, bit map image

Procedia PDF Downloads 241
22496 The Effects of Viable Marketing on Sustainable Development

Authors: Gabriela Tutuanu

Abstract:

The undesirable economic, social and environmental impact of the existing development pattern pushes toward the adoption and use of a new development paradigm, that of sustainable development. This paper intends to substantiate how marketing can help sustainable development. It begins with the subjects of sustainable development and sustainable marketing as they are discussed in the literature. Sustainable development is a three-dimensional concept which embeds the economic dimension, the social dimension and the environmental dimension, and which requires the simultaneous pursuit of economic prosperity, social equity and environmental quality. A major challenge in achieving these goals at the business level, and in integrating all three dimensions of sustainability, is sustainable marketing. Sustainable marketing is a relationship marketing that aims at building lasting relationships with the social and natural environment, based on long-term thinking and futurity, and this philosophy supports all three dimensions of sustainability. As marketing solutions that could contribute to sustainable development, we advance the stimulation of sustainable demand, the constant innovation and improvement of sustainable products, the design and use of customized communication, a multichannel distribution network, and the sale of sustainable products and services at fair prices. Their implementation will increase economic, social and environmental sustainability to a large extent in the future if they are supported by political, governmental and legal authorities.

Keywords: sustainable development, sustainable marketing, sustainable demand, sustainable product, credible communication, multi-channel distribution network, fair price

Procedia PDF Downloads 446
22495 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. Long-term averages of these climate variables could possibly change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data of a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as Generalized Extreme Value (GEV), normal, log-normal and log-Pearson. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of individual events. Results of the trend analysis showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
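As a hedged illustration of the joint-probability step, the sketch below evaluates a Gumbel copula (a common choice for positively dependent extremes; the abstract does not state which copula family was fitted) and converts a joint exceedance probability into an "AND" return period. All names are illustrative.

```python
import math

def gumbel_copula_cdf(u: float, v: float, theta: float) -> float:
    """Gumbel copula C(u, v) for marginal probabilities u, v in (0, 1);
    theta >= 1, with theta = 1 reducing to independence (C = u*v)."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-(s ** (1.0 / theta)))

def theta_from_tau(tau: float) -> float:
    """Gumbel dependence parameter from Kendall's tau: theta = 1/(1 - tau)."""
    return 1.0 / (1.0 - tau)

def joint_and_return_period(u: float, v: float, theta: float):
    """P(U > u and V > v) via the survival copula, and the AND-return period."""
    p_exceed = 1.0 - u - v + gumbel_copula_cdf(u, v, theta)
    return p_exceed, 1.0 / p_exceed
```

For positively dependent variables the joint exceedance probability is larger than the product of the marginal exceedances, so treating rainfall and surge independently understates the severity of compound events.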

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 329
22494 Estimation of Source Parameters Using the Source Parameter Imaging Method from Digitised High Resolution Airborne Magnetic Data of a Basement Complex

Authors: O. T. Oluriz, O. D. Akinyemi, J. A. Olowofela, O. A. Idowu, S. A. Ganiyu

Abstract:

This study was carried out using aeromagnetic data, which record variations in the magnitude of the Earth's magnetic field, in order to detect local changes in the properties of the underlying geology. The aeromagnetic data (Sheet No. 261), obtained in 2009, were acquired from the archives of the Nigeria Geological Survey Agency. The study presents the estimation of source parameters within an area of about 3,025 square kilometers in Ibadan and its environs in Oyo State, southwestern Nigeria. The area under study belongs to part of the basement complex in southwestern Nigeria. Estimation of the source parameters of the aeromagnetic data was achieved through the application of the source parameter imaging (SPI) technique, which provides the delineation, depth, dip contact, susceptibility contrast and mineral potentials of magnetic signatures within the region. The depth to the magnetic sources in the area ranges from 0.675 km to 4.48 km. The estimated depth limit to shallow sources is 0.695 km and the depth to deep sources is 4.48 km. The apparent susceptibility values obtained for the entire study area range from 0.005 to 0.01 [SI]. This study has shown that the magnetic susceptibility within the study area is controlled mainly by superparamagnetic minerals.

Keywords: aeromagnetic, basement complex, meta-sediment, precambrian

Procedia PDF Downloads 407
22493 Orientational Pair Correlation Functions Modelling of LiCl·6H2O by the Hybrid Reverse Monte Carlo: Using an Environment Dependence Interaction Potential

Authors: Mohammed Habchi, Sidi Mohammed Mesli, Rafik Benallal, Mohammed Kotbi

Abstract:

On the basis of four partial correlation functions and some geometric constraints obtained from neutron scattering experiments, a Reverse Monte Carlo (RMC) simulation has been performed in the study of the aqueous electrolyte LiCl·6H2O in the glassy state. The obtained three-dimensional model allows computing pair radial and orientational distribution functions in order to explore the structural features of the system. Unrealistic features appeared in some coordination peaks. To remedy this, we use the Hybrid Reverse Monte Carlo (HRMC) method, incorporating an energy constraint in addition to the usual constraints derived from experiments. The energy of the system is calculated using an Environment Dependence Interaction Potential (EDIP). Ion effects are studied by comparing correlations between water molecules in the solution and in pure water at room temperature. Our results show good agreement between experimental and computed partial distribution functions (PDFs) as well as a significant improvement in the orientational distribution curves.
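The acceptance rule that drives an RMC refinement can be sketched as a Metropolis step on the chi-squared misfit between computed and experimental correlation functions. This is a generic single-move sketch with illustrative names, not the authors' code; in HRMC the cost would additionally include the EDIP energy term.

```python
import math
import random

def chi_squared(g_calc, g_exp, sigma=0.05):
    """Misfit between computed and experimental correlation functions."""
    return sum((c - e) ** 2 for c, e in zip(g_calc, g_exp)) / sigma ** 2

def rmc_step(config, g_exp, model, rng, step=0.1):
    """One RMC move: perturb a random coordinate; accept if chi^2 does not
    worsen, otherwise accept with probability exp(-d_chi2 / 2).
    In HRMC the cost would be chi^2 + E/kT; only the chi^2 term is used here."""
    i = rng.randrange(len(config))
    old = config[i]
    chi_old = chi_squared(model(config), g_exp)
    config[i] = old + rng.uniform(-step, step)
    chi_new = chi_squared(model(config), g_exp)
    d = chi_new - chi_old
    if d > 0 and rng.random() >= math.exp(-d / 2):
        config[i] = old  # reject the move
        return chi_old
    return chi_new
```

Iterating this step drives the model configuration toward agreement with the experimental curves while the occasional uphill acceptance keeps it from locking into unphysical local minima.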

Keywords: LiCl·6H2O, glassy state, RMC, HRMC

Procedia PDF Downloads 440
22492 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey's measure and the Hurst exponent. After computing these measures, the software plots the graphs for each measure. Besides computing the three measures, the software can also classify whether or not a signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures. A sliding window is selected with a length equal to 10% of the total number of data entries. This sliding window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of this software, a set of EEG signals was given as input and the results were computed and plotted. This software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of seizure can be predicted by noticing sudden changes in the plot.
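Of the three measures, the Hurst exponent is the easiest to reproduce. The sketch below estimates it by the classical rescaled-range (R/S) method; this is an illustrative reimplementation, not FRATSAN's code, and the paper's 10% sliding window is replaced by fixed chunk sizes for brevity.

```python
import math
import random

def rescaled_range(x):
    """R/S statistic of one window: range of cumulative mean-deviations
    divided by the standard deviation."""
    n = len(x)
    mean = sum(x) / n
    cum = mn = mx = 0.0
    s2 = 0.0
    for v in x:
        cum += v - mean
        mn = min(mn, cum)
        mx = max(mx, cum)
        s2 += (v - mean) ** 2
    s = math.sqrt(s2 / n)
    return (mx - mn) / s if s > 0 else 0.0

def hurst_exponent(x, sizes=(8, 16, 32, 64, 128)):
    """Slope of log(mean R/S) versus log(window size): ~0.5 for white
    noise, approaching 1 for strongly persistent series."""
    pts = []
    for n in sizes:
        rs = [rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        rs = [r for r in rs if r > 0]
        if rs:
            pts.append((math.log(n), math.log(sum(rs) / len(rs))))
    m = len(pts)
    sx = sum(p[0] for p in pts)
    sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts)
    sxy = sum(p[0] * p[1] for p in pts)
    return (m * sxy - sx * sy) / (m * sxx - sx ** 2)  # least-squares slope
```

Applied in a sliding window over an EEG trace, sudden shifts in this estimate are the kind of change the abstract suggests could flag seizure onset.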

Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey's measure

Procedia PDF Downloads 430
22491 Auditory and Visual Perceptual Category Learning in Adults with ADHD: Implications for Learning Systems and Domain-General Factors

Authors: Yafit Gabay

Abstract:

Attention deficit hyperactivity disorder (ADHD) has been associated with suboptimal functioning in both the striatum and the prefrontal cortex. Such abnormalities may impede the acquisition of perceptual categories, which are important for fundamental abilities such as object recognition and speech perception. Indeed, prior research has supported this possibility, demonstrating that children with ADHD show visual category learning performance similar to that of their neurotypical peers but use suboptimal learning strategies. However, much less is known about category learning processes in the auditory domain, or among adults with ADHD, in whom prefrontal functions are more mature than in children. Here, we investigated auditory and visual perceptual category learning in adults with ADHD and neurotypical individuals. Specifically, we examined learning of rule-based categories - presumed to be optimally learned by frontal cortex-mediated hypothesis testing - and information-integration categories - hypothesized to be optimally learned by a striatally mediated reinforcement learning system. Consistent with the striatal and prefrontal cortical impairments observed in ADHD, our results show that, across sensory modalities, both rule-based and information-integration category learning is impaired in adults with ADHD. Computational modeling analyses revealed that individuals with ADHD were slower to shift to optimal strategies than neurotypicals, regardless of category type or modality. Taken together, these results suggest that both explicit, frontally mediated and implicit, striatally mediated category learning are impaired in young adults with ADHD, that these impairments extend across sensory modalities, and that they likely arise from domain-general mechanisms.

Keywords: ADHD, category learning, modality, computational modeling

Procedia PDF Downloads 14
22490 Failure Analysis of Laminated Veneer Bamboo Dowel Connections

Authors: Niloufar Khoshbakht, Peggi L. Clouston, Sanjay R. Arwade, Alexander C. Schreyer

Abstract:

Laminated veneer bamboo (LVB) is a structural engineered composite made from glued layers of bamboo. A relatively new building product, LVB is currently employed in similar sizes and applications as dimensional lumber. This study describes the results of a 3D elastic finite element model of half-hole specimens loaded in compression parallel to grain per ASTM D5764. The model simulates LVB fracture initiation due to shear stresses in the dowel joint and predicts displacement at failure, validated through comparison with experimental results. The material fails at 1 mm displacement due to in-plane shear stresses. The paper clarifies the complex interactive state of in-plane shear, tension-perpendicular-to-grain, and compression-parallel-to-grain stresses that form different distributions in the critical zone beneath the bolt hole of half-hole specimens. These findings are instrumental in understanding the key factors and fundamental failure mechanisms of LVB dowel connections, helping to devise safe standards and to further LVB product adoption and design.

Keywords: composite, dowel connection, embedment strength, failure behavior, finite element analysis, Moso bamboo

Procedia PDF Downloads 245
22489 An Investigation of Differential Item and Test Functioning of Scholastic Aptitude Test 2011 (SWUSAT 2011)

Authors: Ruangdech Sirikit

Abstract:

The purposes of this study were to analyze the differential item functioning and differential test functioning of the SWUSAT aptitude test, classified by sex. The data used in this research are secondary data from the Srinakharinwirot University Scholastic Aptitude Test 2011 (SWUSAT 2011). The SWUSAT test consists of four subjects: a verbal ability test, a number ability test, a reasoning ability test and a spatial ability test. The data analysis was carried out in two steps. The first step was the analysis of descriptive statistics. In the second step, differential item functioning (DIF) and differential test functioning (DTF) were analyzed using the DIFAS program. The research results were as follows. Across all 10 tests in 2011, sex was a characteristic for which DIF was found in every test. The percentage of items exhibiting DIF ranged from 10% to 46.67%. In four tests, most DIF items favored the female group; in three tests, most favored the male group; and in three tests, the number of items favoring the female group equaled the number favoring the male group. For differential test functioning (DTF), eight tests showed small DIF effect variance.

Keywords: differential item functioning, differential test functioning, SWUSAT, aptitude test

Procedia PDF Downloads 582
22488 Privacy Preservation Concerns and Information Disclosure on Social Networks: An Ongoing Research

Authors: Aria Teimourzadeh, Marc Favier, Samaneh Kakavand

Abstract:

The emergence of social networks has revolutionized the exchange of information. Every behavior on these platforms contributes to the generation of data, known as social network data, that are processed, stored and published by the social network service providers. Hence, it is vital to investigate the role of these platforms in handling user data by considering privacy measures, especially when we observe the increased number of individuals and organizations engaging with the current virtual platforms without being aware that the data related to their positioning, connections and behavior are uncovered and used by third parties. Performing analytics on social network datasets may result in the disclosure of confidential information about the individuals or organizations who are members of these virtual environments. Analyzing separate datasets can reveal private information about relationships, interests and more, especially when the datasets are analyzed jointly. Intentional breaches of privacy are the result of such analysis. Addressing these privacy concerns requires an understanding of the nature of the data being accumulated and of the relevant data privacy regulations, as well as of the motivations for disclosure of personal information on social network platforms. Some significant points about how users' online information is shaped by the influence of social factors, and about the extent to which users are concerned about future use of their personal information by organizations, are highlighted in this paper. Firstly, this research presents a short literature review of the structure of a network and the concept of privacy in online social networks. Secondly, the factors of user behavior related to privacy protection and self-disclosure in these virtual communities are presented. In other words, we seek to demonstrate the impact of the identified variables on user information disclosure, which could be taken into account to explain the privacy preservation of individuals on social networking platforms. Thirdly, a few research directions are discussed to guide new researchers in addressing this topic.

Keywords: information disclosure, privacy measures, privacy preservation, social network analysis, user experience

Procedia PDF Downloads 255
22487 The Current Status of Middle Class Internet Use in China: An Analysis Based on the Chinese General Social Survey 2015 Data and Semi-Structured Investigation

Authors: Abigail Qian Zhou

Abstract:

In today's China, the well-educated middle class, with stable jobs and above-average income, is the driving force behind its internet society. Through the analysis of data from the 2015 Chinese General Social Survey and of 50 interviews, this study investigates the current state of this group's internet usage. The findings demonstrate that daily life among the members of this socioeconomic group is closely tied to the Internet. For the Chinese middle class, the Internet is used to socialize and to entertain themselves and others. It is also used to search for and share information, as well as to build their identities. The empirical results of this study will provide a reference, supported by factual data, for enterprises seeking to target the Chinese middle class through online marketing efforts.

Keywords: middle class, Internet use, network behaviour, online marketing, China

Procedia PDF Downloads 91
22486 Nowcasting Indonesian Economy

Authors: Ferry Kurniawan

Abstract:

In this paper, we nowcast quarterly output growth in Indonesia by exploiting higher-frequency monthly indicators within a mixed-frequency factor model that combines quarterly and monthly data. Nowcasting quarterly GDP in Indonesia is particularly relevant for the central bank of Indonesia, which sets the policy rate at the monthly Board of Governors Meeting, one of whose important steps is the assessment of the current state of the economy. Thus, having an accurate and up-to-date quarterly GDP nowcast every time new monthly information becomes available would clearly be of interest to the central bank of Indonesia; for example, the initial assessment of the current state of the economy - including the nowcast - is used as input for longer-term forecasts. We consider a small-scale mixed-frequency factor model to produce nowcasts. In particular, we specify variables as year-on-year growth rates; thus the relation between quarterly and monthly data is expressed in year-on-year growth rates. To assess the performance of the model, we compare the nowcasts with two other approaches: an autoregressive model - often a difficult benchmark to beat when forecasting output growth - and Mixed Data Sampling (MIDAS) regression. In particular, both the mixed-frequency factor model and MIDAS nowcasts are produced by exploiting the same set of monthly indicators; hence, we compare the nowcast performance of the two approaches directly. To preview the results, we find that by exploiting monthly indicators using the mixed-frequency factor model and MIDAS regression, we improve nowcast accuracy over a benchmark simple autoregressive model that uses only quarterly data. However, it is not clear whether the MIDAS or the mixed-frequency factor model is better. Neither set of nowcasts encompasses the other, suggesting that both are valuable in nowcasting GDP but neither is sufficient. By combining the two individual nowcasts, we find that the nowcast combination not only increases accuracy relative to the individual nowcasts but also lowers the risk of the worst performance of the individual nowcasts.
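The combination step can be illustrated with a simple inverse-MSE weighting scheme, one common way to combine forecasts; the abstract does not specify the weights actually used, so the names and scheme below are illustrative.

```python
def inverse_mse_weights(errors_a, errors_b):
    """Combination weights proportional to 1/MSE of each model's past
    nowcast errors; the more accurate model gets the larger weight."""
    mse_a = sum(e * e for e in errors_a) / len(errors_a)
    mse_b = sum(e * e for e in errors_b) / len(errors_b)
    wa = (1.0 / mse_a) / (1.0 / mse_a + 1.0 / mse_b)
    return wa, 1.0 - wa

def combine(nowcast_a: float, nowcast_b: float, wa: float, wb: float) -> float:
    """Weighted-average combined nowcast."""
    return wa * nowcast_a + wb * nowcast_b
```

Because the combined nowcast averages two imperfectly correlated errors, it hedges against the scenario in which one model performs very poorly in a given quarter, which is the risk-reduction effect the abstract reports.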

Keywords: nowcasting, mixed-frequency data, factor model, nowcasts combination

Procedia PDF Downloads 310
22485 A Secure System for Handling Information from Heterogeneous Sources

Authors: Shoohira Aftab, Hammad Afzal

Abstract:

Information integration is a well-known procedure for providing a consolidated view of sets of heterogeneous information sources. It not only enables better statistical analysis of information but also allows users to query without any knowledge of the underlying heterogeneous information sources. The problem of providing a consolidated view of information can be handled using semantic data (information stored in such a way that it is understandable by machines and integrable without manual human intervention). However, integrating information using semantic web technology without enforcing any access management results in increased privacy and confidentiality concerns. In this research, we have designed and developed a framework that allows information from heterogeneous formats to be consolidated, thus resolving the issue of interoperability. We have also devised an access control system for defining explicit privacy constraints. We designed and applied our framework to both semantic and non-semantic data from heterogeneous sources. Our approach is validated using scenario-based testing.

Keywords: information integration, semantic data, interoperability, security, access control system

Procedia PDF Downloads 321
22484 Refractive Index, Excess Molar Volume and Viscometric Study of Binary Liquid Mixture of Morpholine with Cumene at 298.15 K, 303.15 K, and 308.15 K

Authors: B. K. Gill, Himani Sharma, V. K. Rattan

Abstract:

Experimental data for the refractive index, excess molar volume and viscosity of the binary mixture of morpholine with cumene over the whole composition range at 298.15 K, 303.15 K and 308.15 K and normal atmospheric pressure have been measured. The experimental data were used to compute the density, deviation in molar refraction, deviation in viscosity and excess Gibbs free energy of activation as functions of composition. The experimental viscosity data have been correlated with empirical equations such as the Grunberg-Nissan equation, the Herric correlation and the three-body McAllister equation. The excess thermodynamic properties were fitted to the Redlich-Kister polynomial equation. The variation of these properties with the composition and temperature of the binary mixtures is discussed in terms of intermolecular interactions.
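The Redlich-Kister fit mentioned above expands an excess property as Y^E = x1*x2*sum_k A_k*(x1 - x2)^k. A minimal two-coefficient least-squares fit can be sketched as follows; the function names are illustrative and the paper's actual fit may use more coefficients.

```python
def rk_value(x1: float, coeffs) -> float:
    """Redlich-Kister excess property at mole fraction x1."""
    x2 = 1.0 - x1
    return x1 * x2 * sum(a * (x1 - x2) ** k for k, a in enumerate(coeffs))

def rk_fit2(xs, ys):
    """Least-squares fit of a two-term expansion (A0, A1) via the
    2x2 normal equations; basis columns are x1*x2 and x1*x2*(x1-x2)."""
    c0 = [x * (1 - x) for x in xs]
    c1 = [x * (1 - x) * (2 * x - 1) for x in xs]
    s00 = sum(a * a for a in c0)
    s01 = sum(a * b for a, b in zip(c0, c1))
    s11 = sum(b * b for b in c1)
    t0 = sum(a * y for a, y in zip(c0, ys))
    t1 = sum(b * y for b, y in zip(c1, ys))
    det = s00 * s11 - s01 * s01
    return (t0 * s11 - t1 * s01) / det, (t1 * s00 - t0 * s01) / det
```

The x1*x2 prefactor forces the excess property to vanish at both pure-component limits, which is why this polynomial is the standard choice for binary-mixture excess data.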

Keywords: cumene, excess Gibbs free energy, excess molar volume, morpholine

Procedia PDF Downloads 307
22483 Anthropometric Data Variation within Gari-Frying Population

Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe

Abstract:

The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements within the population for which data are collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would be designed to fit the gari-frying population in the southwestern states of Nigeria, comprising Lagos, Ogun, Oyo, Osun, Ondo and Ekiti. Twenty-seven body dimensions were measured on 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value and percentiles (2nd, 5th, 25th, 50th, 75th, 95th and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those from other populations in the literature. The correlation between a worker's age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing) and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m and 0.67 m respectively. Results also show a high correlation with other populations and a statistically significant difference in the variability of data within the population in all the body dimensions measured. With a mean age of 42.36 years, the results show that age would be a poor indicator for estimating the anthropometry of this population.
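The percentile computation underlying such design tables can be reproduced directly. The sketch below uses one common linear-interpolation definition; SPSS's default percentile formula differs slightly, so treat this as an illustration rather than a replication.

```python
def percentile(data, p: float) -> float:
    """p-th percentile (0-100) by linear interpolation between order
    statistics, as used for design-to-fit dimensions such as the
    5th and 95th percentiles."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100.0   # fractional rank
    f = int(k)
    if f == len(s) - 1:
        return float(s[-1])
    return s[f] + (k - f) * (s[f + 1] - s[f])
```

In design-to-fit practice, a clearance dimension (e.g., workstation height) is sized to the 95th or 98th percentile while a reach dimension is sized to the 2nd or 5th, so that the extremes of the population are accommodated.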

Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design

Procedia PDF Downloads 232
22482 Discovering Event Outliers for Drugs as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs - commercial products - are not available in pharmacies due to shortage. A shortage event disbalances sales and requires a recovery period that is too long. A critical issue, therefore, is that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest predicting average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though there are some outlier detection methods for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment. In the paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, Grubbs' test has been used to detect outliers in cancer drug data, with positive results reported. Grubbs' test detects outliers which exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test statistic represents the difference between the most extreme data point and the sample mean in units of the standard deviation. The test detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on the approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs' test method. The suggested framework is applicable during shortage events and recovery periods. The proposed framework has practical value and could be used to minimize the recovery period required after the occurrence of a shortage event.

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 113
22481 CRLH and SRR Based Microwave Filter Design Useful for Communication Applications

Authors: Subal Kar, Amitesh Kumar, A. Majumder, S. K. Ghosh, S. Saha, S. S. Sikdar, T. K. Saha

Abstract:

CRLH (composite right/left-handed) and SRR (split-ring resonator) based filters have been designed at microwave frequency that can provide better performance than a conventional edge-coupled band-pass filter designed around the same frequency, 2.45 GHz. Both the CRLH cell and the SRR are unit cells used in metamaterial design. The primary aim of designing filters with such structures is to realize size reduction and novel filter performance. The CRLH-based filter has been designed in microstrip transmission line, while the SRR-based filter is designed with SRR loading in a waveguide. The CRLH-based filter designed at 2.45 GHz provides an insertion loss of 1.6 dB with harmonic suppression up to 10 GHz and a 67% size reduction when compared with a conventional edge-coupled band-pass filter designed around the same frequency. A one-dimensional (1-D) SRR matrix loaded in a waveguide shows the possibility of realizing a stop-band with sharp skirts within the pass-band of a normal rectangular waveguide by tailoring the dimensions of the SRR unit cells. Such filters are expected to be very useful for communication systems at microwave frequency.

Keywords: BPF, CRLH, harmonic, metamaterial, SRR, waveguide

Procedia PDF Downloads 406
22480 Prevalence and Risk Factors of Faecal Carriage Fluoroquinolone-Resistant Escherichia coli among Hospitalized Patients in Ado-Ekiti, Nigeria

Authors: C. A. Ologunde

Abstract:

Escherichia coli is a major microorganism associated with, and isolated from, faecal samples of both adults and children all over the world. Strains of this organism are resistant to cephalosporins and fluoroquinolone (FQ) antimicrobial agents among hospitalized patients; FQs are the most frequently prescribed antimicrobial class in hospitals, and the level of resistance of E. coli to these agents is a risk factor that should be assessed. Hence, this study was conducted to determine the prevalence of, and risk factors for, colonization with fluoroquinolone (FQ)-resistant E. coli in hospitalized patients in Ado-Ekiti. Rectal swabs were obtained from patients in hospitals in the study area, and FQ-resistant E. coli were isolated and identified by means of nalidixic acid multi-disks and a one-step screening procedure. Species identification and FQ resistance were confirmed by automated testing (Vitek, bioMerieux, USA). Individual colonies were subjected to pulsed-field gel electrophoresis (PFGE) to determine macro-restriction polymorphism after digestion of chromosomal DNA. FQ-resistant E. coli was detected in the stool samples of 37 (62%) hospitalized patients. In multivariable analyses, the use of FQ before hospitalization was the only independent risk factor for FQ-resistant E. coli carriage and was consistent across FQ exposures over the 3-12 months of the study. Pulsed-field gel electrophoresis of FQ-resistant E. coli identified clonal spread of one strain among 18 patients. Loss (9 patients) or acquisition (10 patients) of FQ-resistant E. coli was documented and was associated with de novo colonization with genetically distinct strains. It was concluded that FQ-resistant E. coli carriage was associated with clonal spread.
The differential effects of individual fluoroquinolones on antimicrobial drug resistance are an important area for future study, as hospitals adjust their formularies with regard to the use of individual fluoroquinolones, often for economic reasons.

Keywords: E. coli, fluoroquinolone, risk factors, faecal carriage, hospitalized patients, Ado-Ekiti

Procedia PDF Downloads 207
22479 The Development of Research Based Model to Enhance Critical Thinking, Cognitive Skills and Culture and Local Wisdom Knowledge of Undergraduate Students

Authors: Nithipattara Balsiri

Abstract:

The purpose of this research was to develop an instructional model using research-based learning to enhance critical thinking, cognitive skills, and culture and local wisdom knowledge of undergraduate students. The sample consisted of 307 undergraduate students. Critical thinking and cognitive skills tests were employed for data collection. Second-order confirmatory factor analysis, t-test, and one-way analysis of variance were employed for data analysis using the SPSS and LISREL programs. The major research results were as follows: 1) the instructional model using research-based learning to enhance critical thinking, cognitive skills, and culture and local wisdom knowledge should consist of six sequential steps, namely (1) setting the research problem, (2) setting the research hypothesis, (3) data collection, (4) data analysis, (5) drawing the research conclusion, and (6) applying the results to problem solving; and 2) after the treatment, undergraduate students obtained higher scores in critical thinking and cognitive skills than before the treatment at the 0.05 level of significance.
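The pre/post comparison in result 2) rests on a paired-samples t statistic, t = d̄ / (s_d / √n) with n − 1 degrees of freedom, computed over per-student score differences. A minimal sketch (with made-up scores, not the study's data) is:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic for pre/post scores:
    t = mean(diff) / (stdev(diff) / sqrt(n)), df = n - 1.
    The computed t is then compared with the critical value for
    the chosen significance level (0.05 in the study)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1
```

For five hypothetical students scoring (12, 14, 11, 13, 15) before and (15, 16, 14, 15, 18) after the treatment, t ≈ 10.61 with df = 4, well above the two-tailed critical value of 2.776 at the 0.05 level.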

Keywords: critical thinking, cognitive skills, culture and local wisdom knowledge

Procedia PDF Downloads 334
22478 A Case Study of Control of Blast-Induced Ground Vibration on Adjacent Structures

Authors: H. Mahdavinezhad, M. Labbaf, H. R. Tavakoli

Abstract:

In recent decades, the study and control of the destructive effects of blast-induced vibration in construction projects has received more attention, and several empirical equations for vibration prediction, as well as allowable vibration limits for various structures, have been presented. Researchers have developed a number of empirical equations to estimate the peak particle velocity (PPV), in which the empirical constants must be obtained at the site of the explosion by fitting data from experimental blasts. In this study, the most important of these equations were evaluated for the strong massive conglomerates around Dez Dam by collecting blast data, including 30 particle velocities, 27 displacements, 27 vibration frequencies, and 27 ground-vibration accelerations at different distances, recorded for two types of detonation systems, NONEL and electric. Analysis showed that the blast data had the best correlation with the cube root of the explosive charge (R² = 0.8636), but overall the correlation coefficients are not much different. To estimate the vibration in this project, regressions were also performed in other forms, which resulted in a new equation with a correlation coefficient of R² = 0.904. Finally, given the importance of the studied structures, and in order to ensure no damage to adjacent structures, a range of application was defined for each curve: for distances of 0 to 70 m from the blast site, the exponent n = 0.33 was suggested, and for distances of more than 70 m, n = 0.66.
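The cube-root fit mentioned above follows the standard scaled-distance attenuation law, PPV = K · (R / Q^(1/3))^(−n), which becomes linear after taking logarithms and can be fitted by ordinary least squares. The sketch below is illustrative only (the function names and data are not the project's records):

```python
from math import log, exp

def fit_ppv(records, root=3):
    """Fit PPV = K * (R / Q**(1/root))**(-n) by least squares on logs.
    records: iterable of (distance_m, charge_kg, ppv_mm_s) tuples.
    root=3 gives cube-root scaling, which correlated best in the study;
    root=2 would give the square-root (charge-weight) alternative."""
    xs = [log(r / q ** (1.0 / root)) for r, q, _ in records]
    ys = [log(v) for _, _, v in records]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx              # slope of log PPV vs log SD = -n
    K = exp(my - slope * mx)       # intercept back-transformed
    return K, -slope

def predict_ppv(K, n, distance, charge, root=3):
    """Evaluate the fitted attenuation law at a given distance/charge."""
    return K * (distance / charge ** (1.0 / root)) ** (-n)
```

On synthetic data generated from K = 100 and n = 1.5 the fit recovers both constants exactly, confirming the log-log formulation.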

Keywords: blasting, blast-induced vibration, empirical equations, PPV, tunnel

Procedia PDF Downloads 101
22477 Development of a System for Fitting Clothes and Accessories Using Augmented Reality

Authors: Dinmukhamed T., Vassiliy S.

Abstract:

This article proposes fitting clothes and accessories based on augmented reality. A logical data model has been developed that takes into account a decision-making module (color, style, type, material, popularity, etc.) based on personal data (age, gender, weight, height, leg size, waist length, geolocation, photogrammetry, number of purchases of certain types of clothing, etc.) and statistical data from the purchase history (number of items, price, size, color, style, etc.). In order to deliver this information to the user, an augmented reality system using a QR code is also planned. This system for selecting and fitting clothing and accessories based on augmented reality will be used in stores to reduce the time a buyer needs to decide on a choice of clothes.
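A minimal sketch of how such a logical data model and decision-making module might be structured is given below; all field names and the scoring rule are hypothetical illustrations, not the authors' schema.

```python
from dataclasses import dataclass, field

@dataclass
class GarmentItem:
    # Attributes feeding the decision-making module (illustrative).
    sku: str
    style: str
    color: str
    material: str
    popularity: float            # e.g. a normalized sales rank

@dataclass
class UserProfile:
    # Personal data from the paper's list (subset, names assumed).
    age: int
    gender: str
    height_cm: float
    weight_kg: float
    foot_size_eu: float
    purchase_history: list = field(default_factory=list)  # GarmentItems

def recommend(profile, catalog, top_n=3):
    """Toy decision rule: prefer popular items in styles the user
    has bought before. A real module would weigh many more factors."""
    bought_styles = {g.style for g in profile.purchase_history}
    def score(item):
        return item.popularity + (1.0 if item.style in bought_styles else 0.0)
    return sorted(catalog, key=score, reverse=True)[:top_n]
```

The AR front end would then render the top-ranked items on the user, with the QR code linking a physical garment to its catalog record.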

Keywords: augmented reality, online store, decision-making module, QR code, clothing store, queue

Procedia PDF Downloads 124
22476 On the Paradigm Shift of the Overall Urban Design in China

Authors: Gaoyuan Wang, Tian Chen, Junnan Liu

Abstract:

Facing a period of major change rarely seen in a century, China has formulated its 14th Five-Year Plan and placed emphasis on promoting high-quality development. In this context, overall urban design has become a crucial and systematic tool for high-quality urban development. However, there are bottlenecks in the definition of the nature, the scope of content, and the transmission mechanisms of the current overall urban design in China. The paper interprets the emerging demands of the 14th Five-Year Plan on urban design in terms of a new value (quality priority), a new dynamic (space performance), a new target (regional coordination), and a new path (refined governance). Based on these new trends and appeals, multi-dimensional modes of thinking integrated with the major tasks of urban design are proposed accordingly: biomass thinking about ecological, production, and living elements; strategic thinking about the spatial structure; systematic thinking about the cityscape; low-carbon thinking about urban form; governance thinking about public space; and user thinking about design implementation. The paper explores the possibility of transforming the value thinking and technical system of urban design in China and provides a breakthrough path for the urban planning and design industry to better respond to the propositions of the country’s 14th Five-Year Plan.

Keywords: China’s 14th five-year plan, overall urban design, urban design thinking, transformation of urban design

Procedia PDF Downloads 232
22475 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate some areas that were previously very difficult to automate. The paper describes the introduction of generative AI into introductory computer and data science courses and analyzes the effect of this introduction. Generative AI is incorporated into the educational process in two ways. For instructors, we create prompt templates for generating tasks and for grading students' work, including feedback on submitted assignments. For students, we introduce basic prompt engineering, which is in turn used to generate test cases from problem descriptions, to generate code snippets for single-block programming problems, and to partition problems of average complexity into such blocks. The classes are run using large language models, and feedback from instructors and students, together with course outcomes, is collected. The analysis shows a statistically significant positive effect and a preference from both groups of stakeholders.
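An instructor-side prompt template of the kind described might look as follows; the wording and names are purely illustrative, since the paper's actual prompts are not reproduced here.

```python
from string import Template

# Hypothetical template for the test-case-generation task the
# students are taught; $course, $problem and $n are filled per use.
TESTCASE_PROMPT = Template(
    "You are a teaching assistant for an introductory $course course.\n"
    "Given the problem description below, produce $n unit-test cases\n"
    "as (input, expected_output) pairs, covering edge cases.\n"
    "Problem: $problem\n"
)

def build_prompt(course, problem, n=5):
    """Render a concrete prompt to send to the LLM."""
    return TESTCASE_PROMPT.substitute(course=course, problem=problem, n=n)
```

The same pattern (a fixed instructional frame plus task-specific slots) extends naturally to the grading and feedback prompts used on the instructor side.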

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMs to computer and data science education

Procedia PDF Downloads 34
22474 Blockchain for Transport: Performance Simulations of Blockchain Network for Emission Monitoring Scenario

Authors: Dermot O'Brien, Vasileios Christaras, Georgios Fontaras, Igor Nai Fovino, Ioannis Kounelis

Abstract:

With the rise of the Internet of Things (IoT), 5G, and blockchain (BC) technologies, vehicles are becoming ever more connected and are already transmitting substantial amounts of data to the original equipment manufacturers' (OEMs') servers. This data could be used to help detect mileage fraud and enable more accurate vehicle emissions monitoring. This would not only help regulators but could also enable applications such as permitting efficient drivers to pay less tax, geofencing for air quality improvement, and pollution tolling and trading platforms for transport-related businesses and EU citizens. Other applications could include traffic management and shared mobility systems. BC enables the transmission of data with additional security and removes single points of failure while maintaining data provenance and identity ownership, with the possibility to retain varying levels of privacy depending on the requirements of the applied use case. This research performs simulations of vehicles interacting with European member state authorities and European Commission BC nodes running Hyperledger Fabric, and explores whether the technology is currently feasible for transport applications such as the emission monitoring use case.
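The tamper-evidence property that motivates a BC ledger for mileage and emissions records can be illustrated with a minimal hash-chained log. This is a conceptual sketch of the underlying idea only, not Hyperledger Fabric's actual data structures or APIs:

```python
import hashlib
import json

def make_block(payload, prev_hash):
    """Append-only, tamper-evident record: each block commits to the
    previous block's hash, so altering any past entry breaks the chain."""
    body = json.dumps({"payload": payload, "prev": prev_hash},
                      sort_keys=True)
    return {"payload": payload, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify_chain(chain):
    """Recompute every hash and check each block's link to its parent."""
    for i in range(1, len(chain)):
        b = chain[i]
        body = json.dumps({"payload": b["payload"], "prev": b["prev"]},
                          sort_keys=True)
        if b["prev"] != chain[i - 1]["hash"]:
            return False
        if b["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
    return True
```

Rolling back an odometer reading in any historical block changes its recomputed hash and is immediately detected, which is the property the mileage-fraud use case relies on.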

Keywords: future transportation systems, technological innovations, policy approaches for transportation future, economic and regulatory trends, blockchain

Procedia PDF Downloads 145
22473 DURAFILE: A Collaborative Tool for Preserving Digital Media Files

Authors: Santiago Macho, Miquel Montaner, Raivo Ruusalepp, Ferran Candela, Xavier Tarres, Rando Rostok

Abstract:

During our lives, we generate a lot of personal information, such as photos, music, text documents, and videos, that links us with our past. This data, which used to be tangible, is now digital information stored in our computers, which implies a dependence on software to keep it accessible in the future. Technology, however, constantly evolves and goes through regular shifts, quickly rendering various file formats obsolete. The need to access data in the future affects not only personal users but also organizations. In a digital environment, a reliable preservation plan and the ability to adapt to fast-changing technology are essential for maintaining data collections in the long term. In this paper, we present the European FP7 project DURAFILE, which provides the technology to preserve media files for personal users and organizations while maintaining their quality.

Keywords: artificial intelligence, digital preservation, social search, digital preservation plans

Procedia PDF Downloads 424
22472 Constructing a Semi-Supervised Model for Network Intrusion Detection

Authors: Tigabu Dagne Akal

Abstract:

While advances in computer and communications technology have made networks ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases (KDD) process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data was then pre-processed; the major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating labelled and unlabelled datasets, dimensionality reduction, size reduction, and data transformation tasks such as discretization. A total of 21,533 intrusion records were used for training the models, and a separate set of 3,397 records was used as a testing set to validate the performance of the selected model. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and default parameter values showed the best classification accuracy: a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset when classifying new instances into the normal, DOS, U2R, R2L, and probe classes.
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research should aim to produce an applicable system in the area of the study.
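One of the two classifiers compared, Naïve Bayes, can be sketched from scratch for categorical connection features: it multiplies a class prior by per-feature likelihoods with Laplace smoothing. The feature values below (protocol and TCP flag) are illustrative only, not drawn from the Lincoln Laboratory dataset.

```python
from collections import Counter, defaultdict
from math import log

class NaiveBayes:
    """Categorical Naive Bayes with Laplace smoothing; a from-scratch
    sketch of one of the two classifiers used in the study (the other
    being the J48/C4.5 decision tree)."""

    def fit(self, rows, labels):
        self.classes = Counter(labels)           # class -> count (prior)
        self.n = len(labels)
        self.n_features = len(rows[0])
        # counts[label][feature_index][value] -> count
        self.counts = defaultdict(lambda: defaultdict(Counter))
        self.values = [set() for _ in range(self.n_features)]
        for row, y in zip(rows, labels):
            for j, v in enumerate(row):
                self.counts[y][j][v] += 1
                self.values[j].add(v)
        return self

    def predict(self, row):
        def log_post(y):
            lp = log(self.classes[y] / self.n)   # log prior
            for j, v in enumerate(row):
                num = self.counts[y][j][v] + 1   # Laplace smoothing
                den = self.classes[y] + len(self.values[j])
                lp += log(num / den)
            return lp
        return max(self.classes, key=log_post)
```

Trained on a handful of labelled connections, the model assigns unseen combinations of protocol and flag to the most probable class, which is the essence of the approach evaluated on the 21,533-record training set.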

Keywords: intrusion detection, data mining, computer science

Procedia PDF Downloads 270
22471 Academic Leadership Succession Planning Practice in Nigeria Higher Education Institutions: A Case Study of Colleges of Education

Authors: Adie, Julius Undiukeye

Abstract:

This research investigated the practice of academic leadership succession planning in Nigerian higher education institutions, drawing on the lived experiences of academic staff at the case study institutions. It is a multi-case study that adopts a qualitative research method. Ten participants (mainly academic staff) formed the study sample, and the study was guided by four research questions. Semi-structured interviews and archival information from official documents formed the sources of data. The data collected was analyzed using the constant comparative technique (CCT) to generate empirical insights into the subject of this paper. The following findings emerged from the data analysis: firstly, there was no formalized leadership succession plan in place in the institutions sampled for this study; secondly, despite the absence of a formal succession plan, the data indicate that academics believe succession planning is very significant for institutional survival; thirdly, existing succession practices in the sampled institutions take the forms of job seniority ranking, political processes and executive fiat, ad-hoc arrangements, and external hiring; and finally, the data revealed some barriers to the practice of succession planning, such as characteristics of traditional higher education institutions (e.g., external talent searches, shared governance, and diversity and equality in leadership appointments) and a lack of interest in leadership positions. Based on the research findings, some far-reaching recommendations were made, including the urgent need for the institutions concerned to formalize leadership succession planning through the design of an official policy framework.

Keywords: academic leadership, succession, planning, higher education

Procedia PDF Downloads 112
22470 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: ‘Reddit’

Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell

Abstract:

Native language identification (NLI) is one of the growing subfields in natural language processing (NLP). The task is mainly concerned with predicting the native language of an author writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (social media data from Reddit). In this NLI task, models are trained on one corpus (TOEFL), and the trained models are then evaluated on data from an external corpus (Reddit). Three classifiers are used: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones, both when tested within the corpus and across corpora.
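The distinction between the two feature families can be illustrated with a minimal extractor: word unigrams are topic-sensitive (content-based), while function-word frequencies abstract away from topic (content-independent). The actual feature sets in the paper are richer, and the function-word list below is a tiny illustrative subset.

```python
from collections import Counter

# Illustrative subset; real NLI systems use hundreds of function words
# and often character n-grams as well.
FUNCTION_WORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "that"}

def tokenize(text):
    """Naive whitespace tokenizer with punctuation stripping."""
    return [t.strip(".,;:!?").lower() for t in text.split() if t]

def content_based(text):
    """Word-unigram counts: informative but topic-dependent, which is
    what makes them strong within-corpus yet risky across corpora."""
    return Counter(tokenize(text))

def content_independent(text):
    """Relative frequency of each function word: topic-insensitive
    stylistic signal intended to transfer across corpora."""
    toks = tokenize(text)
    n = len(toks) or 1
    return {w: toks.count(w) / n for w in FUNCTION_WORDS}
```

Both representations can then be fed to the baseline, linear SVM, or logistic regression classifier; the cross-corpus question is which representation survives the shift from TOEFL essays to Reddit posts.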

Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML

Procedia PDF Downloads 106