Search results for: big data interpretation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25148

20738 Adaptive Beamforming with Steering Error and Mutual Coupling between Antenna Sensors

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Owing to close antenna spacing within a compact space, part of the signal received by one antenna sensor leaks into neighboring sensors when the sensors in an antenna array operate simultaneously. This phenomenon is called the mutual coupling effect (MCE). It has been shown that the performance of antenna array systems degrades when the antenna sensors are in close proximity; in particular, in systems equipped with massive antenna arrays, significant degradation of beamforming performance due to the MCE is inevitable. Moreover, it has been shown that even a small error between the true direction angle of the desired signal and the steering angle deteriorates the effectiveness of an array beamforming system. However, the true direction vector of the desired signal may not be exactly known in some applications, e.g., land mobile-cellular wireless systems. It is therefore worth developing techniques that make array beamforming robust against the MCE and steering angle error. In this paper, we present an efficient technique for performing adaptive beamforming that is robust against both the MCE and the steering angle error. Only the data vector received by the antenna array is required by the proposed technique. From the received array data vector, a correlation matrix is constructed to replace the original correlation matrix associated with the received array data vector. Then, the mutual coupling matrix due to the MCE on the antenna array is estimated through a recursive algorithm, and an appropriate estimate of the direction angle of the desired signal is obtained during the recursive process. Based on the estimated mutual coupling matrix, the estimated direction angle, and the reconstructed correlation matrix, the proposed technique effectively cures the performance degradation due to steering angle error and the MCE.
The novelty of the proposed technique lies in its very simple implementation procedure and its satisfactory adaptive beamforming performance. Simulation results show that the proposed technique provides much better beamforming performance, at lower computational complexity, than the existing robust techniques.
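For orientation, the classical formulation the paper builds on can be sketched numerically. The snippet below is a minimal Capon (MVDR) beamformer with diagonal loading for a uniform linear array; it is an illustrative baseline, not the authors' recursive MCE/angle-estimation algorithm, and all array parameters are assumed values.

```python
import numpy as np

def steering_vector(n_sensors, angle_deg, spacing=0.5):
    """Array response for a plane wave at angle_deg (half-wavelength spacing)."""
    n = np.arange(n_sensors)
    return np.exp(2j * np.pi * spacing * n * np.sin(np.deg2rad(angle_deg)))

def mvdr_weights(R, a):
    """w = R^-1 a / (a^H R^-1 a): unit gain toward a, minimum output power."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

rng = np.random.default_rng(0)
N, snaps = 8, 2000
a_sig = steering_vector(N, 10.0)    # desired signal at 10 degrees
a_int = steering_vector(N, -30.0)   # strong interferer at -30 degrees
X = (np.outer(a_sig, rng.standard_normal(snaps))
     + 3 * np.outer(a_int, rng.standard_normal(snaps))
     + 0.1 * (rng.standard_normal((N, snaps)) + 1j * rng.standard_normal((N, snaps))))
R = X @ X.conj().T / snaps + 1e-3 * np.eye(N)   # sample covariance + diagonal loading
w = mvdr_weights(R, a_sig)
print(abs(w.conj() @ a_sig))  # distortionless constraint: exactly 1
print(abs(w.conj() @ a_int))  # interferer strongly attenuated (near 0)
```

Robust methods such as the one proposed here address the case where `a_sig` itself is mismatched or distorted by mutual coupling, which plain MVDR does not handle.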

Keywords: adaptive beamforming, mutual coupling effect, recursive algorithm, steering angle error

Procedia PDF Downloads 311
20737 The Quality Assurance on the Standard of Private Schools in Bangkok

Authors: Autjira Songjan, Poramatdha Chutimant

Abstract:

This research studies the operation of quality assurance in private schools in Bangkok according to the opinions of administrators and teachers, and compares those opinions on the quality assurance process by gender, job position, and work experience. The sample comprised administrators and teachers of private schools in Bangkok, selected by proportional random sampling. A questionnaire on quality assurance operations was used to collect the data, which were analyzed using percentages, means, standard deviations, t-tests, and analysis of variance. The research found that administrators and teachers of different gender, position, and duties did not differ significantly at the 0.05 level in their opinions on quality assurance in the areas of performance management and the quality of service provided to students in the school.

Keywords: educational quality assurance, performance management, private schools in Bangkok, quality of the service

Procedia PDF Downloads 217
20736 Towards an Environmental Knowledge System in Water Management

Authors: Mareike Dornhoefer, Madjid Fathi

Abstract:

Water supply and water quality are key problems of mankind at the moment and, due to increasing population, in the future. Management disciplines like water, environment and quality management therefore need to interact closely, to establish a high level of water quality and to guarantee water supply in all parts of the world. Groundwater remediation is one aspect of this process. From a knowledge management perspective, it is only possible to solve complex ecological or environmental problems if different factors, expert knowledge of various stakeholders, and formal regulations regarding water, waste or chemical management are interconnected in the form of a knowledge base. In general, knowledge management focuses on the processes of gathering and representing existing and new knowledge in a way that allows for inference or deduction of knowledge, e.g., in a situation where a problem solution or decision support is required. A knowledge base is not a mere data repository but a key element of a knowledge-based system, providing inference mechanisms to deduce further knowledge from existing facts. In consequence, this knowledge provides decision support. The given paper introduces an environmental knowledge system in water management. The proposed environmental knowledge system is part of a research concept called Green Knowledge Management. It applies semantic technologies and concepts such as ontologies and linked open data to interconnect different data and information sources about environmental aspects, in this case water quality, as well as background material enriching an established knowledge base. Examples of the aforementioned ecological or environmental factors threatening water quality are, among others, industrial pollution (e.g. leakage of chemicals), environmental changes (e.g. rise in temperature) or floods, where all kinds of waste are merged and transferred into natural water environments.
Water quality is usually determined by measuring different indicators (e.g. chemical or biological), which are gathered through laboratory testing, continuous monitoring equipment or other measuring processes. During all of these processes, data are gathered and stored in different databases. The knowledge base is then established by interconnecting data from these different sources and enriching their semantics. Experts may add their knowledge or experiences of previous incidents or influencing factors. Querying and inference mechanisms are then applied to deduce coherence between indicators, predictive developments or environmental threats. Relevant processes or steps of action may be modeled in a rule-based approach. Overall, the environmental knowledge system supports the interconnection of information and the addition of semantics to create environmental knowledge about the water environment, supply chain, and quality. The proposed concept is a holistic approach that links to associated disciplines like environmental and quality management. Quality indicators and quality management steps need to be considered, e.g., for the process and inference layers of the environmental knowledge system, thus integrating the aforementioned management disciplines in one water management application.
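The rule-based layer described above can be sketched as a toy forward-chaining step over water-quality facts. The indicator names, thresholds, and rules below are illustrative assumptions, not taken from the paper; the point is only how new facts are deduced from existing ones.

```python
# Toy forward-chaining inference over water-quality facts.
facts = {"nitrate_mg_per_l": 62.0, "temperature_c": 24.5, "upstream_factory": True}

# Each rule: (condition over the fact base, fact it asserts when it fires).
rules = [
    (lambda f: f["nitrate_mg_per_l"] > 50, "nitrate_exceeds_limit"),
    (lambda f: f.get("nitrate_exceeds_limit") and f["upstream_factory"],
     "suspect_industrial_pollution"),
]

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new fact is derived."""
    derived = dict(facts)
    changed = True
    while changed:
        changed = False
        for cond, conclusion in rules:
            if cond(derived) and not derived.get(conclusion):
                derived[conclusion] = True
                changed = True
    return derived

out = forward_chain(facts, rules)
print(out["suspect_industrial_pollution"])  # True
```

In the actual system this role would be played by an ontology-backed rule engine over the knowledge base rather than Python dictionaries.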

Keywords: water quality, environmental knowledge system, green knowledge management, semantic technologies, quality management

Procedia PDF Downloads 209
20735 Private Coded Computation of Matrix Multiplication

Authors: Malihe Aliasgari, Yousef Nejatbakhsh

Abstract:

The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data is dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are a few slow or delay-prone processors that can bottleneck the entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as a sufficient number of processors, depending on the minimum distance of the code, have completed their operations. For practically large data sets, matrix-matrix multiplication faces computational and memory-related difficulties, which is why such operations are carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y. This operation is a fundamental building block of many science and engineering fields, such as machine learning, image and signal processing, wireless communication, and optimization. Both non-secure and secure matrix multiplication are studied.
We study the setup in which the identity of the matrix of interest should be kept private from the workers, and we obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their task before the master server can recover the product W. We also consider the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
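The straggler-tolerance idea behind coded computation can be shown in miniature. The sketch below splits X into two row blocks and creates three coded worker tasks, so that any two completed tasks suffice to recover W = XY; it is the simplest (2, 3) coding example, not the paper's PSGPD scheme, and carries no privacy guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 3))
Y = rng.standard_normal((3, 5))

# Encode: split X into two row blocks, send three coded multiplications
# to three workers; the code tolerates one straggler.
X1, X2 = X[:2], X[2:]
tasks = {"w1": X1 @ Y, "w2": X2 @ Y, "w3": (X1 + X2) @ Y}

# Suppose worker 1 straggles: decode X1*Y from the two finished workers.
done = {k: v for k, v in tasks.items() if k != "w1"}
X1Y = done["w3"] - done["w2"]        # (X1 + X2)Y - X2 Y = X1 Y
W = np.vstack([X1Y, done["w2"]])
print(np.allclose(W, X @ Y))  # True
```

Real schemes generalize this with polynomial or MDS codes so that the recovery threshold can be tuned against storage and privacy constraints.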

Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers

Procedia PDF Downloads 107
20734 Meeting the Energy Balancing Needs in a Fully Renewable European Energy System: A Stochastic Portfolio Framework

Authors: Iulia E. Falcan

Abstract:

The transition of the European power sector towards a clean, renewable energy (RE) system faces the challenge of meeting power demand in times of low wind speed and low solar radiation, at a reasonable cost. This is likely to be achieved through a combination of 1) energy storage technologies, 2) development of the cross-border power grid, 3) installed overcapacity of RE, and 4) dispatchable power sources such as biomass. This paper uses NASA-derived hourly data on weather patterns of sixteen European countries for the past twenty-five years, and load data from the European Network of Transmission System Operators-Electricity (ENTSO-E), to develop a stochastic optimization model. This model aims to understand the synergies between the four classes of technologies mentioned above and to determine the optimal configuration of the energy technology portfolio. While this issue has been addressed before, it was done so using deterministic models that extrapolated historic data on weather patterns and power demand and ignored the risk of an unbalanced grid, a risk stemming from both the supply and the demand side. This paper explicitly accounts for the inherent uncertainty in the energy system transition. It articulates two levels of uncertainty: a) the inherent uncertainty in future weather patterns and b) the uncertainty of fully meeting power demand. The first level is addressed by developing probability distributions for future weather data, and thus expected power output from RE technologies, rather than assuming known future power output. The second level is operationalized by introducing a Conditional Value at Risk (CVaR) constraint in the portfolio optimization problem. By setting the risk threshold at different levels – 1%, 5% and 10% – important insights are revealed regarding the synergies of the different energy technologies, i.e., the circumstances under which they behave as either complements or substitutes to each other.
The paper concludes that allowing for uncertainty in expected power output - rather than extrapolating historic data - paints a more realistic picture and reveals important departures from results of deterministic models. In addition, explicitly acknowledging the risk of an unbalanced grid - and assigning it different thresholds - reveals non-linearity in the cost functions of different technology portfolio configurations. This finding has significant implications for the design of the European energy mix.
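The CVaR quantity used as a constraint above has a simple empirical form: the mean loss in the worst alpha fraction of scenarios. The sketch below computes it from synthetic unserved-energy scenarios; the gamma distribution and its parameters are illustrative assumptions, not the paper's data.

```python
import numpy as np

def cvar(losses, alpha):
    """Empirical CVaR: mean of the worst alpha-fraction of scenario losses."""
    losses = np.sort(np.asarray(losses, dtype=float))[::-1]   # worst first
    k = max(1, int(np.ceil(alpha * len(losses))))
    return losses[:k].mean()

rng = np.random.default_rng(0)
# Synthetic scenarios of unserved energy (MWh) in a renewables-heavy system.
unserved_energy = rng.gamma(shape=2.0, scale=50.0, size=10_000)
for a in (0.01, 0.05, 0.10):
    print(f"CVaR_{a:.0%}: {cvar(unserved_energy, a):.1f} MWh")
```

Tightening alpha from 10% to 1% focuses the constraint on ever-rarer, more extreme shortfalls, which is what drives the non-linear portfolio costs the paper reports.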

Keywords: cross-border grid extension, energy storage technologies, energy system transition, stochastic portfolio optimization

Procedia PDF Downloads 152
20733 The Consumer Behavior and the Customer Loyalty of CP Fresh Mart Consumers in Bangkok

Authors: Kanmanas Muensak, Somphoom Saweangkun

Abstract:

The objectives of this research were to study the consumer behavior that affects customer loyalty to CP Fresh Mart in Bangkok. The sample comprised 400 consumers over 15 years old who made purchases at CP Fresh Mart in Bangkok. Questionnaires were used as the data-gathering instrument, and the data were analyzed using percentages, means, standard deviations, independent-sample t-tests, two-way ANOVA, least significant difference tests, and Pearson's correlation coefficient. The hypothesis testing showed that respondents of different gender, age, level of education, income, marital status, and occupation differed in consumer behavior affecting customer loyalty to CP Fresh Mart, and that the customer loyalty factors of re-purchase, word of mouth, price sensitivity, promotion, process, and personnel had a positive relationship with consumer behavior at CP Fresh Mart in Bangkok.

Keywords: consumers in Bangkok, consumer behavior, customer loyalty, CP Fresh Mart, operating budget

Procedia PDF Downloads 313
20732 Comparisons of Surveying with Terrestrial Laser Scanner and Total Station for Volume Determination of Overburden and Coal Excavations in Large Open-Pit Mine

Authors: B. Keawaram, P. Dumrongchai

Abstract:

The volume of overburden and coal excavations in an open-pit mine is generally determined by conventional survey methods such as the total station. This study aimed to evaluate the accuracy of a terrestrial laser scanner (TLS) used to measure overburden and coal excavations, and to compare TLS survey data sets with those of the total station. Results revealed that the reference points measured with the total station showed 0.2 mm precision for both horizontal and vertical coordinates. When using TLS on the same points, standard deviations of 4.93 cm and 0.53 cm for horizontal and vertical coordinates, respectively, were achieved. For volume measurements covering mining areas of 79,844 m2, TLS yielded a mean difference of about 1% and a surface error margin of 6 cm at the 95% confidence level when compared to the volume obtained by the total station.
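Whichever instrument collects the points, the excavation volume ultimately comes from differencing two gridded surfaces (before and after mining) and multiplying by the cell area. The sketch below shows that computation on a toy grid; the bench elevations and cell size are made-up values, not the paper's survey data.

```python
import numpy as np

def excavation_volume(z_before, z_after, cell_area):
    """Cut volume (m^3) from two regular elevation grids (m) of equal shape."""
    dz = np.asarray(z_before) - np.asarray(z_after)
    return float(np.clip(dz, 0, None).sum() * cell_area)  # count only material removed

z0 = np.full((100, 100), 250.0)     # flat bench at 250 m elevation
z1 = z0.copy()
z1[20:80, 20:80] -= 3.0             # a 3 m deep excavation over 60x60 cells
vol = excavation_volume(z0, z1, cell_area=4.0)   # 2 m x 2 m grid cells
print(vol)  # 3600 cells * 3 m depth * 4 m^2 = 43200.0
```

The 1% volume differences reported above correspond to how well the TLS- and total-station-derived `z_after` surfaces agree when fed through this kind of calculation.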

Keywords: mine, survey, terrestrial laser scanner, total station

Procedia PDF Downloads 368
20731 Satellite Derived Snow Cover Status and Trends in the Indus Basin Reservoir

Authors: Muhammad Tayyab Afzal, Muhammad Arslan, Mirza Muhammad Waqar

Abstract:

Snow constitutes an important component of the cryosphere, characterized by high temporal and spatial variability. Because of the contribution of snow melt to water availability, snow is an important focus for research on climate change and adaptation. MODIS satellite data have been used to identify spatial-temporal trends in snow cover in the upper Indus basin. For this research, MODIS 8-day composite data of medium resolution (250 m) were analysed from 2001-2005. Pixel-based supervised classification was performed, and the snow-covered extent was calculated for all the images. Results show large variation in snow cover between years, while an increasing trend from west to east is observed. Temperature data for the Upper Indus Basin (UIB) were analysed for seasonal and annual trends over the period 2001-2005 and compared with the results acquired by the research. From the analysis, it is concluded that there are indications that regional warming is one of the factors affecting the hydrology of the upper Indus basin through accelerated glacial melting; under this assumption, stream flow in the upper Indus basin can be predicted with a high degree of accuracy. This conclusion is also supported by research at ICIMOD, which observed that the average annual precipitation over a five-year period is less than the observed stream flow, and by positive temperature trends in all seasons.
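A common, simpler alternative to supervised classification for MODIS snow mapping is the Normalized Difference Snow Index, NDSI = (green - SWIR) / (green + SWIR), with snow typically mapped where NDSI > 0.4. The sketch below applies that standard rule to toy reflectance values; it illustrates the general method, not the paper's classifier.

```python
import numpy as np

def snow_mask(green, swir, threshold=0.4):
    """Boolean snow mask from green and shortwave-infrared reflectance grids."""
    green = np.asarray(green, dtype=float)
    swir = np.asarray(swir, dtype=float)
    ndsi = (green - swir) / (green + swir + 1e-12)   # guard against divide-by-zero
    return ndsi > threshold

green = np.array([[0.80, 0.30], [0.70, 0.20]])   # snow is bright in visible bands
swir  = np.array([[0.10, 0.25], [0.08, 0.30]])   # and dark in the SWIR band
mask = snow_mask(green, swir)
print(mask.mean())  # fraction of snow-covered pixels: 0.5
```

Per-scene snow extent is then just the masked pixel count times the pixel area, which is how the year-to-year variation above would be quantified.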

Keywords: indus basin, MODIS, remote sensing, snow cover

Procedia PDF Downloads 372
20730 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. 
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
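The probabilistic sampling layer described above can be illustrated with uniform edge sampling: keep each transaction edge with probability p, then scale counts back up by 1/p to estimate graph-level quantities. The addresses and edges below are synthetic, and real systems would use stratified or importance sampling rather than this simplest variant.

```python
import random

random.seed(7)
# Synthetic transaction graph: edges between random addresses.
n_edges = 100_000
edges = [(f"addr{random.randrange(500)}", f"addr{random.randrange(500)}")
         for _ in range(n_edges)]

# Keep each edge independently with probability p; rescale to estimate totals.
p = 0.05
sample = [e for e in edges if random.random() < p]
estimated_total = len(sample) / p
print(len(sample), estimated_total)  # sampled count, scaled estimate of true edge count
```

Because each edge is kept independently, the estimator is unbiased and its relative error shrinks as the sample grows, which is what lets the pipeline trade a controlled loss of precision for a large reduction in data volume.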

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 56
20729 ACBM: Attention-Based CNN and Bi-LSTM Model for Continuous Identity Authentication

Authors: Rui Mao, Heming Ji, Xiaoyu Wang

Abstract:

Keystroke dynamics are widely used in identity recognition. They have the advantage that an individual's typing rhythm is difficult to imitate, and they support continuous authentication through the keyboard without extra devices. Existing keystroke dynamics authentication methods based on machine learning struggle to support relatively complex scenarios with massive data, with drawbacks in both feature extraction and model optimization. To overcome these weaknesses, an authentication model of keystroke dynamics based on deep learning is proposed. The model uses feature vectors formed from keystroke content and keystroke timing, and it ensures efficient continuous authentication by coupling attention mechanisms with a combination of CNN and Bi-LSTM. The model has been tested on the open Buffalo keystroke dataset; the results show an FRR of 3.09%, an FAR of 3.03%, and an EER of 4.23%, demonstrating that the model is efficient and accurate for continuous authentication.
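The FRR/FAR/EER metrics reported above come from sweeping a decision threshold over genuine and impostor similarity scores; the EER is the operating point where the two error rates cross. The sketch below computes it on synthetic score distributions, not on the Buffalo dataset.

```python
import numpy as np

def eer(genuine, impostor):
    """Sweep thresholds over all scores; return the rate where FAR meets FRR."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, best_eer = 2.0, 1.0
    for t in thresholds:
        far = np.mean(impostor >= t)   # impostors wrongly accepted
        frr = np.mean(genuine < t)     # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2
    return best_eer

rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.10, 1000)    # higher score = more similar to the owner
impostor = rng.normal(0.4, 0.15, 1000)
print(f"EER ~ {eer(genuine, impostor):.1%}")
```

A lower EER means the genuine and impostor score distributions are better separated, which is the headline figure used to compare keystroke models.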

Keywords: keystroke dynamics, identity authentication, deep learning, CNN, LSTM

Procedia PDF Downloads 139
20728 Prevalence of Listeria and Salmonella Contamination in FDA-Recalled Foods

Authors: Oluwatofunmi Musa-Ajakaiye, Paul Olorunfemi (MD, MPH), John Obafaiye

Abstract:

Introduction: The U.S. Food and Drug Administration (FDA) publishes public notices for recalled FDA-regulated products. This study reviewed the primary reasons for recalls of products of various types over a period of 7 years. Methods: The study analyzed data provided in the FDA's archived recalls for the years 2010-2017. It identified the various reasons for product recalls in the categories of foods, beverages, drugs, medical devices, animal and veterinary products, and dietary supplements. Using SPSS version 29, descriptive statistics and chi-square analysis of the data were performed. Results: Over the period of analysis, a total of 931 recalls were reported. The most frequent reason for recalls was undeclared products (36.7%). The analysis showed that the most recalled product type in the data set was foods and beverages, representing 591 of all recalled products (63.5%). In addition, foods and beverages represent 77.2% of products recalled due to the presence of microorganisms. A sub-group analysis of food and beverage recalls found that the most prevalent reason was undeclared products (50.1%), followed by Listeria (17.3%) and then Salmonella (13.2%). Conclusion: This analysis shows that foods and beverages account for the greatest percentage of total recalls, driven by undeclared products, Listeria contamination, and Salmonella contamination. The prevalence of Salmonella and Listeria contamination suggests a high risk of microbial contamination in FDA-regulated products, and further studies on the effects of such contamination must be conducted to ensure consumer safety.
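The chi-square analysis mentioned in the methods tests whether recall reason is independent of product category. The sketch below runs that test by hand on a made-up 2x2 contingency table (the counts are illustrative, not the study's data).

```python
import numpy as np

# Toy contingency table: rows = product category, columns = contamination reason.
observed = np.array([[102, 78],    # foods & beverages: Listeria, Salmonella
                     [9, 15]])     # other products:    Listeria, Salmonella

row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row @ col / observed.sum()           # counts expected under independence
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
print(f"chi2 = {chi2:.2f} on {dof} dof")  # compare to the 3.84 critical value at p = 0.05
```

In SPSS this is the standard Pearson chi-square output; statistic values above the critical value would indicate that contamination reason varies by product category.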

Keywords: food, beverages, listeria, salmonella, FDA, contamination, microbial

Procedia PDF Downloads 51
20727 Retail Managers’ Perception on Coca-Cola Company’s Success of Glass Package Recovery and Recycling in Nairobi, Kenya

Authors: Brigitte Wabuyabo-Okonga

Abstract:

Little research has been done to establish the level of success of the Coca-Cola Company in recycling and reusing its glass bottles. This paper attempts to establish retail managers' perception of the company's self-proclaimed success. Retail managers of supermarkets in the CBD of Nairobi, Kenya were considered for the study. Data were collected through questionnaires and analyzed using descriptive statistics (means, frequencies, and percentages) and inferential statistics (correlation analysis). The study found relative success, although a lot remains to be done: for example, improving the communication of policy issues and, in practice, enhancing the actual collection of broken and non-broken Coca-Cola glass bottles by providing drop-off points in open areas such as streets and parks.

Keywords: Coca Cola Company glass bottles, Kenya, Nairobi, packaging, retail manager

Procedia PDF Downloads 303
20726 Satellite LiDAR-Based Digital Terrain Model Correction using Gaussian Process Regression

Authors: Keisuke Takahata, Hiroshi Suetsugu

Abstract:

Forest height is an important parameter for forest biomass estimation, and precise elevation data are essential for accurate forest height estimation. There are several globally or nationally available digital elevation models (DEMs), such as SRTM and ASTER. However, their accuracy is reported to be low, particularly in mountainous areas with closed canopies or steep slopes. Recently, space-borne LiDAR missions, such as the Global Ecosystem Dynamics Investigation (GEDI), have started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy in these elevation products on their exact footprints, while it is not clear how this sparse information can be used over wider areas. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when airborne LiDAR-derived elevation information is used as ground truth. Our algorithm is also capable of quantifying the elevation data uncertainty, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, like MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
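The core of the approach is standard GP regression on the DEM-minus-GEDI elevation differences: fit a smooth bias surface at the sparse footprints, then predict the correction (with uncertainty) everywhere else. The sketch below does this on a synthetic 1-D transect with an RBF kernel; kernel hyperparameters and the bias curve are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def rbf(a, b, length=5.0, var=4.0):
    """Squared-exponential kernel between 1-D coordinate arrays (km)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

x_obs = np.linspace(0, 50, 12)                  # sparse footprint locations (km)
bias_obs = 3.0 + 0.5 * np.sin(x_obs / 8.0)      # DEM - GEDI elevation difference (m)
noise = 0.1                                     # footprint measurement noise (m)

x_new = np.linspace(0, 50, 101)                 # dense prediction grid
K = rbf(x_obs, x_obs) + noise**2 * np.eye(len(x_obs))
Ks = rbf(x_new, x_obs)
bias_pred = Ks @ np.linalg.solve(K, bias_obs)   # posterior mean correction
var_pred = rbf(x_new, x_new).diagonal() - np.einsum(
    "ij,ji->i", Ks, np.linalg.solve(K, Ks.T))   # posterior variance per point
```

Subtracting `bias_pred` from the DEM applies the correction, while `var_pred` provides the per-location uncertainty the biomass inventory needs.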

Keywords: digital terrain model, satellite LiDAR, gaussian processes, uncertainty quantification

Procedia PDF Downloads 161
20725 Comparison of the H-Index of Researchers of Google Scholar and Scopus

Authors: Adian Fatchur Rochim, Abdul Muis, Riri Fitri Sari

Abstract:

The H-index has been widely used as a performance indicator for researchers around the world, especially in Indonesia, where the government uses Scopus and Google Scholar as indexing references in providing recognition and appreciation. However, those two indexing services yield different H-index values. This paper therefore evaluates the difference between the H-indices from those services. Researchers indexed by Webometrics are used as the reference data in this paper; currently, Webometrics only uses the H-index from Google Scholar. This paper observed and compared the corresponding researchers' data from Scopus to get their H-index scores. Subsequently, some researchers with large differences in score were examined in more detail with respect to their papers' publishers. This paper shows that the H-index of researchers in Google Scholar is approximately 2.45 times their Scopus H-index. Most of the difference is due to uncertified publishers, which are counted by Google Scholar but not by Scopus.
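The metric itself is simple to state: an author's H-index is the largest h such that h of their papers each have at least h citations. A direct computation from a citation list, with made-up citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:          # the rank-th most-cited paper still has >= rank cites
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # 4
print(h_index([25, 8, 5, 3, 3]))   # 3
```

This makes the 2.45x discrepancy intuitive: because Google Scholar indexes more (including uncertified) sources, the same author has a longer, more-cited paper list there, and the rank at which citations drop below rank moves outward.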

Keywords: Google Scholar, H-index, Scopus, performance indicator

Procedia PDF Downloads 258
20724 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experience, especially in terms of communication reliability, high data rates, and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations that seems able to fulfill the demands on future spectrum, and it is one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the International Mobile Telecommunications-Advanced (IMT-Advanced) mobile requirement of a 1 Gb/s peak data rate, the CA scheme was introduced by 3GPP; it can sustain high data rates by aggregating frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signaling techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. Performance evaluation in macro-cellular scenarios through a simulation approach is also presented, showing the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity. We conclude that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (at a PLR threshold of 2%).

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 178
20723 Power Recovery in Egyptian Natural Gas Pressure Reduction Stations Using Turboexpander Systems

Authors: Kamel A. Elshorbagy, Mohamed A. Hussein, Rola S. Afify

Abstract:

Natural gas pressure reduction is typically achieved using pressure-reducing valves, where isenthalpic expansion takes place and a considerable amount of energy is wasted in an irreversible throttling process. Replacing the throttling process with an expansion process in a turboexpander (TE) converts the pressure of the natural gas into mechanical energy transmitted to a loading device (i.e., an electric generator). This paper investigates the performance of a turboexpander system for power recovery at natural gas pressure reduction stations. There is a considerable temperature drop associated with the turboexpander process, so preheating, using gas-fired boilers, is essential to avoid the undesirable effects of a low outlet temperature. Various system configurations were simulated with the general flowsheet simulator HYSYS, and factors affecting the overall performance of the systems were investigated. Power outputs and fuel requirements were found using typical gas flow variation data. The simulation was performed for two case studies using real input data: a domestic (commercial) and an industrial natural gas pressure reduction station in Egypt. Economic studies of using the turboexpander system in both stations were conducted using precise data obtained through communication with several companies working in this field. The results of the economic analysis, for the two case studies, show that using turboexpander systems in Egyptian natural gas reduction stations can be a successful energy conservation project.
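A first-order estimate of the recoverable power, and of the temperature drop that makes preheating necessary, follows from the ideal-gas isentropic expansion relations with an isentropic efficiency. The station conditions and gas properties below are illustrative assumptions, not the paper's HYSYS case-study data.

```python
def turboexpander(m_dot, T_in, p_in, p_out, cp=2220.0, k=1.30, eta=0.80):
    """Shaft power (W) and outlet temperature (K) of an expander.

    m_dot: mass flow (kg/s); T_in: inlet temperature (K);
    p_in, p_out: pressures (same units, only the ratio matters);
    cp, k: gas heat capacity (J/kg-K) and heat capacity ratio; eta: isentropic efficiency.
    """
    T_out_isentropic = T_in * (p_out / p_in) ** ((k - 1) / k)
    dT = eta * (T_in - T_out_isentropic)     # actual temperature drop
    return m_dot * cp * dT, T_in - dT

power, T_out = turboexpander(m_dot=5.0, T_in=300.0, p_in=40.0, p_out=7.0)
print(f"{power / 1e3:.0f} kW recovered, outlet at {T_out - 273.15:.0f} C")
```

The deeply sub-zero outlet temperature this produces is exactly why the stations must preheat the gas before expansion, at a fuel cost the economic analysis has to weigh against the recovered power.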

Keywords: natural gas, power recovery, reduction stations, turboexpander systems

Procedia PDF Downloads 305
20722 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications

Authors: K. P. Sandesh, M. H. Suman

Abstract:

Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
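The three-case structure described above can be sketched as a per-feature scoring function. The exact weighting below is an illustrative assumption, not the paper's formula: present-in-both rewards value closeness, present-in-one scores zero, and absent-in-both contributes a small positive agreement score.

```python
import math

# Hedged sketch of a three-case, per-feature document similarity.
# The functional form and the absent_score constant are assumptions
# for illustration only, not the measure proposed in the paper.

def feature_similarity(a, b, absent_score=0.1):
    """Similarity contribution of one feature (e.g. a tf-idf weight; 0 = absent).

    Case 1: present in both  -> closeness score in (absent_score, 1].
    Case 2: present in one   -> 0 (strongest evidence of dissimilarity).
    Case 3: absent in both   -> small positive score (weak agreement).
    """
    if a > 0 and b > 0:                                       # case 1
        return absent_score + (1 - absent_score) * math.exp(-abs(a - b))
    if a > 0 or b > 0:                                        # case 2
        return 0.0
    return absent_score                                       # case 3

def doc_similarity(u, v):
    """Average the per-feature scores over a shared vocabulary."""
    return sum(feature_similarity(a, b) for a, b in zip(u, v)) / len(u)

print(doc_similarity([0.5, 0.0, 0.3], [0.4, 0.0, 0.0]))
```

Averaging per-feature scores extends naturally to comparing two sets of documents, e.g. by averaging pairwise document similarities.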

Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms

Procedia PDF Downloads 505
20721 Developing Pandi-Tekki to Tourism Destination in Tanglang, Billiri Local Government Area, Gombe State, Nigeria

Authors: Sanusi Abubakar Sadiq

Abstract:

Despite the significance of tourism as a key revenue earner and employment generator, it is still disregarded in many areas, and the potential of existing resources to boost development in communities and regions is underused. This study was carried out with a view to developing Pandi-Tekki in Tanglang, Billiri Local Government Area, as a tourism destination. It primarily aimed to identify features of Pandi-Tekki that could be developed into a tourism attraction, suggest ways of developing the prospective site into a destination, and explore its possible contribution to the tourism sector in Gombe State. Literature was reviewed based on relevant published materials. Data were collected through qualitative and quantitative methods, including personal observation and a structured questionnaire, and analyzed using the Statistical Package for the Social Sciences (SPSS) software. The results show that Pandi-Tekki has potential that can be developed into an attraction. They also show that the local community perceives tourism as a welcome development that will open it up to the wider world and generate revenue to stimulate the local economy. Conclusions were drawn from the findings of the analysis carried out in this research: Pandi-Tekki can be developed as a tourism destination, and the aims and objectives of the development can be achieved. Recommendations were therefore made on creating awareness of the need to develop Pandi-Tekki as a tourism destination and on the need for government to provide tourism facilities at the destination, since it is a public outfit.

Keywords: attraction, destination, developing, features

Procedia PDF Downloads 270
20720 Isotretinoin and Psychiatric Adverse Events: A Review of the Evidence

Authors: Thodoris Tsagkaris, Marios Stavropoulos, Panagiotis Theodosis-Nobelos, Charalampos Triantis

Abstract:

Isotretinoin is widely used in the treatment of acne vulgaris and various other skin disorders. However, since its approval, many side effects and contraindications have been described, some particularly serious, such as teratogenicity, liver disease, and dermal deterioration. Most notably, isotretinoin has been linked with psychiatric symptoms such as depression, suicidal ideation, schizophrenia, and features of hypervitaminosis A syndrome. These adverse effects have raised significant concerns regarding the safety of isotretinoin. Numerous studies have associated isotretinoin with adverse effects on patients' mental health and have proposed plausible mechanisms for this suspected causative relationship. However, the evidence remains contradictory and the data dispersed, limiting their validity. In the present study, we therefore analyze the available literature further and present a complete analysis of the side effects of isotretinoin, with particular emphasis on its possible effects on patients' mental health. The review is based on international articles from broad scientific electronic databases such as PubMed and Scopus. It concludes that although many studies have associated isotretinoin with mental effects such as depression, bipolar disorder, schizophrenia, and suicidal ideation, the data are still insufficient and often contradictory. Additional studies with accurate data, larger double-blinded samples, and more analytic systematic reviews are required. It is especially important to monitor the doses, dosing intervals, and treatment durations at which isotretinoin may cause mental health problems, as well as the role that a patient's medical and pharmaceutical history may play.

Keywords: acne, depression, isotretinoin, mental health

Procedia PDF Downloads 146
20719 Predictive Analytics of Bike Sharing Rider Parameters

Authors: Bongs Lainjo

Abstract:

The evolution and escalation of bike-sharing programs (BSP) continue unabated. A bike-sharing system is a computer-controlled system in which individuals can borrow bikes, for a fee or for free, for a limited period. Since the sixties, many countries have introduced different models and strategies of BSP, with variations ranging from dockless models to electronic real-time monitoring systems. Reasons for using a BSP include recreation, errands, work, etc., and there is every indication that more complex and innovative rider-friendly systems are yet to be introduced. The objective of this paper is to analyze the variables currently recorded by different operators and to streamline them, identifying the most compelling ones using analytics. Given the contents of available databases, there is a lack of uniformity and no common standard on what is required and what is not. Two factors appear to be common: user type (registered or unregistered) and the duration of each trip. This article uses historical data provided by one operator based in the greater Washington, District of Columbia, USA area. Several variables, including categorical and continuous data types, were screened; eight out of 18 were considered acceptable and contribute significantly to a useful and reliable predictive model. Although the worldwide popularity of bike-sharing systems in recent years has produced many studies of public cycling systems, there have been few previous studies of the factors influencing public bicycle travel behavior. This study has identified a previously unreported set of useful and pragmatic parameters required to improve BSP ridership dynamics.

Keywords: sharing program, historical data, parameters, ridership dynamics, trip duration

Procedia PDF Downloads 122
20718 Teachers and Learners Perceptions on the Impact of Different Test Procedures on Reading: A Case Study

Authors: Bahloul Amel

Abstract:

The main aim of this research was to investigate the perspectives of English language teachers and learners on the effect of test techniques on reading comprehension, test performance, and assessment. The research also aimed at finding the differences between teacher and learner perspectives, specifying the test techniques with the highest effect, investigating other factors affecting reading comprehension, and comparing the results with similar studies. To achieve these objectives, the perspectives and findings of different researchers were reviewed, two questionnaires were prepared to collect data on the perspectives of teachers and learners, the questionnaires were administered to 26 learners and 8 teachers from the University of Batna (Algeria), and quantitative and qualitative analyses of the results were carried out. The results and their analysis show that different test techniques affect reading comprehension, test performance, and assessment at different rates.

Keywords: reading comprehension, reading assessment, test performance, test techniques

Procedia PDF Downloads 440
20717 In Case of Possible Disaster Management with Geographic Information System in Konya

Authors: Savaş Durduran, Ceren Yağci

Abstract:

Natural disasters significantly affect people's lives around the world. Considering thousands of years of earth history, many natural disasters, particularly earthquakes, have occurred in our country. Taking precautions before hazards occur is much easier and more cost-effective than trying to return to normal life after a disaster. The four phases of disaster management worldwide are described as pre-disaster preparedness and mitigation, and post-disaster response and rehabilitation. The pre-disaster phase carries half the weight of disaster management: the better prepared we are for a disaster, and the more importance we give to damage reduction, the less harm we suffer in both material and spiritual senses. To do this systematically, we use Geographic Information Systems (GIS). GIS can be useful for executing emergency services on time and for making the most appropriate decisions in developing an emergency control mechanism. The products obtained by GIS analysis of the city's seismic data can supply city managers with the information they need, so that healthier and more appropriate policy decisions can be made. In this study, using ArcGIS software and drawing on reports of the earthquake that occurred in Konya, databases of spatial and non-spatial data were created; with the help of these databases, GIS-aided analyses were performed, aiming at potential disaster management for urban earthquakes in the city of Konya.

Keywords: geographic information systems (GIS), disaster management, emergency control mechanism, Konya

Procedia PDF Downloads 457
20716 Analysis of Universal Mobile Telecommunications Service (UMTS) Planning Using High Altitude Platform Station (HAPS)

Authors: Yosika Dian Komala, Uke Kurniawan Usman, Yuyun Siti Rohmah

Abstract:

The enabling technology that meets the need for high-speed data service is the Universal Mobile Telecommunications Service (UMTS), with data rates up to 2 Mbps. A terrestrial UMTS system has a coverage area of about 1-2 km, whereas a High Altitude Platform Station (HAPS) can host a macro cell able to serve a much wider area. The design method for UMTS using HAPS is planning based on coverage and capacity. The planning is simulated with Atoll 2.8.1 software. The cell radius for coverage is determined using the free space loss propagation model, while capacity planning determines the average cell throughput from the Offered Bit Quantity (OBQ).
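The coverage step above can be sketched by inverting the free-space loss model for the cell radius. The 130 dB allowed path loss and 2100 MHz carrier below are illustrative assumptions, not values from the Atoll simulation; free space is optimistic for terrestrial cells but is a reasonable first cut for a HAPS with a largely line-of-sight path.

```python
import math

# Hedged sketch: coverage-based radius estimate from the free-space loss
# model the abstract mentions. Link-budget numbers are assumptions.

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB at distance d_km and frequency f_mhz."""
    return 32.44 + 20 * math.log10(d_km) + 20 * math.log10(f_mhz)

def cell_radius_km(max_path_loss_db, f_mhz):
    """Invert the free-space model: distance at which the loss budget is spent."""
    return 10 ** ((max_path_loss_db - 32.44 - 20 * math.log10(f_mhz)) / 20)

# Assumed 130 dB allowed loss on a UMTS downlink near 2100 MHz
r = cell_radius_km(130.0, 2100.0)
print(f"Coverage radius for a 130 dB budget at 2100 MHz: {r:.1f} km")
```

Capacity planning then checks whether the OBQ offered within this radius stays below the cell's available throughput.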

Keywords: UMTS, HAPS, coverage planning, capacity planning, signal level, Ec/Io, overlapping zone, throughput

Procedia PDF Downloads 621
20715 Impact of Proposed Modal Shift from Private Users to Bus Rapid Transit System: An Indian City Case Study

Authors: Rakesh Kumar, Fatima Electricwala

Abstract:

One of the major thrusts of a Bus Rapid Transit System is to reduce commuters' dependency on private vehicles and increase the share of public transport, making the urban transportation system environmentally sustainable. In this study, a commuter mode choice analysis is performed that examines behavioral responses to the proposed Bus Rapid Transit System (BRTS) in Surat, estimating the probable shift from private to public modes. The BRTS scenarios were then evaluated using Surat's transportation ecological footprint. A multi-modal simulation model was developed in the Biogeme environment to explicitly consider private users' behaviors and non-linear environmental impact. Data on the factors (variables) that might cause a modal shift of private-mode users to the proposed BRTS, and their impacts, were collected through a home-interview survey using revealed- and stated-preference approaches. A multi-modal logit model of mode choice was then calibrated using the collected data and validated on the proposed sample. From this study, a set of perception factors, with a reliable and predictable database, has been identified to explain the variation in modal shift behaviour and its impact on Surat's ecological environment. A case study of the proposed BRTS connecting the Surat Industrial Hub to the coastal area illustrates the approach.
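The core of a logit mode-choice model like the one calibrated above is a softmax over mode utilities. The utility values below are illustrative assumptions, not the paper's estimated coefficients; in practice each utility is a linear function of travel time, fare, and perception factors.

```python
import math

# Hedged sketch: multinomial logit choice probabilities, the model family
# the abstract calibrates in Biogeme. Utilities below are assumptions.

def mode_probabilities(utilities):
    """Softmax over mode utilities -> choice probabilities summing to 1."""
    m = max(utilities.values())                      # subtract max for stability
    exp_u = {mode: math.exp(u - m) for mode, u in utilities.items()}
    total = sum(exp_u.values())
    return {mode: e / total for mode, e in exp_u.items()}

# Assumed utilities (negative generalized cost); higher = more attractive
probs = mode_probabilities({"car": -1.2, "two_wheeler": -1.0, "brts": -0.8})
print(probs)
```

Summing the predicted BRTS probabilities over surveyed private-mode users gives the probable modal shift, which can then feed the ecological footprint evaluation.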

Keywords: BRTS, private modes, mode choice models, ecological footprint

Procedia PDF Downloads 505
20714 The Influence of Human Factors Education on the Irish Registered Pre-Hospital Practitioner within the National Ambulance Service

Authors: Desmond Wade, Alfredo Ormazabal

Abstract:

Background: Ever since the Irish Government commenced the registration of pre-hospital practitioners in 2000 through the Statutory Instrument process (SI 109 of 2000), the approach to educating these professionals has changed drastically. The progression from the traditional behaviourist to the current constructivist approach has been based on experience from other sectors and industries, nationally and internationally. Today, the delivery of a safe and efficient ambulance service depends heavily on its practitioners' range of technical skills, academic knowledge, and overall competences. As these increase, so does the complexity of paramedics' everyday practice. This has made it inevitable to consider the 'Human Factor' as a source of potential risk and has led formative institutions like the National Ambulance Service College to include it in their curriculum. Methods: This paper used a mixed-method approach in which both an online questionnaire and a set of semi-structured interviews were the sources of primary data, analyzed qualitatively and quantitatively. Conclusions: The evidence presented leads to the conclusion that in the National Ambulance Service there is a considerable lack of education in Human Factors, and levels of understanding of how to manage Human Factors in practice vary across its spectrum. Paramedic practitioners in Ireland seem to understand that the responsibility for patient care lies with the team, rather than with the most hierarchically senior practitioner present at the scene.

Keywords: human factors, ergonomics, stress, decision making, pre-hospital care, paramedic, education

Procedia PDF Downloads 141
20713 Bridging the Data Gap for Sexism Detection in Twitter: A Semi-Supervised Approach

Authors: Adeep Hande, Shubham Agarwal

Abstract:

This paper presents a study on identifying sexism in online texts using various state-of-the-art deep learning models based on BERT. We experimented with different feature sets and model architectures and evaluated their performance using precision, recall, F1 score, and accuracy metrics. We also explored the use of the pseudolabeling technique to improve model performance. Our experiments show that the best-performing models were based on BERT, with the multilingual model achieving an F1 score of 0.83. Furthermore, pseudolabeling significantly improved the performance of the BERT-based models, which achieved their best results when it was applied. Our findings suggest that BERT-based models with pseudolabeling hold great promise for identifying sexism in online texts with high accuracy.
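The pseudolabeling loop the abstract applies to BERT can be sketched independently of the underlying model: train on labeled data, predict on unlabeled data, keep only confident predictions as new labels, and retrain. Below, a nearest-centroid classifier on toy "embeddings" stands in for BERT so the example is self-contained; the data, confidence margin, and round count are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch of the pseudolabeling (self-training) loop described in
# the abstract, with a nearest-centroid stand-in for the BERT classifier.

rng = np.random.default_rng(0)

# Toy embeddings: two well-separated classes, mostly unlabeled
X_lab = np.vstack([rng.normal(-1, 0.5, (10, 5)), rng.normal(1, 0.5, (10, 5))])
y_lab = np.array([0] * 10 + [1] * 10)
X_unl = np.vstack([rng.normal(-1, 0.5, (100, 5)), rng.normal(1, 0.5, (100, 5))])

def fit_centroids(X, y):
    """One centroid per class: the 'model' in this toy setup."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict_with_margin(X, centroids):
    """Predicted class and confidence margin (distance gap between centroids)."""
    d = np.stack([np.linalg.norm(X - c, axis=1) for c in centroids])
    return d.argmin(axis=0), np.abs(d[0] - d[1])

X_train, y_train = X_lab, y_lab
for _ in range(3):                           # a few self-training rounds
    centroids = fit_centroids(X_train, y_train)
    pred, margin = predict_with_margin(X_unl, centroids)
    keep = margin > 1.0                      # keep confident pseudolabels only
    X_train = np.vstack([X_lab, X_unl[keep]])
    y_train = np.concatenate([y_lab, pred[keep]])

print("labeled + pseudolabeled examples:", len(y_train))
```

With BERT, the same loop applies: replace the centroid model with the fine-tuned classifier and the margin with its softmax confidence.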

Keywords: large language models, semi-supervised learning, sexism detection, data sparsity

Procedia PDF Downloads 56
20712 The Effect of Exercise on Quality of Life in Pregnancy

Authors: Hacer Unver, Rukuye Aylaz

Abstract:

Aim: This study was conducted to determine the effects of exercise on quality of life in pregnancy. Material and Method: The population of the study comprised 580 pregnant women registered at the 10 Family Health Centers located in the city center of Malatya. The sample comprised 230 pregnant women, the minimal sample size according to the known-population sample size calculation. The data of this descriptive study were collected between October 2013 and September 2014 from the Family Health Centers in the city center of Malatya, using a pregnancy introductory form, an exercise benefits and barriers scale, and a quality of life scale. Percentage distributions, the t-test, analysis of variance (ANOVA), Kruskal-Wallis, Mann-Whitney U, and Pearson correlation tests were used in the analysis of the data. Result: It was determined that 69.1% of the participating women did not know the benefits of exercise and 89.6% did not exercise. Mental health quality of life scores of those who exercised were higher, and the difference was statistically significant (p<0.05). A positive correlation was found between the exercise benefits scale and the physical quality of life scores (r=0.268, p=0.001). More exercise was also associated with higher total quality of life scores. Conclusion: Exercise was found to positively affect quality of life in pregnancy; therefore, it is recommended that nurses provide education on the importance and benefits of exercise during pregnancy in order to increase quality of life.

Keywords: exercise, midwife, pregnant woman, quality of life

Procedia PDF Downloads 279
20711 Variation in Youth and Family Experiences of System of Care Principles in Community Mental Health

Authors: James D. Beauchemin

Abstract:

This study tested whether youth mental health care quality, operationalized as the extent to which youth and families experienced system-of-care principles in service interactions with providers, varied by level of youth need after adjusting for sociodemographic and treatment factors. The relationship of quality to clinical outcomes was also examined. Using administrative data and cross-sectional surveys from a stratified random sample of 1,124 caregivers of youths ages 5 to 20 within a statewide system-of-care, adjusted analyses indicated youths with the most intensive needs were significantly less likely to experience high-quality care (51% vs. 63%, p=0.016), with marked deficits on 6 of 9 items. Receipt of lower-quality care predicted less improvement in youth functioning. Despite considerable effort to develop systems-of-care for youths with the most severe mental health needs, these data suggest quality disparities remain for the most impaired youths. Policy and intervention development may be needed to improve the quality of care for this population.

Keywords: system-of-care, adherence, mental health, youth

Procedia PDF Downloads 143
20710 Comprehensive Evaluation of Thermal Environment and Its Countermeasures: A Case Study of Beijing

Authors: Yike Lamu, Jieyu Tang, Jialin Wu, Jianyun Huang

Abstract:

With the development of the economy, science, and technology, the urban heat island effect is becoming more and more serious. Taking Beijing as an example, this paper grades the value of each index influencing heat island intensity and establishes a mathematical model: a neural network system based on a fuzzy comprehensive evaluation index of the heat island effect. After data preprocessing, the algorithm generates the weight of each factor affecting the heat island effect; data on the six indexes affecting heat island intensity for Shenyang, Shanghai, Beijing, and Hangzhou are input, and the result is output automatically by the neural network system. Displaying the intensity of the heat island effect by a visual method is of practical significance: it is simple, intuitive, and can be monitored dynamically.

Keywords: heat island effect, neural network, comprehensive evaluation, visualization

Procedia PDF Downloads 120
20709 Sea-Level Rise and Shoreline Retreat in Tainan Coast

Authors: Wen-Juinn Chen, Yi-Phei Chou, Jou-Han Wang

Abstract:

The Tainan coast is suffering from beach erosion, wave overtopping, and lowland flooding; although most of the shoreline has been protected by seawalls, it is still threatened by sea level rise. To develop coastal resources, utilize coastal land, and draft an appropriate mitigation strategy, we must first assess the impact of beach erosion under different climate change scenarios. Here, we used meteorological data from 1898 to 2012 to show that the Tainan area has indeed experienced the impact of climate change: the temperature has risen by about 1.7 degrees since 1989. We also analyzed tidal data near the Tainan coast (Anpin and Junjunn sites), which show sea level rising at a rate of about 4.1-4.8 mm/year. This phenomenon will have serious impacts on the Tainan coastal area, especially by worsening coastal erosion. We therefore used the Bruun rule to calculate the shoreline retreat rate for every two-decade period from 2012 onward. Wave data and the bottom sand diameter D50 were used to calculate the closure depth needed in the Bruun formula, and the active length of the profile was computed from the beach slope and Dean's equilibrium concept. The analysis shows that by 2020 the shoreline will have retreated by about 3 to 12 meters, with the maximum retreat at the Chigu coast. By 2060, the average shoreline retreat will be 22 m, but at Chigu and Tsenwen the shoreline may retreat by about 70 m, reaching about 130 m by 2100. This will cause substantial loss of coastal land to the sea, so protection and mitigation projects must be carried out promptly.
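The Bruun rule calculation described above can be sketched directly. The sea-level-rise rate comes from the abstract; the closure depth, berm height, and active profile length below are illustrative assumptions, not the values derived from the wave data and D50.

```python
# Hedged sketch: the Bruun rule the abstract applies, with illustrative
# Tainan-like inputs. Only the 4.1-4.8 mm/yr rise rate is from the text;
# profile length, closure depth, and berm height are assumptions.

def bruun_retreat(sea_level_rise, profile_length, closure_depth, berm_height):
    """Shoreline retreat R = S * L / (h* + B).

    sea_level_rise : S, meters of rise over the planning horizon
    profile_length : L, active cross-shore profile length, m
    closure_depth  : h*, m (from wave data and D50, per the abstract)
    berm_height    : B, m
    """
    return sea_level_rise * profile_length / (closure_depth + berm_height)

# Mid-range 4.5 mm/yr applied over 2012-2060
rise = 0.0045 * (2060 - 2012)
retreat = bruun_retreat(rise, 800.0, 7.0, 2.0)
print(f"Retreat by 2060: {retreat:.0f} m")  # comparable to the ~22 m average above
```

The rule says retreat scales linearly with sea-level rise, so the 2100 estimates grow in proportion to the longer horizon.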

Keywords: sea level rise, shoreline, coastal erosion, climate change

Procedia PDF Downloads 391