Search results for: internet data centre
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26253

24723 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies: it is the process of grouping data into clusters so that points within a cluster are as similar as possible while the clusters themselves are as dissimilar as possible. Many traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of soft clustering in which each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated datasets using the FKM clustering algorithm. A notable contribution of the study is that, in contrast to numerous anomaly detection algorithms, the FKM algorithm determines anomalies together with their degree of abnormality. According to the results, the FKM clustering algorithm showed good performance in detecting anomalies in data containing both a single anomaly and multiple anomalies.
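
A minimal sketch of the approach described above, assuming a simple-matching dissimilarity, a toy dataset, and an abnormality score of 1 minus the maximum membership (the fuzzifier m, the data, and the scoring rule are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

def matching_distance(x, mode):
    """Simple-matching dissimilarity: number of mismatched attributes."""
    return np.sum(x != mode)

def fuzzy_k_modes(X, k, m=1.5, n_iter=30, seed=0):
    """Fuzzy k-modes clustering for categorical data.
    Returns the cluster modes and the fuzzy membership matrix U (n x k)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    modes = X[rng.choice(n, size=k, replace=False)].copy()
    U = np.zeros((n, k))
    for _ in range(n_iter):
        # Membership update: u_ij = 1 / sum_l (d_ij / d_il)^(1/(m-1))
        for i, x in enumerate(X):
            d = np.array([matching_distance(x, mo) for mo in modes], float)
            if np.any(d == 0):                    # point coincides with a mode
                U[i] = (d == 0) / np.sum(d == 0)
            else:
                U[i] = 1 / np.sum((d[:, None] / d[None, :]) ** (1 / (m - 1)), axis=1)
        # Mode update: per attribute, pick the category with the largest
        # membership-weighted frequency.
        for j in range(k):
            w = U[:, j] ** m
            for a in range(X.shape[1]):
                cats, inv = np.unique(X[:, a], return_inverse=True)
                modes[j, a] = cats[np.argmax(np.bincount(inv, weights=w))]
    return modes, U

# Toy categorical dataset; the last record is an injected anomaly.
X = np.array([["red", "small", "round"], ["red", "small", "round"],
              ["red", "small", "oval"], ["blue", "large", "square"],
              ["blue", "large", "square"], ["blue", "large", "round"],
              ["green", "tiny", "star"]])
modes, U = fuzzy_k_modes(X, k=2)
# A point that belongs firmly to no cluster is anomalous; its abnormality
# degree can be read off the memberships as 1 - max_j u_ij.
for row, score in zip(X, 1 - U.max(axis=1)):
    print(row, f"abnormality={score:.2f}")
```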

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 48
24722 The Consequences of Regime Change in Iraq: Formation and Continuation of Geopolitical Crises

Authors: Ali Asghar Sotoudeh

Abstract:

Since the US invasion of Iraq in 2003 and the subsequent regime change, internal conflicts between political and ethnic-religious groups have become a hallmark of Iraqi political dynamism. The most important manifestations of these conflicts are the conflicts between the Kurds and the central government, as well as fundamentalism since 2003. As a result, it seems that not only has the US presence in Iraq, under the pretext of fighting terrorism and expanding democracy, failed to have a positive effect on controlling fundamentalism and ensuring political stability in Iraq, but it has also paved the way for the formation and continuation of geopolitical crises in the form of disputes over territory and sources of power. In this regard, given the importance of the study, the main purpose of this study is to examine how the US regime-change policy has affected the formation and continuation of geopolitical crises in Iraq. The central question of this study is: what effect has the US regime-change policy had on Iraq's domestic political processes? Findings show that regime change and the subsequently imposed federalism have widened the gaps in Iraq's sectarian-ethnic system. As a result, the geopolitical crisis, in the context of the dispute over geographical territory and sources of power between ethnic-religious groups, has become the most important political dynamic in Iraq since the occupation. The research method in this article is descriptive-analytical, and the data were collected from library and internet resources.

Keywords: Iraq, United States, geopolitical crisis, ethno-religious conflict, political federalism

Procedia PDF Downloads 142
24721 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud, that is, computing without decrypting the encrypted data. This meets the aspiration of computational encryption models that could enhance the security of big data with respect to privacy or confidentiality, availability, and integrity of the data, as well as user security. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, with detailed theoretical mathematical concepts for the fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographic security algorithm.
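
The abstract does not give the construction itself. As a minimal illustration of the core idea, computing on ciphertexts without decrypting them, here is a toy Paillier scheme, which is additively (not fully) homomorphic; the small primes are for demonstration only and are nowhere near secure:

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic, NOT fully homomorphic):
# it illustrates computing on ciphertexts without ever decrypting them.

def lcm(a, b):
    return a * b // math.gcd(a, b)

def keygen(p=2003, q=2011):               # tiny primes, illustration only
    n = p * q
    n2 = n * n
    g = n + 1                              # standard simple generator choice
    lam = lcm(p - 1, q - 1)
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    x = pow(g, lam, n2)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 37), encrypt(pub, 105)
c_sum = (c1 * c2) % (pub[0] ** 2)          # multiplying ciphertexts ...
print(decrypt(priv, c_sum))                # ... adds plaintexts: prints 142
```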

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 477
24720 The Effects of Scientific Studies on Future Fashion Trends

Authors: Basak Ozkendirci

Abstract:

The discovery of chemical dyes, the development of regenerated fibers, and warp knitting technology have had enormous effects on the fashion world. The trends created by the information obtained from various scientific studies today shape the fashion world. Trend analysts must follow scientific developments as well as sociological events, political developments, and artwork to obtain reliable data on trends. Digital printing technologies have changed the dynamics of textile printing production and also the style of printed designs. Fashion designers have already started designing 3D-printed accessories and garments. Research fields such as the Internet of Things, artificial intelligence, hologram technologies, mechatronics, energy storage systems, and nanotechnology are seen as the technologies that will change the social life and economy of the future. It is clear that research carried out in these areas will affect the textiles of the future and, in turn, fashion trends. The article aims to create a future vision for trend researchers and designers by giving clues about the changes to be experienced in the fashion world. In the first part of the article, information is given about the scientific studies that are thought to shape the future, together with forecasts of how the resulting inventions may be adapted to textiles. In the second part, examples are given of how the new generation of innovative textiles will affect the daily life of the user.

Keywords: biotextiles, fashion trends, nanotextiles, new materials, smart textiles, techno textiles

Procedia PDF Downloads 333
24719 Early Formation of Adipocere in Subtropical Climate

Authors: Asit K. Sikary, O. P. Murty

Abstract:

Adipocere formation is a modification of the process of putrefaction. It consists mainly of saturated fatty acids, formed by the post-mortem hydrolysis and hydrogenation of body fats with the help of bacterial enzymes in the presence of warmth, moisture, and anaerobic bacteria. In temperate climates it takes weeks to develop, while in India it begins within 4-5 days. In this study, we collected cases with adipocere formation from the South Delhi region (average room temperature 27-39°C) that were autopsied at our centre. Details of the circumstances of the death, cause and time of death, surrounding environment, and demographic profile of the deceased were taken into account. In total, 16 cases were included in this study. Adipocere formation was predominantly present over the cheeks, shoulders, breasts, flanks, buttocks, and thighs. Of the 16 cases, 11 were found in a dry environment and 5 were recovered from water. In 5 cases, adipocere formation was seen in less than 2 days, and among them, in 1 case, as early as one day. This study showed that adipocere formation can occur as early as 1 day in a hot and humid environment.

Keywords: adipocere, drowning, hanging, humid environment, strangulation, subtropical climate

Procedia PDF Downloads 416
24718 Digital Twins in the Built Environment: A Systematic Literature Review

Authors: Bagireanu Astrid, Bros-Williamson Julio, Duncheva Mila, Currie John

Abstract:

Digital Twins (DT) are an innovative concept of cyber-physical integration of data between an asset and its virtual replica. They originated in established industries such as manufacturing and aviation and have garnered increasing attention as a potentially transformative technology within the built environment. With the potential to support decision-making, real-time simulations, forecasting abilities, and managing operations, DT do not fall under a singular scope, which risks making the definition and leveraging of their potential uses a missed opportunity. Despite their recognised potential in established industries, literature on DT in the built environment remains limited. Inadequate attention has been given to the implementation of DT in construction projects, as opposed to operational-stage applications. Additionally, the absence of a standardised definition has resulted in inconsistent interpretations of DT in both industry and academia. There is a need to consolidate research to foster a unified understanding of DT; such consolidation is indispensable to ensure that future research is undertaken on a solid foundation. This paper aims to present a comprehensive systematic literature review on the role of DT in the built environment. To accomplish this objective, a review and thematic analysis was conducted, encompassing relevant papers from the last five years. The identified papers were categorised based on their specific areas of focus, and their content was translated into a thorough classification of DT. In characterising DT and the associated data processes identified, this systematic literature review identifies six DT opportunities specifically relevant to the built environment: facilitating collaborative procurement methods; supporting net-zero and decarbonisation goals; supporting Modern Methods of Construction (MMC) and off-site manufacturing (OSM); providing increased transparency and stakeholder collaboration; supporting complex decision-making (real-time simulations and forecasting abilities); and seamless integration with the Internet of Things (IoT), data analytics, and other DT. Finally, a discussion of each area of research is provided. A table of definitions of DT across the reviewed literature is provided, seeking to delineate the current state of DT implementation in the built environment context. Gaps in knowledge are identified, as well as research challenges and opportunities for further advancements in the implementation of DT within the built environment. This paper critically assesses the existing literature to identify the potential of DT applications, aiming to harness the transformative capabilities of data in the built environment. By fostering a unified comprehension of DT, this paper contributes to advancing the effective adoption and utilisation of this technology, accelerating progress towards the realisation of smart cities, decarbonisation, and other envisioned roles for DT in the construction domain.

Keywords: built environment, design, digital twins, literature review

Procedia PDF Downloads 74
24717 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
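
A minimal sketch of how a pixel-value approach of this kind can be calibrated and scored with RMSE; the colour-scale mapping and all numbers below are illustrative assumptions, not the paper's actual data:

```python
import numpy as np

# Suppose each station pixel's greyscale intensity (0-255) is read off the
# rainfall map and a linear calibration maps intensity to rainfall depth (mm).
pixel_values = np.array([30, 75, 120, 180, 220, 140, 60], dtype=float)
observed_mm  = np.array([4.0, 10.2, 17.1, 26.0, 31.5, 19.8, 8.3])

# Fit the calibration by least squares: rainfall ~ a * pixel + b
a, b = np.polyfit(pixel_values, observed_mm, deg=1)
estimated_mm = a * pixel_values + b

rmse = np.sqrt(np.mean((estimated_mm - observed_mm) ** 2))
print(f"RMSE = {rmse:.3f} mm")
```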

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 383
24716 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks using Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
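
A minimal sketch of one ingredient named above, cryptographic hashing for data lineage: each record commits to the hash of its predecessor, so tampering anywhere breaks verification. The record fields are illustrative assumptions:

```python
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding of the record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_record(chain: list, data: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    record = {"data": data, "prev_hash": prev, "ts": time.time()}
    record["hash"] = record_hash({"data": data, "prev_hash": prev})
    chain.append(record)

def verify_chain(chain: list) -> bool:
    for i, rec in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if rec["prev_hash"] != prev:
            return False
        if rec["hash"] != record_hash({"data": rec["data"], "prev_hash": prev}):
            return False
    return True

chain = []
append_record(chain, {"patient": "anon-17", "event": "lab_result", "value": 5.4})
append_record(chain, {"patient": "anon-17", "event": "lab_result", "value": 5.9})
print(verify_chain(chain))         # True
chain[0]["data"]["value"] = 9.9    # tamper with an early record ...
print(verify_chain(chain))         # ... and verification fails: False
```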

Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 12
24715 Evolution of Antimicrobial Resistance in Shigella since the Turn of the 21st Century, India

Authors: Neelam Taneja, Abhishek Mewara, Ajay Kumar

Abstract:

Multidrug-resistant shigellae have emerged as a therapeutic challenge in India. At our 2000-bed tertiary care referral centre in Chandigarh, North India, which caters to a large population across 7 neighboring states, antibiotic resistance in Shigella is constantly monitored. Shigellae are isolated from 3 to 5% of all stool samples. In 1990, nalidixic acid was the drug of choice, as 82% and 63% of shigellae were resistant to ampicillin and cotrimoxazole, respectively. Nalidixic acid resistance emerged in 1992 and rapidly increased from 6% during 1994-98 to 86% by the turn of the 21st century. In the 1990s, the WHO recommended ciprofloxacin as the drug of choice for empiric treatment of shigellosis in view of the existing high-level resistance to agents like chloramphenicol, ampicillin, cotrimoxazole, and nalidixic acid. The first resistance to ciprofloxacin in S. flexneri at our centre appeared in 2000 and rapidly rose to 46% in 2007 (MIC > 4 mg/L). In between, there was an outbreak of ciprofloxacin-resistant S. dysenteriae serotype 1 in 2003. Therapeutic failures with ciprofloxacin occurred with both ciprofloxacin-resistant S. dysenteriae and ciprofloxacin-resistant S. flexneri, and the severity of illness was greater with ciprofloxacin-resistant strains. Until 2000, elsewhere in the world, ciprofloxacin resistance in S. flexneri was sporadic and uncommon, though resistance to co-trimoxazole and ampicillin was common and in some areas resistance to nalidixic acid had also emerged. Owing to extensive use and misuse for many other illnesses in our region, fluoroquinolones are thus no longer the preferred group of drugs for managing shigellosis in India. The WHO presently recommends ceftriaxone and azithromycin as alternative drugs for fluoroquinolone-resistant shigellae; however, overreliance on this group of drugs may also soon become questionable considering the emerging cephalosporin-resistant shigellae. We found 15.1% of S. flexneri isolates collected over a period of 9 years (2000-2009) resistant to at least one of the third-generation cephalosporins (ceftriaxone/cefotaxime). The first isolate showing ceftriaxone resistance was obtained in 2001, and we have observed an increase in the number of isolates resistant to third-generation cephalosporins in S. flexneri from 2005 onwards. This situation has now become a therapeutic challenge in our region. The MIC values for Shigella isolates revealed a worrisome rise for ceftriaxone (MIC90: 12 mg/L) and cefepime (MIC90: 8 mg/L). MIC values for S. dysenteriae remained below 1 mg/L for ceftriaxone; however, for cefepime, the MIC90 has risen to 4 mg/L. Infections caused by ceftriaxone-resistant S. flexneri isolates were successfully treated with azithromycin at our center. The most worrisome recent development has been the emergence of decreased susceptibility to azithromycin (DSA), which surfaced in 2001 and has increased from 4.3% until 2011 to 34% thereafter. We suspect plasmid-mediated resistance, as we detected qnrS1-positive Shigella for the first time from the Indian subcontinent in 2 strains from 2010, indicating a relatively new appearance of this PMQR determinant among Shigella in India. This calls for continuous and strong surveillance of antibiotic resistance across the country. The prevention of shigellosis by developing cost-effective vaccines is desirable, as it will substantially reduce the morbidity associated with diarrhoea in the country.

Keywords: Shigella, antimicrobial, resistance, India

Procedia PDF Downloads 228
24714 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, parametric modeling is considered and applied to experimental data from a dynamical system. We investigate different distributions of output measurements from several dynamical systems. In addition, by variance processing of the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, the effect of the spread of the measurements, such as their variance, on identification, and the limitations of this approach, are explained.
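
A minimal sketch of variance processing to locate a region of nonlinearity, under our own assumptions about the procedure (fit a linear parametric model, then flag sliding windows whose residual variance far exceeds the typical level), not the paper's exact method:

```python
import numpy as np

rng = np.random.default_rng(1)
u = np.linspace(-3, 3, 600)                        # input sweep
y = 1.8 * u + 0.9 * np.clip(u - 1.5, 0, None)**2   # linear + local nonlinearity
y += rng.normal(0, 0.15, u.size)                   # measurement noise

# Linear (parametric) fit and its residuals
k, b = np.polyfit(u, y, 1)
resid = y - (k * u + b)

# Sliding-window variance of the residuals
w = 40
var = np.array([resid[i:i + w].var() for i in range(u.size - w)])
threshold = 3 * np.median(var)
flagged = np.where(var > threshold)[0]

if flagged.size:
    lo, hi = u[flagged[0]], u[flagged[-1] + w - 1]
    print(f"candidate nonlinear region: u in [{lo:.2f}, {hi:.2f}]")
```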

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 508
24713 A Practical Methodology for Evaluating Water, Sanitation and Hygiene Education and Training Programs

Authors: Brittany E. Coff, Tommy K. K. Ngai, Laura A. S. MacDonald

Abstract:

Many organizations in the Water, Sanitation and Hygiene (WASH) sector provide education and training in order to increase the effectiveness of their WASH interventions. A key challenge for these organizations is measuring how well their education and training activities contribute to WASH improvements. It is crucial for implementers to understand the returns on their education and training activities so that they can improve and make better progress toward the desired outcomes. This paper presents the development and piloting of an evaluation methodology by the Centre for Affordable Water and Sanitation Technology (CAWST), so that organizations can understand the effectiveness of their WASH activities and improve accordingly. CAWST developed this methodology through a series of research partnerships, followed by staged field pilots in Nepal, Peru, Ethiopia and Haiti. During the research partnerships, CAWST collaborated with universities in the UK and Canada to review a range of available evaluation frameworks, investigate existing practices for evaluating education activities, and develop a draft methodology for evaluating education programs. The draft methodology was then piloted in three separate studies to evaluate CAWST's, and its partners', WASH education programs. Each of the pilot studies evaluated education programs in different locations, with different objectives, and at different times within the project cycles. The evaluations in Nepal and Peru were conducted in 2013 and investigated the outcomes and impacts of CAWST's WASH education services in those countries over the previous 5-10 years. In 2014, the methodology was applied to complete a rigorous evaluation of a 3-day WASH Awareness training program in Ethiopia, one year after the training had occurred. In 2015, the methodology was applied in Haiti to complete a rapid assessment of a Community Health Promotion program, which informed the development of an improved training program. After each pilot evaluation, the methodology was reviewed and improved. A key concept within the methodology is that in order for training activities to lead to improved WASH practices at the community level, it is not enough for participants to acquire new knowledge and skills; they must also apply the new skills and influence the behavior of others following the training. The steps of the methodology are: development of a Theory of Change for the education program, application of the Kirkpatrick model to develop indicators, development of data collection tools, data collection, data analysis and interpretation, and use of the findings for improvement. The methodology was applied in different ways for each pilot and was found to be practical to apply and adapt to the needs of each case. It was useful in gathering specific information on the outcomes of the education and training activities, and in developing recommendations for program improvement. Based on the results of the pilot studies, CAWST is developing a set of support materials to enable other WASH implementers to apply the methodology. By using this methodology, more WASH organizations will be able to understand the outcomes and impacts of their training activities, leading to higher-quality education programs and improved WASH outcomes.

Keywords: education and training, capacity building, evaluation, water and sanitation

Procedia PDF Downloads 303
24712 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premise or in the cloud, users can leverage the power of R at scale without having to move their data around.

Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R

Procedia PDF Downloads 370
24711 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data are the oil of our time; without them, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand shows different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors in relation to trust, problematic aspects of the current approach are verified through interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and point to potential research areas.

Keywords: trust, data mining, CRISP-DM, stakeholder management

Procedia PDF Downloads 90
24710 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination, with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes at unknown locations. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selection of the data-transmitting region, segmentation of the selected region, determination of a probability ratio for each node (capture, non-capture and eavesdropper nodes) in every segment, and evaluation of the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of big data resumes; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted in two-hop transmission.
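
A minimal sketch of the per-segment security decision described above; how the probability ratio and the binary evaluation are computed here is our assumption, not the paper's exact formulation:

```python
import random
from dataclasses import dataclass

@dataclass
class Node:
    x: float
    y: float
    kind: str  # "capture", "non-capture", or "eavesdropper"

def segment_of(node: Node, n_segments: int, width: float) -> int:
    return min(int(node.x / (width / n_segments)), n_segments - 1)

def secure_to_transmit(nodes, n_segments=4, width=100.0, threshold=0.2):
    """Binary evaluation: every segment must keep its eavesdropper ratio low."""
    for s in range(n_segments):
        seg = [n for n in nodes if segment_of(n, n_segments, width) == s]
        if not seg:
            continue
        eaves = sum(1 for n in seg if n.kind == "eavesdropper")
        if eaves / len(seg) > threshold:       # probability ratio per segment
            return False                        # trigger cooperative jamming
    return True

random.seed(7)
nodes = [Node(random.uniform(0, 100), random.uniform(0, 100),
              random.choices(["capture", "non-capture", "eavesdropper"],
                             weights=[5, 4, 1])[0])
         for _ in range(60)]

if secure_to_transmit(nodes):
    print("secure: proceed with two-hop transmission")
else:
    print("insecure: apply cooperative jamming, then retransmit")
```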

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 483
24709 Study on Quality of Life among Patients Undergoing Hemodialysis in National Kidney Centre, Banasthali, Kathmandu

Authors: Tara Gurung, Suprina Prajapati

Abstract:

The health and well-being of people are crucial for accomplishing the sustainable development goals of any country. The present study focuses on the quality of life of patients undergoing hemodialysis. Hemodialysis is a life-sustaining treatment for patients with end-stage renal disease (ESRD), but it can bring about significant impairment in health-related quality of life (HRQOL). The purpose of this study was to assess the quality of life of patients undergoing hemodialysis. A descriptive cross-sectional research design was utilized with a total of 100 samples selected using a random sampling technique. The findings revealed that the total quality-of-life score of the patients was 30.41±3.99 out of 100. The total physical component score was significantly associated with the education status of the patients (t-test, p = 0.03) and with their occupation (ANOVA, p = 0.007). The study recommends that awareness programs regarding chronic kidney disease and lifestyle modification be given to hemodialysis patients, as this would help them maintain their HRQOL.
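
A minimal sketch of the two tests reported above (independent-samples t-test and one-way ANOVA) on synthetic scores; the groups and numbers are fabricated for illustration, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# t-test: physical component score by education status (two groups)
no_formal = rng.normal(28, 4, 40)
formal    = rng.normal(31, 4, 60)
t, p_t = stats.ttest_ind(no_formal, formal)
print(f"education: t = {t:.2f}, p = {p_t:.3f}")

# one-way ANOVA: physical component score by occupation (three groups)
farmer, service, business = (rng.normal(m, 4, 30) for m in (28, 31, 33))
f, p_f = stats.f_oneway(farmer, service, business)
print(f"occupation: F = {f:.2f}, p = {p_f:.3f}")
```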

Keywords: health and well-being, hemodialysis, patients' quality of life

Procedia PDF Downloads 138
24708 Identification of microRNAs in Early and Late Onset of Parkinson’s Disease Patient

Authors: Ahmad Rasyadan Arshad, A. Rahman A. Jamal, N. Mohamed Ibrahim, Nor Azian Abdul Murad

Abstract:

Introduction: Parkinson’s disease (PD) is a complex disease with an asymptomatic early phase, in which patients are usually diagnosed at a late stage, when about 70% of the dopaminergic neurons are already lost. Therefore, identification of molecular biomarkers is crucial for early diagnosis of PD. MicroRNA (miRNA) is a short non-coding RNA that regulates gene expression post-transcriptionally. The involvement of miRNAs in neurodegenerative diseases includes maintenance of neuronal development, necrosis, mitochondrial dysfunction and oxidative stress. Thus, miRNAs could be potential biomarkers for the diagnosis of PD. Objective: This study aims to identify the miRNAs involved in Late Onset PD (LOPD) and Early Onset PD (EOPD) compared to controls. Methods: This is a case-control study involving PD patients at the Chancellor Tunku Muhriz Hospital at the UKM Medical Centre. miRNA samples were extracted using the miRNeasy serum/plasma kit from Qiagen. The quality of the extracted miRNA was determined using the Agilent RNA 6000 Nano kit on the Bioanalyzer. miRNA expression profiling was performed using the GeneChip miRNA 4.0 chip from Affymetrix. Microarray was performed in EOPD (n=7), LOPD (n=9) and healthy controls (n=11). Expression Console and Transcriptome Analysis Console were used to analyze the microarray data. Results: miR-129-5p was significantly downregulated in EOPD compared to LOPD, with a -4.2 fold change (p < 0.05). miR-301a-3p was upregulated in EOPD compared to healthy controls (fold = 10.3, p < 0.05). In LOPD versus healthy controls, miR-486-3p (fold = 15.28, p < 0.05), miR-29c-3p (fold = 12.21, p < 0.05) and miR-301a-3p (fold = 10.01, p < 0.05) were upregulated. Conclusion: Several miRNAs have been identified as differentially expressed in EOPD compared to LOPD and in PD versus controls. These miRNAs could serve as potential biomarkers for early diagnosis of PD; however, they need to be validated in a larger sample size.

Keywords: early onset PD, late onset PD, microRNA (miRNA), microarray

Procedia PDF Downloads 256
24707 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format which needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, this keeps the database busy and locked for the period that the processing takes place, which decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU usage, storage, and processing time.
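
A minimal sketch of the three-step pull-process-push technique discussed above, using an in-memory SQLite table of raw GPS payloads (the schema and decode step are illustrative assumptions); the database is occupied only during the short pull and push steps, not during processing:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, raw TEXT, decoded TEXT)")
conn.executemany("INSERT INTO readings (raw) VALUES (?)",
                 [("52.1,21.0",), ("48.8,2.3",), ("51.5,-0.1",)])
conn.commit()

def decode(raw: str) -> str:
    lat, lon = (float(v) for v in raw.split(","))
    return f"lat={lat:.1f};lon={lon:.1f}"

# Step 1 (pull): read everything into an array list (a Python list) quickly.
rows = conn.execute("SELECT id, raw FROM readings").fetchall()

# Step 2 (process): CPU-bound work happens in memory; the database is free.
processed = [(decode(raw), row_id) for row_id, raw in rows]

# Step 3 (push): write results back in one short batch transaction.
conn.executemany("UPDATE readings SET decoded = ? WHERE id = ?", processed)
conn.commit()

print(conn.execute("SELECT decoded FROM readings").fetchall())
```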

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 238
24706 Influence of Information and Communication Technology on Dress Culture among Senior Secondary School Students in Ife East Local Government, Osun State, Nigeria

Authors: Idowu J. Diyaolu, Ebenezer O. Obayomi, Taiwo A. Bamidele

Abstract:

Information and Communication Technology (ICT) has been observed to influence the lifestyle of youths in general. Dressing styles, fashion consciousness and choice of role models are some of the areas of influence. The study was carried out to examine the perception and influence of ICT on the clothing culture of selected Senior Secondary School students in the Ife-East Local Government Area of Osun State, Nigeria. Two hundred Senior Secondary School students from public and private schools were randomly selected. Data were collected using a structured questionnaire. The results showed that 79.0% were computer literate, 64.5% had a Facebook account and 93.5% browsed with phones. Regarding their perception of the influence of ICT, 74.5% of the respondents agreed that frequent use of ICT had increased their level of fashion consciousness, while 60.5% were motivated by the images and dressing patterns in magazines, on TV and on the internet. A similarly large proportion (60.5%) were influenced by the dressing styles of their friends on social media. Male students were significantly more engaged in ICT-related activities than females (t = 1.29, P < 0.05), whereas there was no significant difference in involvement in ICT activities between private and public school students (t = 0.325, P > 0.05). Since ICT influences dressing, appropriate dressing patterns should be promoted in the mass media.

Keywords: dress culture, information and communication technology, fashion trend, role model

Procedia PDF Downloads 456
24705 Hyper-Immunoglobulin E (Hyper-IgE) Syndrome in Skin of Color: A Retrospective Single-Centre Observational Study

Authors: Rohit Kothari, Muneer Mohamed, Vivekanandh K., Sunmeet Sandhu, Preema Sinha, Anuj Bhatnagar

Abstract:

Introduction: Hyper-IgE syndrome is a rare primary immunodeficiency characterised by a triad of severe atopic dermatitis, recurrent pulmonary infections, and recurrent staphylococcal skin infections. The diagnosis requires a high degree of suspicion and typical clinical features, not a mere rise in serum IgE levels, which may be seen in multiple conditions. Genetic studies are not always possible in a resource-poor setting. This study highlights various presentations of Hyper-IgE syndrome in children with skin of color. Case series: Our study included six children with Hyper-IgE syndrome aged two months to ten years. All had onset in the first ten months of life except one with late onset at two years. All had a recurrent eczematoid rash that responded poorly to conventional treatment, secondary infection, multiple episodes of hospitalisation for pulmonary infection, and raised serum IgE levels. One case had occasional vesicles, bullae, and crusted plaques over both extremities. Genetic study was possible in only one of them, who was found to have a pathogenic homozygous deletion of exons 15 to 18 in the DOCK8 gene, following which he underwent bone marrow transplant (BMT) but succumbed to a lower respiratory tract infection two months after BMT; the rest received multiple courses of antibiotics, oral/topical steroids, and cyclosporine intermittently, with variable response. Discussion: Our study highlights the characteristics, presentation, and management of this rare syndrome in children. Knowledge of these manifestations in skin of color will facilitate early identification and contribute to optimal care of patients, as representative data on this is limited in the literature.

Keywords: absolute eosinophil count, atopic dermatitis, eczematous rash, hyper-immunoglobulin E syndrome, pulmonary infection, serum IgE, skin of color

Procedia PDF Downloads 132
24704 Investigating the Impacts of Climate Change on Soil Erosion: A Case Study of Kasilian Watershed, Northern Iran

Authors: Mohammad Zare, Mahbubeh Sheikh

Abstract:

Many of the impacts of climate change will materialize through changes in soil erosion, which have rarely been addressed in Iran. This paper presents an investigation of the impacts of climate change on soil erosion for the Kasilian basin. LARS-WG5 was used to downscale the IPCM4 and GFCM21 predictions of the A2 scenario for the projected periods of 1985-2030 and 2080-2099. This analysis was carried out by means of the dataset of the International Centre for Theoretical Physics (ICTP) in Trieste. Soil loss was modeled using the Revised Universal Soil Loss Equation (RUSLE). Results indicate that soil erosion may increase or decrease, depending on which climate scenario is considered. The potential for climate change to increase soil loss rates in future periods was established, whereas considerable decreases in erosion are projected when land use is increased from the baseline periods.
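
A minimal sketch of the RUSLE computation underlying the soil-loss modeling, A = R × K × LS × C × P; the factor values below are illustrative assumptions, not the Kasilian watershed's calibrated inputs:

```python
def rusle(R, K, LS, C, P):
    """Annual soil loss A (t/ha/yr) from the Revised Universal Soil Loss Equation.

    R  - rainfall erosivity (MJ mm / (ha h yr))
    K  - soil erodibility (t ha h / (ha MJ mm))
    LS - slope length and steepness factor (dimensionless)
    C  - cover management factor (dimensionless)
    P  - support practice factor (dimensionless)
    """
    return R * K * LS * C * P

baseline = rusle(R=320.0, K=0.38, LS=2.1, C=0.25, P=0.8)
# A wetter climate scenario raises erosivity R; denser cover lowers C.
scenario = rusle(R=365.0, K=0.38, LS=2.1, C=0.18, P=0.8)

print(f"baseline soil loss: {baseline:.1f} t/ha/yr")
print(f"scenario soil loss: {scenario:.1f} t/ha/yr")
```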

Keywords: Kasilian watershed, climatic change, soil erosion, LARS-WG5 Model, RUSLE

Procedia PDF Downloads 501
24703 User Acceptance Criteria for Digital Libraries

Authors: Yu-Ming Wang, Jia-Hong Jian

Abstract:

The Internet and digital publication technologies have brought dramatic changes to how people collect, organize, disseminate, access, store, and use information. More and more governments, schools, and organizations have spent huge sums to develop digital libraries. A digital library can be regarded as a web extension of a traditional physical library. People can search diverse publications, find the location of knowledge resources, and borrow or buy publications through digital libraries. People can gain knowledge, and students or employees can finish their reports, by using digital libraries. Since considerable funds and energy have been invested in implementing digital libraries, it is important to understand the evaluative criteria from the users' viewpoint in order to enhance user acceptance. This study develops a list of user acceptance criteria for digital libraries. An initial criteria list was developed based on previously validated instruments related to digital libraries. Data were collected on user experiences of digital libraries. Exploratory factor analysis and confirmatory factor analysis were adopted to purify the criteria list, and the reliabilities and validities were tested. After validating the criteria list, a user survey was conducted to collect the comparative importance of the criteria. The analytic hierarchy process (AHP) method was utilized to derive the importance of each criterion. The results of this study contribute to an understanding of the criteria, and their relative importance, that users apply in evaluating digital libraries.
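
A minimal sketch of the AHP step described above: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The three criteria and the judgments are illustrative assumptions:

```python
import numpy as np

criteria = ["interface usability", "collection quality", "system reliability"]

# Saaty-scale pairwise judgments: A[i, j] = importance of i relative to j.
A = np.array([[1.0, 1/3, 1/2],
              [3.0, 1.0, 2.0],
              [2.0, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio CR = CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty).
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio = {CR:.3f} (acceptable if < 0.1)")
```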

Keywords: digital library, user acceptance, analytic hierarchy process, factor analysis

Procedia PDF Downloads 250
24702 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). By considering two sets of data (raw and simulated) and two models (stationary and non-stationary) of the GEV distribution, return level analysis is carried out, and it was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and the simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature beyond which there is no exceedance. The results of this paper are of considerable value for agricultural and environmental research.
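
A minimal sketch of block-maxima return level estimation with a stationary GEV fit; the T-year return level is the (1 - 1/T) quantile of the fitted distribution. The synthetic annual maxima below stand in for the CDC station data used in the paper:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
annual_maxima = 33 + rng.gumbel(0, 1.2, size=40)   # 40 years of block maxima

# Fit the (stationary) GEV; scipy's shape parameter c corresponds to -xi.
c, loc, scale = genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
for T in (10, 50, 100):
    z = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {z:.2f} °C")
```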

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 475
24701 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years, because stack and non-stack data have different properties that offer important opportunities for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One of the important properties of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved using 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with an added 1KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
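
A minimal sketch of the kind of measurement reported above: a direct-mapped (1-way) cache simulator run over a synthetic stack-like address trace. The trace and sizes are illustrative assumptions; the paper's results come from SimpleScalar benchmarks such as Rijndael:

```python
import random

def simulate_direct_mapped(trace, cache_bytes=2048, line_bytes=32):
    n_lines = cache_bytes // line_bytes
    tags = [None] * n_lines           # one tag per set (1-way associative)
    hits = 0
    for addr in trace:
        block = addr // line_bytes
        index = block % n_lines
        if tags[index] == block:
            hits += 1
        else:
            tags[index] = block       # evict and fill on a miss
    return hits / len(trace)

# Stack accesses cluster tightly around a moving stack pointer, which is why
# even a tiny 1-way cache captures them: emulate that locality here.
random.seed(3)
sp = 0x7FFF_F000
trace = []
for _ in range(50_000):
    sp += random.choice((-8, -4, 0, 4, 8))        # push/pop style movement
    trace.append(sp + random.randrange(0, 64))    # nearby frame accesses

print(f"stack-cache hit rate: {simulate_direct_mapped(trace):.3%}")
```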

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 299
24700 Enhancing Cloud Computing with Security Trust Model

Authors: John Ayoade

Abstract:

Cloud computing is a model that enables the delivery of on-demand computing resources such as networks, servers, storage, applications and services over the internet. It is a relatively young and growing concept that presents a good number of benefits for its users; however, it also raises some security challenges which may slow down its adoption. In this paper, we identify some of the security issues that can serve as barriers to realizing the full benefits that cloud computing can bring. One of the key security problems is trust. A security trust model is proposed that can enhance the confidence that users need to fully trust the use of public and mobile cloud computing and maximize the potential benefits that they offer.

Keywords: cloud computing, trust, security, certificate authority, PKI

Procedia PDF Downloads 479
24699 Biomedical Definition Extraction Using Machine Learning with Synonymous Feature

Authors: Jian Qu, Akira Shimazu

Abstract:

OOV (Out Of Vocabulary) terms are terms that cannot be found in most dictionaries. Although it is possible to translate such OOV terms, the translations often do not provide any real information for a user. We present an OOV term definition extraction method that uses information available on the Internet, with features such as occurrences of synonyms and location distances. We apply a machine learning method to find the correct definitions for OOV terms. We tested our method on both biomedical-type and name-type OOV terms; our work outperforms existing work with an accuracy of 86.5%.

Keywords: information retrieval, definition retrieval, OOV (out of vocabulary), biomedical information retrieval

Procedia PDF Downloads 489
24698 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, there is growing concern that data be accurate and available. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive information, as in medical or military applications. Whenever data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 502
24697 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. Submitting a query for each of a large number of keywords takes a long time and much effort; hence, we developed a user interface program that automatically searches for multiple keywords at the same time and collects the wanted data. The collected raw data is processed using mathematical and statistical theories to eliminate unwanted data and convert it into usable data.

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 423
24696 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it, whereas steganography hides the data in a cover file so that the presence of communication is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that these combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, intelligence agencies (e.g., RAW), and other settings where highly confidential data is transferred. Finally, comparisons of the two techniques are also given in tabular form.
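
A minimal sketch of the hiding step, LSB image steganography, that the paper combines with DES/RSA encryption. The paper's implementation is in MATLAB; this Python version only illustrates the embedding idea on a synthetic greyscale image:

```python
import numpy as np

def embed(cover: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide payload bits in the least significant bits of the cover pixels."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

secret = b"ciphertext here"        # in the paper, DES- or RSA-encrypted data
stego = embed(cover, secret)
print(extract(stego, len(secret)))                             # b'ciphertext here'
print(np.max(np.abs(stego.astype(int) - cover.astype(int))))   # at most 1
```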

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 284
24695 Understanding the Top Questions Asked about Hong Kong by Travellers Worldwide through a Corpus-Based Discourse Analytic Approach

Authors: Phoenix W. Y. Lam

Abstract:

As one of the most important service-oriented industries in contemporary society, tourism has increasingly seen the influence of the Internet on all aspects of travelling. Travellers nowadays habitually research online before making travel-related decisions. One platform on which such research is conducted is the destination forum. The emergence of online destination forums in the last decade has allowed tourists to share their travel experiences quickly and easily with a large number of online users around the world. As such, these destination forums also provide invaluable data for tourism bodies seeking to better understand travellers’ views on their destinations. Collecting posts from the Hong Kong travel forum on the world’s largest travel website, TripAdvisor®, the present study identifies the top questions asked by TripAdvisor users about Hong Kong through a corpus-based discourse analytic approach. Based on questions posted on the forum and their associated metadata gathered over a one-year period, the study examines the top questions asked by travellers around the world to identify the key geographical locations whose users have shown the greatest interest in the city. Questions raised by travellers from different geographical locations are also compared to see if traveller communities vary by location in their areas of interest. This analysis involves the study of keywords and concordances of frequently occurring items and a close reading of representative examples in context. Findings from the present study show that the travellers who asked the most questions about Hong Kong are from North America and Asia, and that travellers from different locations have different concerns and interests, which are clearly reflected in the language of the questions asked on the travel forum. These findings can therefore provide tourism organisations with useful information about the key markets that should be targeted for promotional purposes, and can also allow such organisations to design advertising campaigns which better address the specific needs of those markets. The present study thus demonstrates the value of applying linguistic knowledge and methodologies to the domain of tourism to address practical issues.
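
A minimal sketch of the two corpus techniques named above, keyword frequency and a KWIC (key word in context) concordance; the forum posts are invented stand-ins for the TripAdvisor data used in the study:

```python
import re
from collections import Counter

posts = [
    "Where can I find the best dim sum near Tsim Sha Tsui?",
    "Is the Octopus card worth buying for a three day trip?",
    "Best way to get from the airport to Kowloon late at night?",
    "Dim sum recommendations for a family with kids?",
]

# Crude keyword list: frequency count over lowercased word tokens.
tokens = [w for p in posts for w in re.findall(r"[a-z']+", p.lower())]
print(Counter(tokens).most_common(5))

def concordance(posts, keyword, width=3):
    """Print each keyword hit with `width` words of context on either side."""
    for p in posts:
        words = re.findall(r"[A-Za-z']+", p)
        for i, w in enumerate(words):
            if w.lower() == keyword:
                left = " ".join(words[max(0, i - width):i])
                right = " ".join(words[i + 1:i + 1 + width])
                print(f"{left:>30} | {w} | {right}")

concordance(posts, "dim")
```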

Keywords: corpus, Hong Kong, online travel forum, tourism, TripAdvisor

Procedia PDF Downloads 176
24694 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
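
A minimal sketch of the simulation experiment described above: a linear model trained on labels corrupted at a 40% error rate and evaluated against clean test labels (the data generation settings are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=4000, n_features=20, n_informative=10,
                           random_state=0)
X_train, X_test = X[:3000], X[3000:]
y_train, y_test = y[:3000], y[3000:]

# Corrupt 40% of the training labels (weak supervision).
rng = np.random.default_rng(0)
noisy = y_train.copy()
flip = rng.choice(len(noisy), size=int(0.4 * len(noisy)), replace=False)
noisy[flip] = 1 - noisy[flip]

model = LinearSVC(dual=False).fit(X_train, noisy)
pred = model.predict(X_test)

# The model can exceed the ~60% accuracy of its own training labels.
print(f"training-label accuracy: {accuracy_score(y_train, noisy):.2f}")
print(f"model accuracy on clean test labels: {accuracy_score(y_test, pred):.2f}")
```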

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 195