Search results for: data mining techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29882

25682 Multilevel Gray Scale Image Encryption through 2D Cellular Automata

Authors: Rupali Bhardwaj

Abstract:

Cryptography is the science of using mathematics to encrypt and decrypt data: the data are converted into some other, gibberish form, and the encrypted data are then transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process: rather than transmitting the message bits directly, the message is first encrypted using 2D cellular automata and then scrambled with the Arnold Cat Map transformation; this provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using scrambling degree measurement parameters, i.e., the Gray Difference Degree (GDD) and the Correlation Coefficient.
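
As a rough illustration of the scrambling stage, the sketch below applies the Arnold Cat Map to a square N x N grayscale image, mapping pixel (x, y) to ((x + y) mod N, (x + 2y) mod N); the map is periodic, so enough iterations restore the original image. This is a generic sketch of the transformation, not the paper's implementation.

```python
import numpy as np

def arnold_cat_map(img: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Scramble a square grayscale image with the Arnold Cat Map."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "image must be square"
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                # (x, y) -> ((x + y) mod n, (x + 2y) mod n)
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

scrambled = arnold_cat_map(np.arange(64 * 64).reshape(64, 64), iterations=5)
```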

Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient

Procedia PDF Downloads 378
25681 Total Longitudinal Displacement (tLoD) of the Common Carotid Artery (CCA) Does Not Differ between Patients with Moderate or High Cardiovascular (CV) Risk and Patients after Acute Myocardial Infarction (AMI)

Authors: P. Serpytis, K. Azukaitis, U. Gargalskaite, R. Navickas, J. Badariene, V. Dzenkeviciute

Abstract:

Purpose: Total longitudinal displacement (tLoD) of the common carotid artery (CCA) wall is a novel ultrasound marker of vascular function that can be evaluated using modified speckle tracking techniques. Decreased CCA tLoD has already been shown to be associated with diabetes and to predict one-year cardiovascular outcome in patients with suspected coronary artery disease (CAD). The aim of our study was to evaluate whether CCA tLoD differs between patients with moderate or high cardiovascular (CV) risk and patients after recent acute myocardial infarction (AMI). Methods: 49 patients (54±6 years) with moderate or high CV risk and 42 patients (58±7 years) after recent AMI were included. All patients were non-diabetic. CCA tLoD was evaluated using GE EchoPAC speckle tracking software and expressed as the mean of both sides. Data on systolic blood pressure, total and high-density lipoprotein (HDL) cholesterol levels, high-sensitivity C-reactive protein (hsCRP) level, smoking status and family history of early CV events were collected and assessed for association with CCA tLoD. Results: CCA tLoD did not differ between patients with moderate or high CV risk and patients with very high CV risk after AMI (0.265±0.128 mm vs. 0.237±0.103 mm, p>0.05). Lower tLoD was associated with lower HDL cholesterol levels (r=0.211, p=0.04) and male sex (0.228±0.1 vs. 0.297±0.134, p=0.01). Conclusions: CCA tLoD did not differ between patients with moderate or high CV risk and patients with very high CV risk after AMI. However, lower CCA tLoD was significantly associated with low HDL cholesterol levels and male sex.

Keywords: total longitudinal displacement, carotid artery, cardiovascular risk, acute myocardial infarction

Procedia PDF Downloads 384
25680 Survey-Based Data Security Evaluation of Pakistani Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today’s heterogeneous network environment, there is a growing need for mutually distrusting clients to jointly operate a secure network and prevent malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter what solutions are implemented or which security methodologies or standards are adopted. Security is a first and crucial concern in the field of computer science; the main aim of computer security is to safeguard information across a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, antimalware programs, security patches, log files, honeypots, and other measures used in banks for financial data protection. However, there is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper, the writers propose implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Keywords: network worms, malware infection, propagating malicious code, virus, security, VPN

Procedia PDF Downloads 358
25679 Interactive IoT-Blockchain System for Big Data Processing

Authors: Abdallah Al-Zoubi, Mamoun Dmour

Abstract:

The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics and education, to name a few. Active IoT endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings considerable concerns regarding data storage, analysis, manipulation and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has actually accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions for logistic reasons such as supply chain interruptions, shortage of shipping containers and port congestion. An IoT-blockchain system is proposed to handle big data generated by a distributed network of sensors and controllers in an interactive manner. The system is designed on the Ethereum platform, which uses smart contracts, programmed in Solidity, to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4, running Raspbian with add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain, in which a number of distributed IoT devices can communicate and interact in a closed, controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical space in order to examine its feasibility, performance and costs. Initial results indicated that big IoT data retrieval and storage is feasible and interactivity is possible, provided that certain conditions of cost, speed and throughput are met.
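
As a hedged sketch of the data-recording step, the Python snippet below hashes a sensor reading and submits the digest to an Ethereum contract via web3.py; the contract address, ABI file, and storeReading function are hypothetical placeholders, and the raw payload would go to IPFS/Swarm as the abstract describes.

```python
import hashlib
import json
from web3 import Web3

# Connect to a local Ethereum node (e.g., the validating machine in the prototype)
w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))

# Hypothetical contract handle: the address, ABI file and function name are assumptions
contract = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",   # placeholder address
    abi=json.load(open("iot_registry_abi.json")),            # assumed ABI file
)

def record_reading(device_id: str, payload: dict) -> str:
    """Hash an IoT reading and store the digest on-chain."""
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    tx = contract.functions.storeReading(device_id, digest).transact(
        {"from": w3.eth.accounts[0]})
    w3.eth.wait_for_transaction_receipt(tx)
    return digest
```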

Keywords: IoT devices, blockchain, Ethereum, big data

Procedia PDF Downloads 150
25678 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System

Authors: Abdul-Rahman Al-Ali

Abstract:

As the number of mobile devices grows exponentially, it is estimated that about 50 billion devices will be connected to the Internet by the year 2020; by the end of this decade, an average of eight connected devices per person worldwide is expected. These 50 billion devices are not only mobile phones and data-browsing gadgets, but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (IoT) concept has recently become one of the emerging technologies. Within smart grid technologies, smart home appliances, Intelligent Electronic Devices (IED) and Distributed Energy Resources (DER) are major IoT objects that are addressable using IPv6. These objects are called the smart grid Internet of Things (SG-IoT). The SG-IoT generates big data that requires high-speed computing infrastructure, widespread computer networks, big data storage, and software and platform services. A utility company's control and data centers cannot handle such a large number of devices, high-speed processing and massive data storage. Building large data center infrastructure takes a long time and requires widespread communication networks and huge capital investment, while maintaining and upgrading control and data center infrastructure and communication networks, as well as updating and renewing software licenses, collectively requires additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler to replace legacy utility data centers. The talk will highlight the role of IoT, cloud computing services and their deployment models within smart grid technologies.

Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances

Procedia PDF Downloads 324
25677 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test to determine whether an individual is positive or negative for TB infection. This study applies statistical analysis to clinical data on the interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels over a period of six months, and hence to infer whether the anti-tuberculous treatment is effective.
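
A minimal sketch of this kind of analysis, assuming paired interferon-gamma measurements for the same subjects at baseline and month six; the values below are synthetic stand-ins, and the study's actual statistical methods may be more involved.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical IFN-gamma levels (IU/mL) for 73 subjects, before and after treatment
baseline = rng.normal(5.0, 1.5, size=73).clip(min=0.1)
month_6 = (baseline * rng.normal(0.7, 0.2, size=73)).clip(min=0.05)

t_stat, p_value = stats.ttest_rel(baseline, month_6)   # paired t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
# A significant positive t (small p) indicates a decline over six months,
# consistent with an effective treatment under this design.
```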

Keywords: data analysis, interferon gamma release assay, statistical methods, tuberculosis infection

Procedia PDF Downloads 306
25676 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is a pressing demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transformed from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are then compared in turn using a t-test, yielding a p-value of 7.5686E-176, far below the significance threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
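
A toy sketch of the spectral comparison, with synthetic frames standing in for captured VoIP payloads and a crude perturbation standing in for embedding noise; the real algorithm's traffic sniffing and feature extraction are not reproduced here.

```python
import numpy as np
from scipy import stats

def mean_power_spectrum(frames: np.ndarray) -> np.ndarray:
    """Average power spectrum over a set of fixed-length frames (FFT bin-wise)."""
    return (np.abs(np.fft.rfft(frames, axis=1)) ** 2).mean(axis=0)

rng = np.random.default_rng(1)
cover = rng.normal(0.0, 1.0, size=(1000, 160))               # stand-in cover frames
stego = cover + rng.choice([-0.01, 0.01], size=cover.shape)  # stand-in embedding noise

t_stat, p = stats.ttest_ind(mean_power_spectrum(cover), mean_power_spectrum(stego))
print(f"t = {t_stat:.2f}, p = {p:.3g}")   # a small p flags a spectral difference
```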

Keywords: steganalysis, security, Fast Fourier Transform, streaming media

Procedia PDF Downloads 147
25675 Exclusive Value Adding by iCenter Analytics on Transient Condition

Authors: Zhu Weimin, Allegorico Carmine, Ruggiero Gionata

Abstract:

Over decades of Baker Hughes (BH) iCenter experience, it has been demonstrated that, in addition to conventional insights on equipment under steady operating conditions, insights on transient conditions can add significant and exclusive value for anomaly detection, downtime saving and predictive maintenance. Our work draws on examples from the BH iCenter experience to introduce the advantages and features of transient condition analytics: (i) operation under critical engine conditions, e.g., high levels or high rates of change of temperature, pressure, flow, vibration, etc., that would not be reached in normal operation; (ii) management of dedicated sub-systems or components, many of which are bottlenecks for reliability and maintenance; (iii) indirect detection of anomalies in the absence of instrumentation; (iv) repetitive sequences: if data are properly processed, the engineered features of transients provide not only anomaly detection but also problem characterization and prognostic indicators for predictive maintenance; (v) engine variables accounting for fatigue analysis. iCenter has been developing and deploying a series of analytics based on transient conditions, which contribute exclusive value in the following areas: (i) reliability improvement; (ii) startup reliability improvement; (iii) predictive maintenance; (iv) repair/overhaul cost reduction. Illustrative examples for each of the above areas are presented in our study, focusing on challenges and adopted techniques ranging from purely statistical approaches to the implementation of machine learning algorithms. The results demonstrate how value is obtained from transient condition analytics in the BH iCenter experience.

Keywords: analytics, diagnostics, monitoring, turbomachinery

Procedia PDF Downloads 74
25674 A Mathematical Framework for Expanding a Railway’s Theoretical Capacity

Authors: Robert L. Burdett, Bayan Bevrani

Abstract:

Analytical techniques for measuring and planning railway capacity expansion activities are considered in this article. A preliminary mathematical framework involving track duplication and section subdivision is proposed for this task. In railways, these features have a great effect on network performance, and for this reason they have been considered here. Additional motivation arises from the limitations of prior models, which have not included them.

Keywords: capacity analysis, capacity expansion, railways, track subdivision, track duplication

Procedia PDF Downloads 359
25673 Effects of Silver Nanoparticles on in vitro Adventitious Shoot Regeneration of Water Hyssop (Bacopa monnieri L. Wettst.)

Authors: Muhammad Aasim, Mehmet Karataş, Fatih Erci, Şeyma Bakırcı, Ecenur Korkmaz, Burak Kahveci

Abstract:

Water hyssop (Bacopa monnieri L. Wettst.) is an important medicinal aquatic/semi-aquatic plant native to India, where it is used in the traditional medicinal system. The plant contains bioactive compounds, mainly bacosides, which are the main ingredient of a commercial drug available as a memory-enhancer tonic. The local name of water hyssop is Brahmi, and Brahmi-based drugs are available for treating chronic diseases and disorders including Alzheimer's disease, anxiety, asthma, cancer, mental illness, respiratory ailments and stomach ulcers. The plant is not cultivated, and collection from the wild has rendered it threatened to endangered. On the other hand, low seed viability and availability make it difficult to propagate the plant through traditional techniques. In recent years, plant tissue culture techniques have been employed to propagate the plant for its conservation and for the continuous availability of its secondary metabolites; the application of nanoparticles has also been reported to increase biomass, in vitro regeneration and secondary metabolite production. In this study, silver nanoparticles (AgNPs) were applied at rates of 2, 4, 6, 8 and 10 ppm to Murashige and Skoog (MS) medium supplemented with 1.0 mg/l benzylaminopurine (BAP), 3.0% sucrose and 0.7% agar. Leaf explants of water hyssop were cultured on the AgNP-containing medium. Shoot induction from leaf explants was relatively slow compared to medium without AgNPs: multiple shoot induction was recorded after 3-4 weeks of culture, compared to within 10 days for the control. Regenerated shoots were rooted successfully on MS medium supplemented with 1.0 mg/l IBA and acclimatized in aquariums for further studies.

Keywords: water hyssop, silver nanoparticles, in vitro, regeneration, secondary metabolites

Procedia PDF Downloads 196
25672 Privacy-Preserving Model for Social Network Sites to Prevent Unwanted Information Diffusion

Authors: Sanaz Kavianpour, Zuraini Ismail, Bharanidharan Shanmugam

Abstract:

Social Network Sites (SNSs) can serve as an invaluable platform for transferring information across a large number of individuals. A substantial component of communicating and managing information is identifying which individuals will influence others in propagating information, and whether dissemination of information will occur in the absence of social signals about that information. Classifying the final audience of social data is difficult, as it is not completely possible to control the social contexts in which it travels among individuals. Hence, undesirable diffusion of information to an unauthorized individual on SNSs can threaten individuals' privacy. This paper highlights information diffusion in SNSs and emphasizes the most significant privacy issues for individuals on SNSs. The goal of this paper is to propose a privacy-preserving model that treats individuals' data with due care, in order to control the availability of data and improve privacy by providing access to the data for appropriate third parties without compromising the advantages of information sharing through SNSs.

Keywords: anonymization algorithm, classification algorithm, information diffusion, privacy, social network sites

Procedia PDF Downloads 321
25671 Dimension of Water Accessibility in the Southern Part of Niger State, Nigeria

Authors: Kudu Dangana, Pai H. Halilu, Osesienemo R. Asiribo-Sallau, Garba Inuwa Kuta

Abstract:

The study examined the determinants of household water accessibility in the southern part of Niger State, Nigeria. Data for the study were obtained from primary and secondary sources using questionnaires, interviews, personal observation and documents. 1,192 questionnaires were administered; the sampling techniques adopted were a combination of purposive, stratified and simple random sampling. Purposive sampling was used to determine the sample frame, the sample units were determined using stratified sampling, and simple random sampling was used in administering the questionnaires. The results were analyzed within the scope of the WHO water accessibility indicators using descriptive statistics. The major sources of water in the area are wells, hand- and electric-pump boreholes, and streams; these sources account for over 90% of household water. Average per capita water consumption in the area is 22 liters per day, while the location efficiency of facilities revealed an average of 80 people per borehole. Household water accessibility is affected mainly by distance, the time spent obtaining water, the low income status of the majority of respondents, which limits access to modern water infrastructure, and to a lesser extent household size. Recommendations include that all tiers of government intensify efforts to provide water infrastructure and maintain existing facilities through budgetary provisions, and that communities organize fundraising bazaars to improve water infrastructure in the area.

Keywords: accessibility, determined, stratified, scope

Procedia PDF Downloads 392
25670 Self-Determination among Individuals with Intellectual Disability: An Experiment

Authors: Wasim Ahmad, Bir Singh Chavan, Nazli Ahmad

Abstract:

Objectives: The present investigation is an attempt to determine the efficacy of training special educators in promoting self-determination among individuals with intellectual disability. Methods: The study equipped special educators with the necessary skills and knowledge to train individuals with intellectual disability in practicing self-determination. Subjects: Special educators (N=25) were selected for training on self-determination among individuals with intellectual disability. After receiving the training, they delivered the intervention to individuals with intellectual disability (N=50). Tool: The Self-Determination Scale for Adults with Mild Mental Retardation (SDSAMR) developed by Keshwal and Thressiakutty (2010) was used. It is a reliable and valid tool used by many researchers, with 36 items distributed across five domains: personal management, community participation, recreation and leisure time, choice making, and problem solving. Analysis: The collected data were analyzed using statistical techniques such as the t-test, ANCOVA, and the post-hoc Tukey test. Results: The findings reveal a significant difference at the 1% level between the pre- and post-test mean scores (t=15.56) on self-determination concepts among the special educators, indicating that the training enhanced their grasp of self-determination among individuals with intellectual disability. The study also reveals that the training the special educators received on transition planning was effective in practice, because they were able to impart the concept by training individuals with intellectual disability to be self-determined: the results show a significant difference at the 1% level between the pre- and post-test mean scores (t=16.61) on self-determination among individuals with intellectual disability. Conclusion: The training had a remarkable impact on the performance of individuals with intellectual disability on self-determination.

Keywords: experiment, individuals with intellectual disability, self-determination, special educators

Procedia PDF Downloads 335
25669 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations

Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar

Abstract:

Fast simulations are critical for reducing time to market in CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip, and researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses was identified. To further reduce cache misses, we apply software optimization techniques such as removal of unused functions, loop interchange, and replacing the post-increment operator with the pre-increment operator for non-primitive data types; these reduced cache misses by 18.52%, 5.34% and 3.91%, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0: the OpenMP programming model and SIMD are used to parallelize and vectorize the more time-consuming portions of the code. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
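
Booksim2.0 itself is C++, but the cache effect behind loop interchange can be demonstrated in Python with NumPy: sweeping a row-major array along its rows touches contiguous memory, while sweeping it by columns strides across it. This is an illustrative analogue of the optimization, not the paper's code.

```python
import time
import numpy as np

a = np.random.rand(5000, 5000)   # C-order (row-major) array

t0 = time.perf_counter()
total = sum(a[i, :].sum() for i in range(a.shape[0]))   # contiguous, cache-friendly
t_row = time.perf_counter() - t0

t0 = time.perf_counter()
total = sum(a[:, j].sum() for j in range(a.shape[1]))   # strided, cache-hostile
t_col = time.perf_counter() - t0

print(f"row sweep {t_row:.3f}s vs column sweep {t_col:.3f}s")
# Interchanging loops so the inner loop follows the memory layout is the same
# idea applied to Booksim2.0's C++ hotspots.
```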

Keywords: cache behaviour, network-on-chip, performance profiling, vectorization

Procedia PDF Downloads 199
25668 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression and Cox regression (proportional hazards) models are at present employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used to model time-to-event data in which censored cases exist, whereas the logistic regression model is mostly applicable where the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). The arguments and findings of many researchers have focused on overviews of the Cox and logistic regression models and their applications in different areas. In this work, the analysis is done on secondary data, an SPSS exercise dataset on breast cancer with a sample size of 1,121 women, where the main objective is to show the application difference between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some analysis was done manually, e.g., on lymph node status, and SPSS software was used to analyze the rest of the data. This study found an application difference between the Cox and logistic regression models: the Cox regression model is used if one wishes to analyze data that also include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio and the odds ratio for the Cox and logistic regression models, respectively. A similarity between the two models is that both are applicable in predicting the outcome of a categorical variable, i.e., a variable that can accommodate only a restricted number of categories. In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models are suitable methods for analyzing data and can be applied in many other studies, but the Cox regression model is the more recommended when follow-up time is available.
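
A minimal sketch of the contrast in Python, using lifelines for the Cox model and statsmodels for the logistic model; the toy data frame and its column names are hypothetical, not the SPSS breast cancer dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

# Hypothetical data: follow-up time (months), death event flag, positive lymph nodes
df = pd.DataFrame({
    "time":  [12, 48, 60, 7, 33, 90, 21, 55],
    "event": [1, 0, 0, 1, 1, 0, 1, 0],
    "nodes": [5, 0, 1, 8, 3, 0, 6, 2],
})

# Cox model: uses follow-up time and censoring; association is the hazard ratio exp(coef)
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()

# Logistic model: ignores follow-up time; association is the odds ratio exp(coef)
logit = sm.Logit(df["event"], sm.add_constant(df[["nodes"]])).fit(disp=0)
print(np.exp(logit.params))   # odds ratios
```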

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 455
25667 Assessment of Environmental Implications of Rapid Population Growth on Land Use Dynamics: A Case Study of Eleme Local Government Area, Rivers State, Nigeria

Authors: Moses Obenade, Henry U. Okeke, Francis I. Okpiliya, Eugene J. Aniah

Abstract:

Population growth in Eleme has been rapid over the past 75 years, with attendant pressure on the natural resources of the area. Between 1937 and 2006 the population of Eleme grew from 2,528 to 190,194, and it is projected to exceed 265,707 in 2016 based on an annual growth rate of 3.4%. Using the combined technologies of Geographic Information Systems (GIS) and remote sensing (RS) together with demographic techniques as its methodology, this paper examines the environmental implications of rapid population growth for land use dynamics in Eleme between 1986 and 2015. The study reveals that between 1986 and 2006, built-up area and farmland increased by 72.67% and 12.77% respectively, while light and thick vegetation decreased by 6.92% and 61.64% respectively; water bodies remained fairly constant with minimal changes. Between 2006 and 2015, a period of 9 years, built-up area further increased by 53%, gaining land at an annual rate of 2.32 km² to the detriment of other land uses, and it is expected to grow from 18.67 km² in 2006 to 41.87 km² in 2016. The observed land use/land cover dynamics are driven by the demographic characteristics of the study area. Eleme has a total area of 138 km², of which the Federal Government of Nigeria compulsorily acquired an estimated 59.34 km² for industrial purposes, excluding acquisitions by the Rivers State Government. It is evident from the findings of this study that the carrying capacity of the Eleme ecosystem is under threat from the current population growth and land consumption rates. Therefore, measures such as the use of appropriate technologies in farming techniques and waste management; investment in family planning, female empowerment, maternal health and education; afforestation programs; and amendment of the Land Use Act of 1978 are recommended.
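
The 2016 projection follows from simple compound growth, P_t = P_0 (1 + r)^t; a quick check in Python with the abstract's figures:

```python
def project(p0: float, rate: float, years: int) -> float:
    """Compound annual growth: P_t = P_0 * (1 + rate) ** years."""
    return p0 * (1 + rate) ** years

# 2006 population grown at 3.4% per year for 10 years
print(round(project(190_194, 0.034, 10)))   # ~265,700, matching the projected 265,707
```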

Keywords: population growth, Eleme, land use, GIS and remote sensing

Procedia PDF Downloads 381
25666 Reshaping of Indian Education System with the Help of Multi-Media: Promises and Pitfalls

Authors: Geetu Gahlawat

Abstract:

The education system takes in information on a daily basis and in a variety of forms, i.e., through multimedia channels. This can create a challenge for pedagogues in keeping hold of learners. Multimedia enhances the education system with its technology: educators can deliver their content effectively and without limits through multimedia elements, while learners learn more easily and reach their goals faster. This paper gives an overview of how multimedia is reshaping the Indian education system, with its promises and pitfalls.

Keywords: multimedia, technology, techniques, development, pedagogy

Procedia PDF Downloads 282
25665 Navigating Government Finance Statistics: Effortless Retrieval and Comparative Analysis through Data Science and Machine Learning

Authors: Kwaku Damoah

Abstract:

This paper presents a methodology and software application (app) designed to empower users in accessing, retrieving, and comparatively exploring data within the hierarchical network framework of the Government Finance Statistics (GFS) system. It explores the ease of navigating the GFS system and identifies the gaps filled by the new methodology and app. The GFS embodies a complex Hierarchical Network Classification (HNC) structure encapsulating institutional units, revenues, expenses, assets, liabilities, and economic activities. Navigating this structure demands specialized knowledge, experience, and skill, posing a significant challenge for effective analytics and fiscal policy decision-making. Many professionals encounter difficulties deciphering these classifications, hindering confident utilization of the system. This accessibility barrier prevents a vast number of professionals, students, policymakers, and members of the public from leveraging the abundant data and information within the GFS. Leveraging the R programming language, data science analytics and machine learning, an efficient methodology was developed that enables users to access, navigate, and conduct exploratory comparisons. The machine learning Fiscal Analytics App (FLOWZZ) democratizes access to advanced analytics through its user-friendly interface, breaking down expertise barriers.

Keywords: data science, data wrangling, drilldown analytics, government finance statistics, hierarchical network classification, machine learning, web application

Procedia PDF Downloads 70
25664 The Cleavage of DNA by the Anti-Tumor Drug Bleomycin at the Transcription Start Sites of Human Genes Using Genome-Wide Techniques

Authors: Vincent Murray

Abstract:

The glycopeptide bleomycin is used in the treatment of testicular cancer, Hodgkin's lymphoma, and squamous cell carcinoma. Bleomycin damages and cleaves DNA in human cells, and this is considered to be the main mode of action for its anti-tumor activity; in particular, double-strand breaks are thought to be the main mechanism of bleomycin's cellular toxicity. Using Illumina next-generation DNA sequencing techniques, the genome-wide sequence specificity of bleomycin-induced double-strand breaks was determined in human cells. The degree of bleomycin cleavage was also assessed at the transcription start sites (TSSs) of actively transcribed genes and compared with non-transcribed genes. It was observed that bleomycin preferentially cleaved at the TSSs of actively transcribed human genes, with a correlation between the degree of this enhanced cleavage at TSSs and the level of transcriptional activity. Bleomycin cleavage is also affected by chromatin structure, and at TSSs the peaks of bleomycin cleavage were approximately 200 bp apart, indicating that bleomycin was able to detect phased nucleosomes at the TSSs of actively transcribed human genes. The genome-wide cleavage patterns of the bleomycin analogues 6′-deoxy-BLM Z and zorbamycin were also investigated in human cells; as found for bleomycin, these analogues preferentially cleaved at the TSSs of actively transcribed human genes. The cytotoxicity (IC₅₀ values) of these bleomycin analogues was determined, and the degree of enhanced cleavage at TSSs was found to be inversely correlated with the IC₅₀ values. This suggests that the level of cleavage at the TSSs of actively transcribed human genes is important for the cytotoxicity of bleomycin and its analogues. Hence, this study provides a deeper understanding of the cellular processes involved in the cancer chemotherapeutic activity of bleomycin.

Keywords: anti-tumour activity, bleomycin analogues, chromatin structure, genome-wide study, Illumina DNA sequencing

Procedia PDF Downloads 120
25663 Assessment of the Impact of Trawling Activities on Marine Bottoms of Moroccan Atlantic

Authors: Rachida Houssa, Hassan Rhinane, Fadoumo Ali Malouw, Amina Oulmaalem

Abstract:

Since the early 1970s, the Moroccan Atlantic has been subjected to the pressure of bottom trawling, one of the most destructive seabed fishing techniques: it is non-selective, wreaks havoc on the catch, and is responsible for more than half of all fish discards around the world. The present paper aims to map and assess the impact of bottom trawling activity along the Moroccan Atlantic coast. For this purpose, a multi-decade dataset (1962 to 1999) from foreign fishing vessels using bottom trawls was used and integrated into a GIS. To estimate the extent and geographical distribution of the trawling effort, the Moroccan Atlantic area was divided into a grid of cells of 25 km² (5 x 5 km). This grid was joined to the trawling effort data, creating a new entity whose attribute table holds the spatial overlay of the grid with the polygons of swept surfaces. This mapping model made it possible to quantify the fishing effort over time and to generate a trace of trawling effort on the seabed. Indeed, for a given year, a grid cell may have a swept area of 0 km² (never touched by the trawl), of 25 km² (the trawled area equals the cell size), or of, say, 100 km², indicating that the surface swept that year was four times the cell area. The results show a total cumulative swept area of approximately 28,738,326 km², scattered along the Atlantic coast. 95% of the overall trawling effort is located in the southern zone, between 29°N and 20°30'N, and nearly 5% in the northern coastal region, north of 33°N. The central area between 33°N and 29°N is the least swept by Russian commercial vessels because in this region the majority of the seabed is rocky and non-trawlable.
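
A small sketch of the overlay step with shapely: swept-surface polygons are intersected with grid cells and the areas accumulated, so a cell's multi-year total can exceed its 25 km² size. The geometries below are toy values in an assumed projected (metre) coordinate system.

```python
from shapely.geometry import box

CELL = 5_000   # 5 km cell edge, in metres

def swept_area_per_cell(cells, swept_polygons):
    """Total trawl-swept area (m^2) accumulated in each grid cell."""
    return {i: sum(cell.intersection(poly).area for poly in swept_polygons)
            for i, cell in enumerate(cells)}

# Toy 2x2 grid and one swept swath crossing it
grid = [box(x, y, x + CELL, y + CELL) for x in (0, CELL) for y in (0, CELL)]
swath = box(2_000, -1_000, 7_000, 11_000)
print(swept_area_per_cell(grid, [swath]))
```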

Keywords: GIS, Moroccan Atlantic Ocean, seabed, trawling

Procedia PDF Downloads 329
25662 Value Chain Based New Business Opportunity

Authors: Seonjae Lee, Sungjoo Lee

Abstract:

Discovering new business opportunities is necessary to remain competitive in the current business environment; companies survive rapidly changing industry conditions by adopting new business strategies and reducing technology challenges. Traditionally, two methods are used to excavate new businesses. In the first, opportunities are gathered through qualitative analysis of expert opinion; in the second, new technologies are discovered through quantitative analysis of patent data. The second method increases time and cost, and patent data are restricted in their use for the purpose of discovering business opportunities. This study presents new business opportunities in a form customized to a company's characteristics (sector, size, etc.) by taking a value chain perspective, thereby contributing to the creation of new business opportunities through the proposed model. It utilizes the trademark database of the Korean Intellectual Property Office (KIPO) and the proprietary company information database of Korea Enterprise Data (KED). These data are key to discovering new business opportunities through analysis of the trademarks of competitors and advanced businesses (Module 1) and trading analysis of competitors found in the KED (Module 2).

Keywords: value chain, trademark, trading analysis, new business opportunity

Procedia PDF Downloads 373
25661 Towards Addressing the Cultural Snapshot Phenomenon in Cultural Mapping Libraries

Authors: Mousouris Spiridon, Kavakli Evangelia

Abstract:

This paper focuses on Digital Libraries (DLs) that contain and geovisualise cultural data, highlighting the need to define them as a separate category, termed Cultural Mapping Libraries, based on their inherent connection of culture with geographic location and their design requirements in support of visual representation of cultural data on the map. An exploratory analysis of DLs that conform to the above definition brought forward the observation that existing Cultural Mapping Libraries fail to geovisualise the entirety of cultural data per point of interest, resulting in what we term the Cultural Snapshot phenomenon. The existence of this phenomenon was reinforced by the results of a systematic bibliographic survey. In order to address the Cultural Snapshot, this paper proposes the use of Semantic Web principles to efficiently interconnect spatial cultural data through time, per geographic location. In this way, points of interest are transformed into scenery in which culture evolves over time. This evolution is expressed as occurrences taking place chronologically, in an event-oriented approach, a conceptualization also endorsed by the CIDOC Conceptual Reference Model (CIDOC CRM). In particular, we posit the use of CIDOC CRM as the baseline for defining the logic of Cultural Mapping Libraries as part of the Culture Domain, in accordance with the Digital Library Reference Model, in order to define the rules of cultural data management by the system. Our future goal is to transform this conceptual definition into inferencing rules that resolve the Cultural Snapshot and lead to a more complete geovisualisation of cultural data.
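
A minimal rdflib sketch of the event-oriented idea: two timestamped events at the same place, so the location accumulates a history rather than a single snapshot. The URIs are hypothetical, and the CRM time-span is simplified to a literal (CIDOC CRM properly models it as an E52_Time-Span entity).

```python
from rdflib import Graph, Literal, Namespace

CRM = Namespace("http://www.cidoc-crm.org/cidoc-crm/")
EX = Namespace("http://example.org/")   # hypothetical data namespace

g = Graph()
place = EX["plaka_athens"]

for event_id, label, year in [("e1", "Mural painted", "1967"),
                              ("e2", "Mural restored", "2015")]:
    event = EX[event_id]
    g.add((event, CRM["P7_took_place_at"], place))          # E5_Event at E53_Place
    g.add((event, CRM["P4_has_time-span"], Literal(year)))  # simplified time-span
    g.add((event, EX["label"], Literal(label)))

# All events at one point of interest, in chronological order: no single snapshot
for e in sorted(g.subjects(CRM["P7_took_place_at"], place),
                key=lambda e: str(g.value(e, CRM["P4_has_time-span"]))):
    print(g.value(e, CRM["P4_has_time-span"]), g.value(e, EX["label"]))
```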

Keywords: digital libraries, semantic web, geovisualization, CIDOC-CRM

Procedia PDF Downloads 109
25660 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria

Authors: Ibrahim Rabiu Darazo

Abstract:

This research examines the impact of e-banking on the operational efficiency of banks in Nigeria, using selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc) as a case. The research is quantitative and uses both primary and secondary sources of data. Questionnaires were used to obtain accurate data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using chi-square, whereas the secondary data were obtained from relevant textbooks, journals and websites. It is clear from the findings that the use of e-banking has improved the efficiency of these banks in terms of providing efficient services to customers electronically, using Internet banking, telephone banking and ATMs; reducing the time taken to serve customers; allowing new customers to open an account online; and giving customers access to their accounts at all times, 24/7. E-banking provides access to customer information from the database, and the cost of cheques and postage has been eliminated. The recommendations at the end of the research include: banks should keep their electronic equipment up to date; e-fraud (internal and external) should be controlled; banks should employ qualified manpower; and biometric ATMs should be introduced to reduce fraud with ATM cards, as is done in other countries such as the USA.
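
A minimal sketch of the chi-square analysis mentioned, using scipy with a hypothetical contingency table (the counts below are invented for illustration, not the study's survey data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical responses: group vs. "has e-banking improved service efficiency?"
#                    yes  no
observed = np.array([[62, 13],    # staff
                     [58, 17]])   # customers

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# p < 0.05 would indicate that the response depends on the respondent group
```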

Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs

Procedia PDF Downloads 333
25659 A Case Study on the Census of Technological Capacities in Health Care in Rural Sanitary Institutions in South Cameroon

Authors: Doriane Micaela Andeme Bikoro, Samuel Fosso Wamba, Jean Robert Kala Kamdjoug

Abstract:

Currently, one of the leading fields in the market for technological innovation is digital health. In developed countries, this booming innovation is advancing at exponential speed, and we understand that e-health could likewise revolutionize the practice of medicine elsewhere and thereby remedy the many failures observed in medical care. Everything suggests that future technology is oriented towards the medical sector. The aim of this work is to explore both the technological resources and the potential of health care based on new technologies, through a case study in a rural area of Southern Cameroon. Among other things, we take a census of the shortcomings and problems encountered and propose appropriate solutions. The methodology used here is essentially qualitative: we used two qualitative data collection techniques, direct observation and interviews. We spent two weeks in the field observing and conducting semi-directive interviews with some of those responsible for the health structures. This study was conducted in three health facilities in the south of the country, comprising two health centres and a rural hospital. Many technological shortcomings were identified in the day-to-day management of these health facilities, and especially in the administration of health care to patients; we note major problems such as the digital divide, the lack of qualified personnel, and the isolation of this area. Various proposals are therefore made to improve the health sector in Cameroon, both technologically and medically.

Keywords: Cameroon, capacities, census, digital health, qualitative method, rural area

Procedia PDF Downloads 144
25658 Suitability of Satellite-Based Data for Groundwater Modelling in Southwest Nigeria

Authors: O. O. Aiyelokun, O. A. Agbede

Abstract:

Numerical modelling of groundwater flow can be susceptible to calibration errors due to the lack of adequate ground-based hydro-meteorological stations in river basins. Groundwater resources management in Southwest Nigeria is currently challenged by overexploitation, lack of planning and monitoring, urbanization and climate change; hence, for models to be adopted as decision support tools for sustainable groundwater management, they must be adequately calibrated. Since river basins in Southwest Nigeria are characterized by missing data and lack adequate ground-based hydro-meteorological stations, the need to adopt satellite-based data for constructing distributed models is crucial. This study seeks to evaluate the suitability of satellite-based data as a substitute for ground-based data in computing boundary conditions, by determining whether ground- and satellite-based meteorological data fit well in the Ogun and Oshun River basins. The Climate Forecast System Reanalysis (CFSR) global meteorological dataset was first obtained in daily form and converted to monthly form for a period of 432 months (January 1979 to June 2014). Ground-based meteorological data for Ikeja (1981-2010), Abeokuta (1983-2010) and Oshogbo (1981-2010) were then compared with the CFSR data using goodness-of-fit (GOF) statistics. The study revealed that, based on mean absolute error (MAE), coefficient of correlation (r) and coefficient of determination (R²), all meteorological variables except wind speed fit well. It further revealed that maximum and minimum temperature, relative humidity and rainfall had a high index of agreement (d) and ratio of standard deviations (rSD), implying that the CFSR dataset can be used to compute boundary conditions such as groundwater recharge and potential evapotranspiration. The study concluded that satellite-based data such as the CFSR should be used as input when constructing groundwater flow models in river basins in Southwest Nigeria, where the majority of river basins are partially gauged and characterized by long gaps in hydro-meteorological records.
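
A sketch of the goodness-of-fit statistics named in the abstract, computed with NumPy over synthetic stand-in series (d follows Willmott's index of agreement; the series themselves are invented):

```python
import numpy as np

def gof_stats(obs: np.ndarray, sat: np.ndarray) -> dict:
    """Goodness-of-fit between ground observations and satellite-based estimates."""
    mae = np.mean(np.abs(sat - obs))
    r = np.corrcoef(obs, sat)[0, 1]
    d = 1 - np.sum((sat - obs) ** 2) / np.sum(
        (np.abs(sat - obs.mean()) + np.abs(obs - obs.mean())) ** 2)  # Willmott's d
    rsd = sat.std() / obs.std()    # ratio of standard deviations
    return {"MAE": mae, "r": r, "R2": r ** 2, "d": d, "rSD": rsd}

# Synthetic monthly series standing in for a station record and the nearest CFSR cell
rng = np.random.default_rng(2)
ground = rng.normal(31.0, 2.0, size=432)          # e.g., max temperature, deg C
cfsr = ground + rng.normal(0.0, 0.8, size=432)    # stand-in satellite estimates
print(gof_stats(ground, cfsr))
```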

Keywords: boundary condition, goodness of fit, groundwater, satellite-based data

Procedia PDF Downloads 130
25657 Evaluation of Two DNA Extraction Methods for Minimal Porcine (Pork) Detection in Halal Food Sample Mixture Using Taqman Real-time PCR Technique

Authors: Duaa Mughal, Syeda Areeba Nadeem, Shakil Ahmed, Ishtiaq Ahmed Khan

Abstract:

The identification of porcine DNA in Halal food items is critical to ensuring compliance with dietary restrictions and religious beliefs; in Islam, porcine material is prohibited, as clearly stated in the Quran (Surah Al-Baqarah, Ayat 173). The purpose of this study was to compare two DNA extraction procedures for detecting 0.001% porcine DNA in processed Halal food sample mixtures containing chicken, camel, veal, turkey and goat meat, using the TaqMan Real-Time PCR technique. Two different commercial kit protocols were compared. The processed sample mixtures were prepared by spiking known concentrations of porcine DNA into non-porcine food matrices. The TaqMan Real-Time PCR technique was then used to target a particular porcine gene in the extracted DNA samples, which were quantified after extraction. The amplification results were evaluated for sensitivity, specificity, and reproducibility. The results demonstrated that both DNA extraction techniques can detect 0.01% porcine DNA in Halal food sample mixtures; however, compared to the alternative approach, the Eurofins GeneScan GeneSpin DNA Isolation kit showed better sensitivity and specificity, together with great repeatability and minimal variance across repeats. Quantification of DNA was done using a fluorometric assay. In conclusion, the comparison of DNA extraction methods for detecting porcine DNA in Halal food sample mixtures using the TaqMan Real-Time PCR technique reveals that the kit-based approach outperforms the other method in terms of sensitivity, specificity, and repeatability. This research contributes to the development of reliable and standardized techniques for detecting porcine DNA in Halal food items, supporting religious conformity and food authentication.

Keywords: real time PCR (qPCR), DNA extraction, porcine DNA, halal food authentication, religious conformity

Procedia PDF Downloads 78
25656 Bioremediation of Paper Mill Effluent by Microbial Consortium Comprising Bacterial and Fungal Strain and Optimizing the Effect of Carbon Source

Authors: Priya Tomar, Pallavi Mittal

Abstract:

Bioremediation has been recognized as an environmentally friendly and less expensive method that involves natural processes resulting in the efficient conversion of hazardous compounds into innocuous products. Pulp and paper mill effluent is one of the most polluting effluents among those obtained from polluting industries. The colouring bodies present in the wastewater from pulp and paper mills are organic in nature, comprising wood extractives, tannin, resins, synthetic dyes, lignin and its degradation products formed by the action of chlorine on lignin, which impart an offensive colour to the water. These mills use different chemical processes for paper manufacturing, through which lignified chemicals are released into the environment; therefore, the chemical oxygen demand (COD) of the emanating stream is quite high. To address this problem, we present some new techniques that were developed to treat paper mill effluents efficiently. In the present study, we utilized consortia of fungal and bacterial strains, with the treatments named C1, C2 and C3, for the decolourization of paper mill effluent. During the study, the role of a carbon source, i.e., glucose, was examined for decolourization. From the results, it was observed that a maximum colour reduction of 66.9%, COD reduction of 51.8%, TSS reduction of 0.34%, TDS reduction of 0.29% and a pH change to 4.2 were achieved by the consortium of Aspergillus niger with Pseudomonas aeruginosa. The data indicate that the consortium of Aspergillus niger with Pseudomonas aeruginosa gives better results with glucose.

Keywords: bioremediation, decolourization, black liquor, mycoremediation

Procedia PDF Downloads 411
25655 Automatic Content Curation of Visual Heritage

Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara Défayes, Mathieu Salzmann, Frédéric Kaplan, Nicolas Henchoz

Abstract:

Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justifying the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, of which 52,000 posters have been digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting associations performed by curators and professional designers. This challenge has similarities with the domain of playlist algorithms, where artificial intelligence techniques, and more specifically deep-learning algorithms, have recently been used to facilitate playlist generation; promising results were obtained with Recurrent Neural Networks (RNN) trained on manually generated playlists and paired with clusters of features extracted from songs. We used the same principles to create the proposed algorithm, but applied to a challenging medium: posters. First, a convolutional autoencoder was trained to extract features from the posters, using the 52,000 digital posters as the training set. The poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones; its training set was composed of poster sequences extracted from a collection of books from the Gestaltung Museum of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster with the best proximity to the previous poster is selected, where proximity is computed as the mean square distance between poster features. To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences, the manual sequences being created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25% and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45% and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to the designer sequences, and as reported by participants, model and random sequences lacked thematic continuity. According to these results, the proposed model generates better poster sequences than random sampling, and our algorithm is sometimes able to outperform a professional designer. As a next step, the proposed algorithm should include the possibility of creating sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and to provide a tool to curate large sets of data, with a permanent renewal of the presented content.
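
A condensed sketch of the described pipeline (convolutional encoder, k-means clustering of features, LSTM next-cluster prediction). The layer sizes, the 50-cluster count, and the 64x64 RGB input are assumptions for illustration, not the paper's actual configuration.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class Encoder(nn.Module):
    """Encoder half of a convolutional autoencoder: 64x64 RGB poster -> 64-d feature."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * 16 * 16, 64))
    def forward(self, x):
        return self.net(x)

class NextCluster(nn.Module):
    """LSTM that predicts the next feature cluster from the sequence so far."""
    def __init__(self, k=50, emb=32, hid=64):
        super().__init__()
        self.emb = nn.Embedding(k, emb)
        self.rnn = nn.LSTM(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, k)
    def forward(self, seq):                  # seq: (batch, length) of cluster ids
        h, _ = self.rnn(self.emb(seq))
        return self.out(h[:, -1])            # logits over the next cluster

# Feature extraction and clustering over (here synthetic) posters
features = Encoder()(torch.randn(200, 3, 64, 64)).detach().numpy()
clusters = KMeans(n_clusters=50, n_init=10).fit_predict(features)
logits = NextCluster()(torch.tensor(clusters[:10]).unsqueeze(0))  # predict the 11th
```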

Keywords: artificial intelligence, digital humanities, serendipity, design research

Procedia PDF Downloads 184
25654 Application of Enzyme-Mediated Calcite Precipitation for Surface Control of Gold Mining Tailing Waste

Authors: Yogi Priyo Pradana, Heriansyah Putra, Regina Aprilia Zulfikar, Maulana Rafiq Ramadhan, Devyan Meisnnehr, Zalfa Maulida Insani

Abstract:

This paper studies the effects and mechanisms of treating fine-grained tailings with Enzyme-Mediated Calcite Precipitation (EMCP). The grouting solution consists of reagents (CaCl₂ and CO(NH₂)₂) and urease enzyme, which react to produce CaCO₃. In sample preparation, test tubes were used to investigate the precipitation rate of calcite. 75 mL of grouting solution was added per mould sample, poured to a level 5 mm above the top surface of the tailings to ensure the entire surface was submerged. The samples were left open in the cylinder for up to 3 days of curing. The direct mixing method was adopted so that cementation occurred evenly throughout. The relationship between the UCS test results and the calcite precipitation rate likely indicates that the amount of calcite deposited in the treated tailings controls their strength. The samples were analyzed using atomic absorption spectroscopy (AAS) to evaluate metal and metalloid content. The calcium carbonate deposited in the tailings is expected to strengthen the bonds between tailing grains, which slip easily on the banks of the tailings dam. The EMCP method is thus expected to strengthen tailings surfaces for erosion control.

Keywords: tailing, EMCP, UCS, AAS

Procedia PDF Downloads 138
25653 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data

Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao

Abstract:

Accurate calculation of wellbore pressure is of great significance for preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solving procedures, which reduces calculation efficiency and makes it difficult to meet the demands of dynamic wellbore pressure control. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, significantly improving its efficiency and accuracy. However, due to the 'black box' property of intelligent algorithms, existing intelligent models of wellbore pressure perform poorly outside the scope of their training data and overreact to data noise, often producing abnormal calculation results. In this study, the multiphase flow mechanism is embedded into the objective function of a neural network model as a constraint condition, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The multiphase flow constraint makes the predictions of the neural network model more consistent with the distribution law of wellbore pressure, overcoming the black-box attribute of the model to some extent: accuracy on an independent test data set is further improved, and abnormal calculated values essentially disappear. This method, driven jointly by MPD data and the multiphase flow mechanism, is the main way to predict wellbore pressure accurately and efficiently in the future.
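
A minimal sketch of the constrained objective in PyTorch, assuming a physics-based pressure estimate is available per sample; the network size, eight input features, and penalty weight are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Regressor from drilling parameters (depth, flow rate, mud density, ...) to pressure
net = nn.Sequential(nn.Linear(8, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))

def constrained_loss(x, p_measured, p_mechanism, lam=0.1):
    """Data misfit plus a penalty tying predictions to the multiphase-flow estimate."""
    p_hat = net(x)
    data_term = F.mse_loss(p_hat, p_measured)     # fit the MPD measurements
    mech_term = F.mse_loss(p_hat, p_mechanism)    # stay near the mechanism model
    return data_term + lam * mech_term

# Toy batch: synthetic stand-ins for one training step
x = torch.randn(32, 8)
p_meas = torch.randn(32, 1)
p_mech = p_meas + 0.05 * torch.randn(32, 1)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss = constrained_loss(x, p_meas, p_mech)
loss.backward()
opt.step()
```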

Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive

Procedia PDF Downloads 174