Search results for: data analyses
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26900

25790 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying cancer subtypes helps improve the efficacy and reduce the toxicity of treatments by providing clues for target therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high-dimensional, with a large number of features relative to the number of samples. Hierarchical clustering and K-Means are often used in the analysis of gene expression data. Several cluster validation techniques are used in validating the clusters. Heatmaps are an effective external validation method that allows comparison of the identified classes with clinical variables and visual analysis of the classes.
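
As a minimal sketch of the preprocessing, clustering, and internal validation steps described above (our illustration with synthetic data, not code from the study), the pipeline could look like this in scikit-learn:

```python
# Minimal sketch of the preprocessing -> clustering -> validation pipeline;
# the expression matrix below is synthetic, not study data.
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))          # 60 samples x 2000 genes (high-dimensional)

# Preprocessing: keep high-variance genes, then standardize.
X = VarianceThreshold(threshold=0.9).fit_transform(X)
X = StandardScaler().fit_transform(X)

# Clustering: try several k and report the silhouette (internal validation).
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, round(silhouette_score(X, labels), 3))
```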

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 144
25789 Learning Analytics in a HiFlex Learning Environment

Authors: Matthew Montebello

Abstract:

Student engagement within a virtual learning environment generates masses of data points that can significantly contribute to the learning analytics that lead to decision support. Ideally, similar data is collected during student interaction with a physical learning space, and as a consequence, data is present at a large scale, even in relatively small classes. In this paper, we report on such an occurrence during classes held in a HiFlex modality as we investigate the advantages of adopting such a methodology. We plan to take full advantage of the learner-generated data in an attempt to further enhance the effectiveness of the adopted learning environment. This could shed crucial light on the operating modalities that higher education institutions around the world will switch to in a post-COVID era.

Keywords: HiFlex, big data in higher education, learning analytics, virtual learning environment

Procedia PDF Downloads 193
25788 Influence of Hydrogen Ion Concentration on the Production of Bio-Synthesized Nano-Silver

Authors: M.F. Elkady, Sahar Zaki, Desouky Abd-El-Haleem

Abstract:

Silver nanoparticles (AgNPs) are already widely prepared using different technologies. However, there are limited data on the effects of hydrogen ion concentration on nano-silver production. In this investigation, the impact of the pH of the reaction medium on the particle size, agglomeration and yield of the bio-synthesized silver was established. Quasi-spherical silver nanoparticles were synthesized through the green biosynthesis production process using the Egyptian E. coli bacterial strain 23N at different pH values. The formation of AgNPs was confirmed with ultraviolet-visible spectra through identification of their characteristic peak at 410 nm. The quantitative production yield and the orientation planes of the produced nano-silver were examined using energy-dispersive X-ray spectroscopy (EDS) and X-ray diffraction (XRD). Quantitative analyses indicated that the silver production yield was promoted at elevated pH, attributed to an increased reduction rate of the silver precursor through both chemical and biological processes. As a result, the number of nuclei, and thus the size of the silver nanoparticles, was tunable by changing the pH of the reaction system. Accordingly, the morphological structure and size of the produced silver and its aggregates were determined using scanning electron microscopy (SEM) and transmission electron microscopy (TEM) images. It was observed that increasing the pH of the reaction medium promoted the aggregation of silver clusters. However, the presence of strain 23N biomass decreased the possibility of silver aggregation at pH 7.

Keywords: silver nanoparticles, biosynthesis, reaction media pH, nano-silver characterization

Procedia PDF Downloads 367
25787 Li-Fi Technology: Data Transmission through Visible Light

Authors: Shahzad Hassan, Kamran Saeed

Abstract:

People are always in search of Wi-Fi hotspots because Internet access is a major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. In order to address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what we know as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. Basically, the data is transmitted through light-emitting diodes, which can vary the intensity of light very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology. In other words, it is a 5G technology which uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is good, but Li-Fi can be considered favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings many data-related qualities, such as efficiency, security and large throughput, to the table of wireless communication. All in all, it can be said that Li-Fi is going to be a future phenomenon where the presence of light will mean access to the Internet as well as speedy data transfer.
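
The LED intensity modulation described above can be illustrated with a toy on-off keying model (our sketch, not from the paper): bits drive the LED level, and a threshold on the received light recovers them.

```python
# Toy on-off keying (OOK) model of Li-Fi: the LED intensity carries the bits.
import numpy as np

def modulate(data: bytes) -> np.ndarray:
    """Map each bit to an LED intensity level (1.0 = on, 0.0 = off)."""
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    return bits.astype(float)

def demodulate(levels: np.ndarray, threshold: float = 0.5) -> bytes:
    """Threshold the received light levels back into bits."""
    bits = (levels > threshold).astype(np.uint8)
    return np.packbits(bits).tobytes()

signal = modulate(b"Li-Fi")
noisy = signal + np.random.default_rng(1).normal(0, 0.1, signal.size)  # channel noise
assert demodulate(noisy) == b"Li-Fi"
```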

Keywords: communication, LED, Li-Fi, Wi-Fi

Procedia PDF Downloads 335
25786 An Analysis of Humanitarian Data Management of Polish Non-Governmental Organizations in Ukraine Since February 2022 and Its Relevance for Ukrainian Humanitarian Data Ecosystem

Authors: Renata Kurpiewska-Korbut

Abstract:

Making the assumption that the use and sharing of data generated in humanitarian action constitute a core function of humanitarian organizations, the paper analyzes the position of the largest Polish humanitarian non-governmental organizations in the humanitarian data ecosystem in Ukraine and their approach to non-personal and personal data management since February 2022. Expert interviews and document analysis of non-profit organizations providing a direct response in the Ukrainian crisis context, i.e., the Polish Humanitarian Action, Caritas, the Polish Medical Mission, the Polish Red Cross, and the Polish Center for International Aid, combined with the theoretical perspective of contingency theory, whose central point is that the context or a specific set of conditions determines the way of behavior and the choice of methods of action, help to examine the significance of data complexity and of an adaptive approach to data management by relief organizations in the humanitarian supply chain network. The purpose of this study is to determine how well-established and accurate internal procedures and good practices for using and sharing data (including safeguards for sensitive data) are implemented by the surveyed organizations, which have comparable human and technological capabilities, and how they are adjusted to Ukrainian humanitarian settings and data infrastructure. The study also poses the fundamental question of whether this crisis experience will have a determining effect on their future performance. The obtained findings indicate that Polish humanitarian organizations in Ukraine, which have their own unique codes of conduct and effective managerial data practices determined by contingencies, have limited influence on improving the situational awareness of other assistance providers in the data ecosystem, despite their attempts to undertake interagency work in the area of data sharing.

Keywords: humanitarian data ecosystem, humanitarian data management, Polish NGOs, Ukraine

Procedia PDF Downloads 88
25785 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases

Authors: Daniel C. Bonzo

Abstract:

Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think of extrapolation of data as extending information and conclusions from one estimand to another estimand. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is being accepted increasingly by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equally weighted or with weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation results showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
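
A minimal sketch of the proposed weighting schemes (ours, with made-up episode-level data): each subject contributes a cluster of observations, and the overall point estimate is formed either with equal cluster weights or with weights proportional to cluster size.

```python
import numpy as np

# Hypothetical clusters: per-subject episode-level measurements of varying size.
clusters = [np.array([1.2, 0.8, 1.0]), np.array([2.1]), np.array([0.9, 1.4])]
means = np.array([c.mean() for c in clusters])
sizes = np.array([c.size for c in clusters])

equal_w = means.mean()                        # equally weighted clusters
size_w = np.average(means, weights=sizes)     # weights proportional to cluster size
print(equal_w, size_w)
```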

Keywords: clustered data, estimand, extrapolation, mixed model

Procedia PDF Downloads 129
25784 Leisure, Domestic or Professional Activities so as to Prevent Cognitive Decline: Results of the FreLE Longitudinal Study

Authors: Caroline Dupre, David Hupin, Christ Goumou, Francois Belan, Frederic Roche, Thomas Celarier, Bienvenu Bongue

Abstract:

Background: Previous cohorts have been notably criticized for not studying the different types of physical activity and not investigating household activities. The objective of this work was to analyse the relationship between physical activity and cognitive decline in older people living in the community. The impact of the type of physical activity on the results was also analysed. Methods: The study used data from the longitudinal and observational study FrèLE (FRagility: Longitudinal Study of Expressions). The collected data included socio-demographic variables, lifestyle, and health status (frailty, comorbidities, cognitive status, depression). Cognitive decline was assessed using the Mini-Mental State Examination (MMSE) and the Montreal Cognitive Assessment (MoCA). Physical activity was assessed with the Physical Activity Scale for the Elderly (PASE). This tool is structured in three sections: leisure activities, domestic activities, and professional activities. Logistic regressions and proportional hazards regression models (Cox) were used to estimate the risk of cognitive disorders. Results: At baseline, the prevalence of cognitive disorders was 6.9% according to the MMSE. In total, 1167 participants without cognitive disorders were included in the analysis. The mean age was 77.4 years, and 52.1% of the participants were women. After a two-year follow-up, cognitive disorders were found in 53 participants (4.5%). Physical activity at baseline was lower in older adults for whom cognitive decline was observed after two years of follow-up. Subclass analyses showed that leisure and domestic activities were associated with cognitive decline, but professional activities were not. Conclusions: The analysis showed a relationship between cognitive disorders and the type of physical activity. The current study will be completed using the MoCA for mild cognitive impairment. These findings, compared with those of other ongoing studies, will contribute to the debate on the beneficial effects of physical activity on cognition.
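
As a sketch of the survival-modelling step (ours; the variable names and data are synthetic, not the FrèLE data), a Cox proportional hazards model relating baseline PASE score and age to incident cognitive disorder could be fitted with the lifelines package:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
pase = rng.uniform(20, 200, n)                 # baseline physical activity score
age = rng.uniform(65, 90, n)
hazard = np.exp(0.03 * (age - 75) - 0.005 * pase)
time = rng.exponential(1 / hazard)             # latent event times, in years
event = (time <= 2.0).astype(int)              # observed within 2-year follow-up
df = pd.DataFrame({"time": np.minimum(time, 2.0), "event": event,
                   "pase": pase, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratio < 1 for PASE would suggest a protective effect
```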

Keywords: aging, cognitive function, physical activity, mixed models

Procedia PDF Downloads 120
25783 Phylogenetic Relationships of Aproaerema Simplexella (Walker) and the Groundnut Leaf Miner Aproaerema Modicella (Deventer) (Lepidoptera: Gelechiidae) Collected from Australia, India, Mozambique, and South Africa

Authors: Makhosi Buthelezi

Abstract:

Mitochondrial DNA cytochrome c oxidase I (COI) gene analyses linked the South African groundnut leaf miner (GLM) to the Australian soya bean moth Aproaerema simplexella (Walker) and the Indian Aproaerema modicella (Deventer). Thus, the genetic relatedness of GLM, A. simplexella, and A. modicella was examined by performing mitochondrial (COI, cytochrome oxidase subunit II (COII), cytochrome b (CYTB)) and nuclear (ribosomal 28S (28S), intergenic spacer elongation factor-1 alpha (EF-1α)) gene analyses on 44 specimens collected from South Africa, four from Mozambique, and three each from single locations in India and Australia. Phylogenetic analyses were conducted using the Maximum Parsimony (MP) and Neighbour-Joining (NJ) methods. All of the datasets of the five DNA gene regions that were sequenced were also analyzed using the Basic Local Alignment Search Tool (BLAST) to find the closest matches for inclusion in the phylogenetic trees as outgroups and for purposes of information. In the phylogenetic trees for COI, COII, CYTB and EF-1α, a similar pattern was observed in the way the sequences assembled into different groups; i.e., some sequences of A. simplexella from Australia grouped separately from the others, but some Australian sequences grouped with those of the GLM from South Africa, India, and Mozambique. In the phylogenetic tree for 28S, all sequences from South Africa, Australia, India, and Mozambique grouped together and formed one group. Genetic pairwise distances ranged from 0.97% to 3.60% for COI, from 0.19% to 2.32% for COII, from 0.25% to 9.77% for CYTB, and from 0.48% to 6.99% for EF-1α. Results of this study indicate that these populations are genetically related and presumably constitute a single species. Thus, further molecular and morphological studies need to be undertaken in order to resolve this apparent conundrum in the taxonomy of these populations.
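
A minimal neighbour-joining sketch with Biopython (our illustration; the alignment file name is hypothetical) shows how such trees are typically built from aligned sequences:

```python
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# Hypothetical input: an aligned FASTA of COI sequences from the four countries.
alignment = AlignIO.read("coi_aligned.fasta", "fasta")

dm = DistanceCalculator("identity").get_distance(alignment)  # pairwise distances
tree = DistanceTreeConstructor().nj(dm)                      # neighbour-joining tree
Phylo.draw_ascii(tree)
```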

Keywords: aproaerema modicella, aproaerema simplexella, mitochondrial DNA, nuclear DNA

Procedia PDF Downloads 191
25782 Estimation of Adult Patient Doses for Chest X-Ray Diagnostic Examinations in a Tertiary Institution Health Centre

Authors: G. E. Okungbowa, H. O. Adams, S. E. Eze

Abstract:

This study estimates adult patient doses for chest X-ray diagnostic examinations of newly admitted undergraduate students attending a tertiary institution health centre as part of their routine clearance and check-up on admission into the institution. A total of 531 newly admitted undergraduate students were recruited for this survey in the first quarter of 2016 (January to March 2016). CALDOSE_X 5.0 software was used to compute the Entrance Surface Dose (ESD) and Effective Dose (ED), while the Statistical Package for the Social Sciences (SPSS) version 21.0 was used to carry out the statistical analyses. The basic patient data and exposure parameters required by the software are age, sex, examination type, projection posture, tube potential and current-time product. The mean Entrance Surface Dose and Effective Dose of the undergraduate students were calculated using the software, and the values were compared with the existing literature and internationally established diagnostic reference levels. The mean ESD calculated is 0.29 mGy, and the mean effective dose is 0.04 mSv. The values of ESD and ED obtained are below the internationally established diagnostic reference levels, which could be attributed to the good radiographic techniques employed during the chest X-ray procedures for these students.

Keywords: x-ray, dose, examination, chest

Procedia PDF Downloads 179
25781 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System

Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu

Abstract:

Uninterrupted and continuous satellite communication throughout the whole orbit period is becoming more indispensable every day. Data relay systems have been developed and built for various high/low data rate information exchanges, like the TDRSS of the USA and the EDRSS of Europe. In these missions, a couple of task-dedicated communication satellites exist. In this regard, a data relay system for Turkey is defined for exchanging low data rate information (i.e., TTC) with Earth-observing LEO satellites, appointing commercial GEO communication satellites all over the world. First, a justification of this attempt is given, demonstrating duration enhancements in the link. A discussion of the preference for RF communication instead of laser communication is also given. Then, the preferred GEO communication satellites, including TURKSAT4A, which already belongs to Turkey, are given, together with the coverage enhancements obtained through STK simulations and the corresponding link budget. A block diagram of the communication system on the LEO satellite is also given.
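
The link-budget reasoning can be illustrated with the standard free-space path loss relation, FSPL(dB) = 20·log10(d_km) + 20·log10(f_GHz) + 92.45; the numbers below are placeholders, not values from the paper:

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Placeholder LEO-to-GEO inter-satellite link: ~40,000 km at Ku band (14 GHz).
loss = fspl_db(40_000, 14.0)
eirp_dbw, rx_gain_db = 20.0, 35.0          # assumed transmit EIRP and receive gain
received_dbw = eirp_dbw + rx_gain_db - loss
print(f"FSPL = {loss:.1f} dB, received power = {received_dbw:.1f} dBW")
```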

Keywords: communication, GEO satellite, data relay system, coverage

Procedia PDF Downloads 433
25780 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product

Authors: Tanawat Hongthai, Dusit Thanapatay

Abstract:

This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag to assess the feasibility of implementing genuine product authentication. We organize the research on encryption and genuine-product checking into four major categories: concept, infrastructure, development and applications. The results show that a passive NFC Forum Type 2 tag can be configured to be compatible with the NFC Data Exchange Format (NDEF) and can have its data automatically and partially updated when an NFC field is present.
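
As a sketch of encrypting a product record before it is written into the tag's NDEF message (our assumption; the paper does not name its cipher), an authenticated scheme such as AES-GCM could be used:

```python
# Sketch: encrypt a product record before storing it as an NDEF payload.
# AES-GCM is our assumption; the paper does not specify its encryption scheme.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # shared secret held by the verifier
nonce = os.urandom(12)

record = b"SKU=12345;SERIAL=A9F3;BATCH=2016-07"
payload = nonce + AESGCM(key).encrypt(nonce, record, None)  # written into the tag

# Verifier side: split nonce and ciphertext, then authenticate-and-decrypt.
assert AESGCM(key).decrypt(payload[:12], payload[12:], None) == record
```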

Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC

Procedia PDF Downloads 272
25779 The Association of Slope Failure and Lineament Density along the Ranau-Tambunan Road, Sabah, Malaysia

Authors: Norbert Simon, Rodeano Roslee, Abdul Ghani Rafek, Goh Thian Lai, Azimah Hussein, Lee Khai Ern

Abstract:

The 54 km stretch of the Ranau-Tambunan (RTM) road in Sabah is subjected to slope failures almost every year. This study focuses on identifying sections of the road that are susceptible to failure based on temporal landslide density and lineament density analyses. In addition to these analyses, the rock slopes in several sections of the road were assessed using the geological strength index (GSI) technique. The analysis involved 148 landslides recorded in 1978, 1994, 2009 and 2011. The landslides were digitized as points, and the point density was calculated for every 1 km² of the road. The lineaments of the area were interpreted from the Landsat 7 15 m panchromatic band. The lineament density was later calculated for every 1 km² of the area using a technique similar to the slope failure density calculation. The landslide and lineament densities were classified into three classes that indicate the level of susceptibility (low, moderate, high). Subsequently, the two density maps were overlaid to produce the final susceptibility map. The combination of the two high-susceptibility classes from these maps signifies a high potential for slope failure in those locations in the future. The final susceptibility map indicates that 22 sections of the road are highly susceptible. Seven rock slopes were assessed along the RTM road using the GSI technique. The assessment found that rock slopes along this road are highly fractured and weathered and can be classified into the fair to poor categories. The poor condition of the rock slopes can be attributed to the high lineament density present in the study area. Six of the rock slopes are located in the high-susceptibility zones. A detailed investigation of the 22 highly susceptible sections of the RTM road should be conducted due to their higher susceptibility to failure, in order to prevent untoward incidents to road users in the future.
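
The per-square-kilometre density calculation used for both landslides and lineaments can be sketched as follows (our illustration with random points; coordinates in metres):

```python
import numpy as np

rng = np.random.default_rng(2)
x, y = rng.uniform(0, 10_000, 148), rng.uniform(0, 3_000, 148)  # landslide points

# Count points per 1 km x 1 km cell, then classify into low/moderate/high.
counts, _, _ = np.histogram2d(x, y, bins=[10, 3], range=[[0, 10_000], [0, 3_000]])
low, high = np.percentile(counts[counts > 0], [33, 66])
classes = np.digitize(counts, [low, high])  # 0 = low, 1 = moderate, 2 = high
print(classes.T)
```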

Keywords: GSI, landslide, landslide density, landslide susceptibility, lineament density

Procedia PDF Downloads 391
25778 Critical Factors Boosting the Future Economy of Eritrea: An Empirical Approach

Authors: Biniam Tedros Kahsay, Yohannes Yebabe Tesfay

Abstract:

Eritrea is a country in East Africa. The country neighbors Djibouti, Ethiopia, and Sudan and is bordered by the Red Sea. The country declared its independence from Ethiopia in 1993. Thus, Eritrea has a lot in common with the northern part of Ethiopia in tradition, religion, and languages. Many economists have suggested that Eritrea is in a very strategic position for world trade routes and has an impact on geopolitics. This study focused on identifying the most important factors in boosting the Eritrean economy. The paper collected large secondary data sets from the World Bank, International Trade and Tariff Data (WTO), the East African Community (EAC), the Ethiopian Statistical Agency (ESA), and the National Statistics Office (Eritrea). Economists consider economic and population growth in determining trade belts in East Africa. One of the most important trade belts that will potentially boost the Eritrean economy is the route Eritrea (Massawa) -> Eritrea (Asmara) -> Tigray (Humora) -> Tigray (Dansha) -> Gondar -> Gojjam -> Benshangual Gumuz -> {Oromia, South Sudan} -> Uganda. The estimates showed that this is one of the biggest trade routes in East Africa, with the participation of more than 150 million people. We employed various econometric analyses to predict the GDP of Eritrea, considering the future trade belts in East Africa. The results showed that the economy of Eritrea, through this trade belt, will have an elasticity estimate of 65.87% with respect to the GDP of Ethiopia, 3.32% with respect to the GDP of South Sudan, and 0.09% with respect to the GDP of Uganda. The results also showed that the existence of war has an elasticity of -93% with respect to the GDP of the country. Thus, if Eritrea wants to strengthen its economy through the East African trade belt, the country needs to permanently avoid war in the region. Essentially, the country needs to establish a collaborative platform with the northern part of Ethiopia (Tigray), as a mutual relationship with Tigray will boost the Eritrean economy. In that regard, Eritrean scholars and policymakers need to work on establishing the East African trade belt to boost their economy.
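
Elasticities of this kind are commonly read off a log-log regression, ln(GDP_Eritrea) = a + b·ln(GDP_partner) + e, where the slope b is the elasticity. A minimal sketch with statsmodels (synthetic numbers, not the study's data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
gdp_partner = rng.uniform(50, 120, 30)                       # synthetic series
gdp_eritrea = 0.5 * gdp_partner**0.66 * np.exp(rng.normal(0, 0.05, 30))

# In a log-log specification the slope is the elasticity estimate.
X = sm.add_constant(np.log(gdp_partner))
model = sm.OLS(np.log(gdp_eritrea), X).fit()
print(model.params[1])  # elasticity, ~0.66 here by construction
```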

Keywords: Eritrea, east Africa trade belt, GDP, cointegration analysis, critical path analysis

Procedia PDF Downloads 51
25777 Archaeological Study of Statues of King Thutmosis III from Luxor

Authors: Mahmoud Abualsoud

Abstract:

The era of Thutmosis III represents a transitional period between Thutmoside art and the art of the Amarna period, so we intend to show that it serves as the cradle of Amarna art. The study examines statues of King Thutmose III that were discovered in Luxor by an Egyptian mission. These statues have been transferred to the Conservation Center of the Grand Egyptian Museum (GEM) to be conserved and made ready for display at the new museum (the project of the century). We focus on three statues, chosen because they relate to different years of the king's reign; all were made of granite. The first is a kneeling statue representing the god Amun, showing King Thutmose III offering to the goddess Hathor. The second is decorated with King Thutmose III wearing the red crown, between the goddess Hathor and the royal wife Nefertari. The third shows the king offering NW vessels and bread to the god Seker. Each statue is divided into registers containing a description and is decorated with scenes of the king presenting offerings to gods. The proposed study will focus on the development that happened sequentially, according to the differences that occur in each statue. We will use comparative research to determine the workshops of these statues, whether one or several, and the distinguishing features of each, and we will examine what innovations the artisans added to royal art. The descriptions and the texts will be translated with linguistic comments. This research focuses on text analyses and technology; the paleographic information found on these objects includes the names and titles of the king. The study aims to create a manual that may help in dating the artwork of Thutmosis III. This research will be beneficial for heritage and ancient civilizations, particularly as museums like the Grand Egyptian Museum open and exhibit collections of statues. Indeed, this kind of study will open a new avenue for identifying these collections and exhibiting them in a manner commensurate with the nature of ancient Egyptian history and heritage.

Keywords: archaeological study, Giza, new kingdom, statues, royal art

Procedia PDF Downloads 65
25776 Data Hiding by Vector Quantization in Color Image

Authors: Yung Gi Wu

Abstract:

With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can easily be copied or tampered with, so security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data, although embedding the watermark certainly influences image quality. In this paper, Vector Quantization (VQ) is used to embed the watermark into the image to achieve data hiding. This kind of watermarking is invisible, which means that users will not be conscious of the existence of the embedded watermark even though the embedded image has tiny differences compared to the original image. Meanwhile, VQ carries a heavy computation burden, so we adopt a fast VQ encoding scheme based on partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks we hide in the image can be gray-level, bi-level or color images; text can also be used as a watermark. In order to test the robustness of the system, we use Photoshop to apply sharpening, cropping and altering to check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
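
The partial distortion search speed-up mentioned above abandons a codeword as soon as its accumulated distortion exceeds the best distance found so far; a minimal sketch (ours, with a random codebook):

```python
import numpy as np

def pds_nearest(block: np.ndarray, codebook: np.ndarray) -> int:
    """Nearest codeword index using partial distortion search (PDS)."""
    best_idx, best_dist = 0, float("inf")
    for i, cw in enumerate(codebook):
        dist = 0.0
        for b, c in zip(block, cw):
            dist += (b - c) ** 2
            if dist >= best_dist:      # early exit: cannot beat current best
                break
        else:
            best_idx, best_dist = i, dist
    return best_idx

rng = np.random.default_rng(4)
codebook = rng.uniform(0, 255, (256, 16))   # 256 codewords for 4x4 pixel blocks
print(pds_nearest(rng.uniform(0, 255, 16), codebook))
```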

Keywords: data hiding, vector quantization, watermark, color image

Procedia PDF Downloads 358
25775 Performance Based Seismic Retrofit of Masonry Infiled Reinforced Concrete Frames Using Passive Energy Dissipation Devices

Authors: Alok Madan, Arshad K. Hashmi

Abstract:

The paper presents a plastic analysis procedure based on the energy balance concept for performance-based seismic retrofit of multi-story, multi-bay masonry infilled reinforced concrete (R/C) frames with a 'soft' ground story using passive energy dissipation (PED) devices, with the objective of achieving a target performance level of the retrofitted R/C frame for a given seismic hazard level at the building site. The proposed energy-based plastic analysis procedure was employed to develop performance-based design (PBD) formulations for PED devices for a simulated application in the seismic retrofit of existing frame structures designed in compliance with the prevalent standard codes of practice. The PBD formulations developed for PED devices were implemented in a simulated seismic retrofit of a representative code-compliant masonry infilled R/C frame with a 'soft' ground story using friction dampers as the PED device. Non-linear dynamic analyses of the retrofitted masonry infilled R/C frames are performed to investigate the efficacy and accuracy of the proposed energy-based plastic analysis procedure in achieving the target performance level under design-level earthquakes. Results of the non-linear dynamic analyses demonstrate that the maximum inter-story drifts in the masonry infilled R/C frames with a 'soft' ground story retrofitted with friction dampers designed using the proposed PBD formulations are controlled within the target drifts under near-field as well as far-field earthquakes.

Keywords: energy methods, masonry infilled frame, near-field earthquakes, seismic protection, supplemental damping devices

Procedia PDF Downloads 293
25774 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important for reducing downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, achieving an F1-score of 83.60% compared to 24.16%.
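
A minimal sketch of one per-sensor LSTM autoencoder of the kind described (our simplification; layer sizes and shapes are placeholders, not the paper's configuration):

```python
# Sketch of one per-sensor LSTM autoencoder; in the described design there is
# one such model per sensor, trained on normal samples only.
import numpy as np
from tensorflow.keras import layers, models

timesteps, features = 60, 1   # e.g. one hour of minute-level temperature readings

model = models.Sequential([
    layers.Input(shape=(timesteps, features)),
    layers.LSTM(32),                          # encoder
    layers.RepeatVector(timesteps),
    layers.LSTM(32, return_sequences=True),   # decoder
    layers.TimeDistributed(layers.Dense(features)),
])
model.compile(optimizer="adam", loss="mse")

x_normal = np.random.rand(1000, timesteps, features).astype("float32")
model.fit(x_normal, x_normal, epochs=5, batch_size=32, verbose=0)

# The residual |x - x_hat| feeds the downstream random forest classifier.
residual = np.abs(x_normal - model.predict(x_normal, verbose=0))
```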

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 185
25773 Analysis of NMDA Receptor 2B Subunit Gene (GRIN2B) mRNA Expression in the Peripheral Blood Mononuclear Cells of Alzheimer's Disease Patients

Authors: Ali̇ Bayram, Semih Dalkilic, Remzi Yigiter

Abstract:

The N-methyl-D-aspartate (NMDA) receptor is a subtype of glutamate receptor and plays a pivotal role in learning, memory, neuronal plasticity, neurotoxicity and synaptic mechanisms. Animal experiments have suggested that glutamate-induced excitotoxic injury and NMDA receptor blockage lead to amnesia and other neurodegenerative diseases, including Alzheimer's disease (AD), Huntington's disease, and amyotrophic lateral sclerosis. The aim of this study is to investigate the association between the expression level of the NMDA receptor coding gene GRIN2B and Alzheimer's disease. The study was approved by the local ethics committees, and it was conducted according to the principles of the Declaration of Helsinki and the guidelines for Good Clinical Practice. Peripheral blood was collected from 50 patients diagnosed with AD and 49 healthy control individuals. Total RNA was isolated with the RNeasy midi kit (Qiagen) according to the manufacturer's instructions. After RNA quality and quantity were checked with a spectrophotometer, GRIN2B expression levels were determined by quantitative real-time PCR (qRT-PCR). Statistical analyses were performed; differences between the two groups were compared with the Mann-Whitney U test in GraphPad InStat with a 95% confidence interval and p < 0.05. The analyses showed that GRIN2B expression levels were downregulated in the AD patient group with respect to the control group, although the expression level of this gene showed high variability within each group. In this study, we determined that the expression level of the NMDA receptor coding gene GRIN2B was downregulated in AD patients compared with healthy control individuals. According to our results, we speculate that the GRIN2B expression level is associated with AD, but it is necessary to validate these results with a bigger sample size.
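
The group comparison reported here can be reproduced with any standard statistics stack; a minimal sketch with SciPy (synthetic expression values, not the study's data):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
# Hypothetical relative GRIN2B expression values per group.
ad_group = rng.lognormal(mean=-0.5, sigma=0.8, size=50)   # downregulated, variable
controls = rng.lognormal(mean=0.0, sigma=0.4, size=49)

stat, p = mannwhitneyu(ad_group, controls, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")  # p < 0.05 => the groups differ
```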

Keywords: Alzheimer’s disease, N-methyl-d-aspartate receptor, NR2B, GRIN2B, mRNA expression, RT-PCR

Procedia PDF Downloads 386
25772 Trends of Cutaneous Melanoma in New Zealand: 2010 to 2020

Authors: Jack S. Pullman, Daniel Wen, Avinash Sharma, Bert Van Der Werf, Richard Martin

Abstract:

Background: New Zealand (NZ) melanoma incidence rates are amongst the highest in the world. Previous studies investigating the incidence of melanoma in NZ covered the periods 1995-1999 and 2000-2004 and suggested increasing melanoma incidence rates. Aim: The aim of this study is to provide an up-to-date review of trends in cutaneous melanoma in NZ from the New Zealand Cancer Registry (NZCR) for 2010-2020. Methods: De-identified data were obtained from the NZCR, and relevant demographic and histopathologic information was extracted. Statistical analyses were conducted to calculate age-standardized incidence rates for invasive melanoma (IM) and melanoma in situ (MIS). Secondary results included Breslow thickness and melanoma subtype analyses. Results: The IM age-standardized incidence rate declined from 30.4 to 23.9 per 100,000 person-years between 2010 and 2020, alongside an increase in the MIS incidence rate from 37.1 to 50.3 per 100,000 person-years. Men had a statistically significantly higher IM incidence rate (p < 0.001) and Breslow thickness (p < 0.001) than women. Increased age was associated with a higher incidence of IM and with presentation with melanoma of greater Breslow thickness and more advanced T stage. Conclusion: The incidence of IM in NZ has decreased in the last decade and was associated with an increase in MIS incidence over the same period. This can be explained by earlier detection, dermoscopy, the maturity of prevention campaigns and/or a change in skin protection behavior.
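
Direct age standardisation, the method behind the incidence rates quoted above, weights each age-specific rate by a standard population; a minimal sketch with illustrative numbers only:

```python
import numpy as np

# Hypothetical age bands: observed cases, person-years, and standard population.
cases        = np.array([5,     40,    120])
person_years = np.array([4e5,   6e5,   3e5])
std_pop      = np.array([0.4,   0.4,   0.2])   # weights summing to 1

age_specific = cases / person_years                 # rates per person-year
asr = np.sum(age_specific * std_pop) * 100_000      # per 100,000 person-years
print(f"Age-standardised rate: {asr:.1f} per 100,000")
```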

Keywords: melanoma, incidence, epidemiology, New Zealand

Procedia PDF Downloads 58
25771 Application of Artificial Ground-Freezing to Construct a Passenger Interchange Tunnel for the Subway Line 14 in Paris, France

Authors: G. Lancellotta, G. Di Salvo, A. Rigazio, A. Davout, V. Pastore, G. Tonoli, A. Martin, P. Jullien, R. Jagow-Klaff, R. Wernecke

Abstract:

The artificial ground freezing (AGF) technique is a well-proven soil improvement approach used worldwide to construct shafts, tunnels and many other civil structures in difficult subsoil or ambient conditions. As part of the extension of Line 14 of the Paris subway, a passenger interchange tunnel between the new station at Porte de Clichy and the new Tribunal de Grande Instance has been successfully constructed using this technique. The paper presents the successful application of AGF by liquid nitrogen and brine, implemented to provide structural stability and groundwater cut-off around the passenger interchange tunnel. The working conditions were considered rather challenging due to the proximity of a hundred-year-old existing service tunnel of Line 13 and the subsoil conditions on site. Laboratory tests were carried out to determine the relevant soil parameters for the hydro-thermal-mechanical aspects and to implement numerical analyses. Monitoring data were used to check and control the development and efficiency of the freezing process, as well as to back-analyze the parameters assumed for the design, during both the freezing and thawing phases.

Keywords: artificial ground freezing, brine method, case history, liquid nitrogen

Procedia PDF Downloads 218
25770 Second Order Journalism: A Study of Selected Niche Authorities on Facebook and Twitter

Authors: Yvonne Dedzo

Abstract:

Social media has become a powerful tool for bridging the distance between individuals regardless of their location. It has become a convenient platform for public discussion and, consequently, has generated the phenomenon of citizen journalists, who have become both proactive and reactive participants in the dissemination of news, information and other epochal and historical events. This phenomenon has fueled the growth of niche authorities who deliver exceptional, democratically consequential information online. This study, therefore, investigates how selected niche authorities maintain their status on social media. Using the selective processes theory, the study further interrogates the information shared by niche authorities and analyses the extent to which their agenda of news sharing and usage is driven by a public-interest, altruistic motive or a personal-interest, self-serving motive. Through cyber-ethnography, qualitative content analysis and semi-structured interviews, data were gathered and analysed from the posts of two purposively selected niche authorities on Facebook and Twitter. The findings indicate that niche authorities maintain their status by being consistent, prompt, informative, resourceful and interactive in their postings on social media platforms. The study also discovered that even though niche authorities are motivated by both public-interest altruism and self-serving interest, the latter had a higher level of motivation than the former.

Keywords: social media, citizen journalists, niche authorities, selective processes theory

Procedia PDF Downloads 57
25769 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which means integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is increasing accordingly. However, each government has its own procedures for publishing its data, which causes a variety of data set formats because there are no international standards specifying the formats of data sets in Open Data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. In the same way, other governments have published Open Data sets related to the environment (such as Andalucia or Bilbao). All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can proceed in a computationally better way. So the tool presented in this work has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open our software tool, as a second approach, we also developed an implementation in the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze huge amounts of data with high performance. There are also R libraries for building a graphic interface, such as Shiny. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer so that they can build their own applications.
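
The core of such an integration step, normalising heterogeneous per-government files into one schema, can be sketched in a few lines of Python (file names and column mappings below are hypothetical, not the project's actual sources):

```python
import pandas as pd

# Hypothetical sources: each government publishes a different format and schema.
SOURCES = [
    ("madrid_air.csv",  {"fecha": "timestamp", "no2": "value"},     pd.read_csv),
    ("bilbao_air.json", {"date": "timestamp", "NO2_ugm3": "value"}, pd.read_json),
]

frames = []
for path, mapping, reader in SOURCES:
    df = reader(path).rename(columns=mapping)[["timestamp", "value"]]
    df["source"] = path.split("_")[0]       # tag rows with their origin
    frames.append(df)

integrated = pd.concat(frames, ignore_index=True)   # one uniform schema
integrated["timestamp"] = pd.to_datetime(integrated["timestamp"])
```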

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 306
25768 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 69
25767 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques are used to reduce data size to improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets using the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
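
The condensation idea at the heart of FCNN can be sketched with a basic condensed-nearest-neighbor pass (ours; the paper's FCNN rule and its MapReduce layer add further machinery):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def condense(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Greedy condensation: add any point the current subset misclassifies."""
    keep = [0]
    changed = True
    while changed:
        changed = False
        knn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
        for i in range(len(X)):
            if i not in keep and knn.predict(X[i:i + 1])[0] != y[i]:
                keep.append(i)
                knn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
                changed = True
    return np.array(keep)

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
subset = condense(X, y)
print(f"kept {len(subset)} of {len(X)} instances")   # large reduction rate
```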

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 249
25766 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world's data is generated by financial services, with an estimated 708.5 billion global non-cash transactions reached between 2018 and … . And with Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders so that they can openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, for analytical purposes and also to support the rapid global adoption of Open Banking. The following research provides a solution framework that aims to offer a secure, decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component of the platform is developed as a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform's key features is enabling the participation and management of personal data by the individuals from whom the data is generated. The framework was demonstrated with a proof of concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and that individual's identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 68
25765 Natural and Construction/Demolition Waste Aggregates: A Comparative Study

Authors: Debora C. Mendes, Matthias Eckert, Claudia S. Moço, Helio Martins, Jean-Pierre Gonçalves, Miguel Oliveira, Jose P. Da Silva

Abstract:

Disposal of construction and demolition waste (C&DW) in embankments on the periphery of cities causes both environmental and social problems. To achieve proper management of C&DW, a detailed analysis of the properties of these materials should be done. In this work, we report a comparative study of the physical, chemical and environmental properties of natural and C&DW aggregates from 25 different origins. Assays were performed according to European Standards. Analyses of heavy metals and organic compounds, namely polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs), were performed. Finally, the properties of concrete prepared with C&DW aggregates are reported. Physical analyses of the C&DW aggregates indicated lower-quality properties than natural aggregates, particularly for concrete preparation and unbound layers of road pavements. Chemical properties showed that most samples (80%) meet the values required by European regulations for concrete and unbound layers of road pavements. Analyses of the heavy metals Cd, Cr, Cu, Pb, Ni, Mo and Zn in the C&DW leachates showed levels below the limits established by the Council Decision of 19 December 2002. Identification and quantification of PCBs and PAHs indicated that few samples show the presence of these compounds, and the measured levels of PCBs and PAHs are also below the limits. Other compounds identified in the C&DW leachates include phthalates and diphenylmethanol. The characterized C&DW aggregates show lower-quality properties than natural aggregates, but most samples were shown to be environmentally safe. Continuous monitoring of the presence of heavy metals and organic compounds should be carried out to qualify safe C&DW aggregates. C&DW aggregates provide a good economic and environmental alternative to natural aggregates.

Keywords: concrete preparation, construction and demolition waste, heavy metals, organic pollutants

Procedia PDF Downloads 352
25764 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. Thus, in these cases, a more advanced representation of these data structures is essential. In this paper we present the design of a compact version of the ternary tree data structure and demonstrate the results of an experimental evaluation using the static dictionary problem. We compare these results with the results for binary and regular ternary trees. The conducted evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We have evaluated the performance of the algorithms on both 32-bit and 64-bit operating systems.
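
For reference, a plain (non-succinct) ternary search tree for the static dictionary problem looks like the sketch below; the paper's contribution is a compact encoding of exactly this kind of structure:

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")   # keep per-node overhead small
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.is_word = ch, None, None, None, False

def insert(node, word, i=0):
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.is_word = True
    return node

def contains(node, word, i=0):
    while node:
        ch = word[i]
        if ch < node.ch:
            node = node.lo
        elif ch > node.ch:
            node = node.hi
        elif i + 1 == len(word):
            return node.is_word
        else:
            node, i = node.eq, i + 1
    return False

root = None
for w in ["cat", "cap", "car", "dog"]:
    root = insert(root, w)
assert contains(root, "cap") and not contains(root, "ca")
```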

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 156
25763 Spatial Cluster Analysis of Human Cases of Crimean Congo Hemorrhagic Fever Reported in Pakistan

Authors: Tariq Abbas, Younus Muhammad, Sayyad Aun Muhammad

Abstract:

Background: Crimean Congo hemorrhagic fever (CCHF) is a tick-borne viral zoonotic disease that has been notified from almost all regions of Pakistan. The aim of this study was to investigate the spatial distribution of CCHF cases reported to the National Institute of Health, Islamabad, during the year 2013. Methods: Spatial statistics tools were applied to detect the extent of spatial autocorrelation and clusters of the disease based on the adjusted cumulative incidence per million population for each district. Results: The data analyses revealed a large multi-district cluster of high values in the uplands of Balochistan province near the Afghanistan border. Conclusion: The cluster included the following districts: Pishin, Qilla Abdullah, Qilla Saifullah, Quetta, Sibi, Zhob, and Ziarat. These districts may be given priority in CCHF surveillance, control programs, and further epidemiological research. The location of the cluster close to the borders of Afghanistan and Iran highlights the importance of the findings for organizations dealing with the disease at national, regional and global levels.
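
Global spatial autocorrelation of district-level incidence is typically quantified with Moran's I, I = (n / ΣΣ w_ij) · (Σ_i Σ_j w_ij z_i z_j) / (Σ_i z_i²), where z_i are deviations from the mean rate. A minimal sketch (ours, with toy data and binary contiguity weights):

```python
import numpy as np

def morans_i(values: np.ndarray, W: np.ndarray) -> float:
    """Global Moran's I for a vector of rates and a spatial weights matrix."""
    z = values - values.mean()
    n = len(values)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# Toy example: 4 districts, binary contiguity (1 = shares a border).
rates = np.array([12.0, 10.5, 2.1, 1.8])   # adjusted cumulative incidence
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(round(morans_i(rates, W), 3))  # > 0 indicates spatial clustering
```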

Keywords: Crimean Congo hemorrhagic fever, Pakistan, spatial autocorrelation, clusters, adjusted cumulative incidence

Procedia PDF Downloads 404
25762 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
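
Quantile-based prediction of this kind can be sketched with gradient boosting's pinball loss, predicting, say, the 90th percentile of demand so that provisioning at the prediction satisfies the SLA most of the time (our illustration with a synthetic trace, not the paper's method or data):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
t = np.arange(2000)
load = 50 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 5, t.size)  # daily cycle

X = (t % 288).reshape(-1, 1)          # time-of-day feature
q90 = GradientBoostingRegressor(loss="quantile", alpha=0.9).fit(X[:1500], load[:1500])

pred = q90.predict(X[1500:])
coverage = np.mean(load[1500:] <= pred)   # should be near 0.9, the SLA target
print(f"demand covered {coverage:.1%} of the time")
```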

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 103
25761 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed in such a way that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text impressed onto paper that provides evidence of its authenticity. Digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing a watermark in images. The main consideration for any watermarking scheme is its robustness to various attacks.
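
A minimal sketch of the hybrid DWT-DCT embedding idea (ours; the paper's exact sub-bands, coefficient positions, and strength are not specified): take a one-level DWT of the host image, apply a DCT to the approximation sub-band, and additively embed the watermark in mid-band coefficients.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

rng = np.random.default_rng(8)
host = rng.uniform(0, 255, (256, 256))        # stand-in for a host image channel
wm = rng.choice([-1.0, 1.0], size=(32, 32))   # bipolar watermark bits
alpha = 5.0                                   # embedding strength

# Embed: DWT -> DCT of the approximation band -> additive mid-band embedding.
LL, details = pywt.dwt2(host, "haar")
C = dctn(LL, norm="ortho")
C[32:64, 32:64] += alpha * wm
marked = pywt.idwt2((idctn(C, norm="ortho"), details), "haar")

# Non-blind extraction: difference against the original's coefficients.
LL2, _ = pywt.dwt2(marked, "haar")
diff = dctn(LL2, norm="ortho") - dctn(LL, norm="ortho")
print((np.sign(diff[32:64, 32:64]) == wm).mean())  # bit recovery rate
```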

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 416