Search results for: coding complexity metric mccabe
2103 Uplink Throughput Prediction in Cellular Mobile Networks
Authors: Engin Eyceyurt, Josko Zec
Abstract:
The current and future cellular mobile communication networks generate enormous amounts of data. Networks have become extremely complex, with an extensive space of parameters, features, and counters. These networks are unmanageable with legacy methods, and an enhanced design and optimization approach, increasingly reliant on machine learning, is necessary. This paper proposes machine learning as a viable approach for uplink throughput prediction. LTE radio metrics, such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal to Noise Ratio (SNR), are used to train models that estimate the expected uplink throughput. A high coefficient of determination of 91.2% is obtained from measurements collected with a simple smartphone application.
Keywords: drive test, LTE, machine learning, uplink throughput prediction
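As an illustration of the modelling idea in the abstract above, here is a minimal sketch: an ordinary least-squares fit of throughput against RSRP/RSRQ/SNR features, reporting the coefficient of determination. The data, coefficients, and noise level are synthetic assumptions, not the authors' drive-test measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic drive-test samples: columns are RSRP (dBm), RSRQ (dB), SNR (dB).
X = np.column_stack([
    rng.uniform(-120, -70, 500),   # RSRP
    rng.uniform(-20, -5, 500),     # RSRQ
    rng.uniform(0, 30, 500),       # SNR
])
# Hypothetical throughput (Mbps) with noise; real data would come from the app.
y = 0.3 * (X[:, 0] + 120) + 0.5 * (X[:, 1] + 20) + 1.2 * X[:, 2] + rng.normal(0, 2, 500)

# Fit a linear model by least squares and report R^2 (coefficient of determination).
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

In practice one would hold out a test set and try nonlinear regressors as well; a linear fit is only the simplest baseline for this feature set.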
Procedia PDF Downloads 158
2102 Application of EEG Wavelet Power to Prediction of Antidepressant Treatment Response
Authors: Dorota Witkowska, Paweł Gosek, Lukasz Swiecicki, Wojciech Jernajczyk, Bruce J. West, Miroslaw Latka
Abstract:
In clinical practice, the selection of an antidepressant often degrades into lengthy trial-and-error. In this work, we employ the normalized wavelet power of alpha waves as a biomarker of antidepressant treatment response. This novel EEG metric takes into account both the non-stationarity and the intersubject variability of alpha waves. We recorded resting, 19-channel EEG (closed eyes) in 22 inpatients suffering from unipolar (UD, n=10) or bipolar (BD, n=12) depression. The EEG measurement was done at the end of the short washout period which followed previously unsuccessful pharmacotherapy. The normalized alpha wavelet power of 11 responders was markedly different from that of 11 nonresponders at several, mostly temporoparietal, sites. Using the prediction of treatment response based on the normalized alpha wavelet power, we achieved 81.8% sensitivity and 81.8% specificity for channel T4.
Keywords: alpha waves, antidepressant, treatment outcome, wavelet
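A rough sketch of the kind of metric described above, assuming a complex Morlet wavelet and a simple normalization (alpha-band power as a fraction of broadband 2-30 Hz power); the sampling rate, wavelet parameters, and signal are illustrative, not the study's recordings or its exact normalization.

```python
import numpy as np

fs = 250  # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
# Synthetic EEG: a 10 Hz alpha rhythm plus broadband noise.
rng = np.random.default_rng(1)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

def morlet_power(sig, fs, freq, n_cycles=7):
    """Mean power of sig at freq via convolution with a complex Morlet wavelet."""
    sigma = n_cycles / (2 * np.pi * freq)
    wt = np.arange(-4 * sigma, 4 * sigma, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * wt) * np.exp(-wt**2 / (2 * sigma**2))
    wavelet /= np.abs(wavelet).sum()
    conv = np.convolve(sig, wavelet, mode="same")
    return np.abs(conv) ** 2

freqs = np.arange(2, 30)
power = np.array([morlet_power(sig, fs, f).mean() for f in freqs])
# Normalize: alpha-band (8-13 Hz) power as a fraction of total 2-30 Hz power.
alpha = power[(freqs >= 8) & (freqs <= 13)].sum() / power.sum()
print(f"normalized alpha power = {alpha:.2f}")
```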
Procedia PDF Downloads 316
2101 An Audit of the Diagnosis of Asthma in Children in Primary Care and the Emergency Department
Authors: Abhishek Oswal
Abstract:
Background: Inconsistencies between the guidelines for childhood asthma can pose a diagnostic challenge to clinicians. NICE guidelines are the most commonly followed guidelines in primary care in the UK; they state that to be diagnosed with asthma, a child must be more than 5 years old and must have objective evidence of the disease. When diagnoses are coded in general practice (GP), these guidelines may be superseded by communications from secondary care. Hence it is imperative that diagnoses are correct, as per up-to-date guidelines and evidence, as this affects follow-up and management both in primary and secondary care. Methods: A snapshot audit at a general practice surgery was undertaken of children (less than 16 years old) with a coded diagnosis of 'asthma', to review the age at diagnosis and whether any objective evidence of asthma was documented at diagnosis. 50 cases of asthma in children presenting to the emergency department (ED) were then audited to review the age at presentation, whether there was evidence of a previous asthma diagnosis, and whether the patient was discharged from ED. A repeat audit is planned in ED this winter. Results: In a GP surgery, there were 83 coded cases of asthma in children. 51 children (61%) were diagnosed under 5, and 9 children (11%) had objective evidence of asthma documented at diagnosis. In ED, 50 cases were collected, of which 4 were excluded because they were referred to other services or were incorrectly coded. Of the 46 remaining, 27 diagnoses (59%) conformed to NICE guidelines. 33 children (72%) were discharged from ED. Discussion: The most likely reason for the apparent low rate of correct diagnosis is the significant challenge of obtaining objective evidence of asthma in children. There were a number of patients who were diagnosed by secondary care services and then coded as 'asthma' in GP, without documented objective evidence.
The electronic patient record (EPR) system used in our emergency department (ED) did not allow coding of 'suspected diagnosis' or of 'viral induced wheeze'. This may have led to incorrect diagnoses coded in primary care for children who had no confirmed diagnosis of asthma. We look forward to the re-audit, as the EPR system has been updated to allow suspected diagnoses. In contrast to the NICE guidelines used here, British Thoracic Society (BTS) guidelines allow for a trial of treatment and subsequent confirmation of diagnosis without objective evidence. It is possible that some of the cases classified as incorrect in this audit may still meet other guidelines. Conclusion: The diagnosis of asthma in children is challenging. Incorrect diagnoses may be related to clinical pressures and the provision of services to allow compliance with NICE guidelines. Consensus statements between the various groups would also aid the decision-making process and the diagnostic dilemmas that clinicians face, allowing more consistent care of the patient.
Keywords: asthma, diagnosis, primary care, emergency department, guidelines, audit
Procedia PDF Downloads 144
2100 Awareness for Air Pollution Impacts on Lung Cancer in Southern California: A Pilot Study for Designed Smartphone Application
Authors: M. Mohammed Raoof, A. Enkhtaivan, H. Aljuaid
Abstract:
This study follows the design science research methodology to design and implement a smartphone application artifact. The developed artifact was evaluated in three phases, using the System Usability Scale (SUS) metric. The designed artifact aims to spread awareness about reducing air pollution, decreasing lung cancer development, and checking the air quality status in Southern California counties. Participants were recruited for a pilot study to facilitate awareness of air pollution. The study found that smartphone applications have a beneficial effect on the study’s aims.
Keywords: air pollution, design science research, indoor air pollution, lung cancer, outdoor air pollution, smartphone application
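The SUS metric mentioned above has a standard scoring rule (odd items contribute score − 1, even items contribute 5 − score, and the sum is scaled by 2.5 to a 0-100 range), which can be sketched as follows; the example responses are hypothetical.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered ones negatively.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical participant: agrees with all positive items, disagrees with all negative ones.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

A score around 68 is conventionally treated as average usability, which gives the 0-100 value some interpretive context.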
Procedia PDF Downloads 120
2099 Copy Number Variants in Children with Non-Syndromic Congenital Heart Diseases from Mexico
Authors: Maria Lopez-Ibarra, Ana Velazquez-Wong, Lucelli Yañez-Gutierrez, Maria Araujo-Solis, Fabio Salamanca-Gomez, Alfonso Mendez-Tenorio, Haydeé Rosas-Vargas
Abstract:
Congenital heart diseases (CHD) are the most common congenital abnormalities. These conditions can occur either as an element of distinct chromosomal malformation syndromes or as non-syndromic forms. Their etiology is not fully understood. Genetic variants such as copy number variants have been associated with CHD. The aim of our study was to analyze these genomic variants in peripheral blood from Mexican children diagnosed with non-syndromic CHD. We included 16 children with atrial and ventricular septal defects and 5 healthy subjects without heart malformations as controls. To exclude the most common heart disease-associated syndromic alteration, we performed a fluorescence in situ hybridization test for the 22q11.2 deletion, responsible for the congenital heart abnormalities associated with DiGeorge syndrome. Then, microarray-based comparative genomic hybridization was used to identify global copy number variants, by comparing our results against data from the main genetic variation databases. We identified copy number variant gains in three chromosomal regions from pediatric patients, 4q13.2 (31.25%), 9q34.3 (25%), and 20q13.33 (50%), where several genes associated with cellular, biosynthetic, and metabolic processes are located: UGT2B15, UGT2B17, SNAPC4, SDCCAG3, PMPCA, INPP6E, C9orf163, NOTCH1, C20orf166, and SLCO4A1. In addition, after a hierarchical cluster analysis based on the fluorescence intensity ratios from the comparative genomic hybridization, two congenital heart disease groups were generated, corresponding to children with atrial or ventricular septal defects. Further analysis with a larger sample size is needed to corroborate these copy number variants as possible biomarkers to differentiate between heart abnormalities.
Interestingly, the 20q13.33 gain was present in 50% of children with these CHD, which could suggest that alterations in both coding and non-coding elements within this chromosomal region play an important role in distinct heart conditions.
Keywords: aCGH, bioinformatics, congenital heart diseases, copy number variants, fluorescence in situ hybridization
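To illustrate the kind of call made from comparative genomic hybridization intensity ratios, here is a minimal sketch of threshold-based gain/loss calling on log2 ratios; the probe ratios and the 0.3 threshold are illustrative assumptions, not the study's data.

```python
import math

# Hypothetical fluorescence intensity ratios (test/reference) for a few probes.
probes = {
    "4q13.2": 1.45,
    "9q34.3": 1.38,
    "20q13.33": 1.52,
    "1p36.33": 1.02,
    "22q11.2": 0.66,
}

def call_cnv(ratio, threshold=0.3):
    """Call gain/loss from the log2 ratio using a symmetric threshold (commonly ~0.3)."""
    log2r = math.log2(ratio)
    if log2r > threshold:
        return "gain"
    if log2r < -threshold:
        return "loss"
    return "normal"

for region, ratio in probes.items():
    print(region, call_cnv(ratio))
```

Real aCGH pipelines smooth and segment ratios across many consecutive probes before calling; a single-probe threshold is only the core idea.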
Procedia PDF Downloads 293
2098 Performance of High Efficiency Video Codec over Wireless Channels
Authors: Mohd Ayyub Khan, Nadeem Akhtar
Abstract:
Due to recent advances in wireless communication technologies and hand-held devices, there is a huge demand for video-based applications such as video surveillance, video conferencing, remote surgery, Digital Video Broadcast (DVB), IPTV, online learning courses, YouTube, WhatsApp, Instagram, Facebook, and interactive video games. However, raw video possesses very high bandwidth, which makes compression a must before its transmission over wireless channels. The High Efficiency Video Codec (HEVC), also called H.265, is the latest state-of-the-art video coding standard, developed by the joint effort of the ITU-T and ISO/IEC teams. HEVC is targeted at high-resolution videos, such as 4K or 8K, that can fulfil the recent demand for video services. The compression ratio achieved by HEVC is twice that of its predecessor H.264/AVC at the same quality level. Compression efficiency is generally increased by removing more correlation between frames/pixels using complex techniques such as extensive intra and inter prediction. As more correlation is removed, the interdependency among coded bits increases. Thus, bit errors may have a large effect on the reconstructed video; sometimes even a single bit error can lead to catastrophic failure of the reconstructed video. In this paper, we study the performance of the HEVC bitstream over an additive white Gaussian noise (AWGN) channel. Moreover, HEVC over Quadrature Amplitude Modulation (QAM) combined with forward error correction (FEC) schemes is also explored over the noisy channel. The video is encoded using HEVC, and the coded bitstream is channel coded to provide some redundancy. The channel-coded bitstream is then modulated using QAM and transmitted over the AWGN channel. At the receiver, the symbols are demodulated and channel decoded to obtain the video bitstream, which is then used to reconstruct the video with the HEVC decoder.
It is observed that as the signal-to-noise ratio of the channel decreases, the quality of the reconstructed video degrades drastically. Using proper FEC codes, the quality of the video can be restored to a certain extent. Thus, the performance analysis of HEVC presented in this paper may assist in designing the optimized FEC code rate such that the quality of the reconstructed video is maximized over wireless channels.
Keywords: AWGN, forward error correction, HEVC, video coding, QAM
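The transmission chain described above can be sketched end to end with toy stand-ins: random bits in place of an HEVC bitstream, a rate-1/3 repetition code in place of a practical FEC, QPSK (4-QAM) over AWGN, and majority-vote decoding. HEVC itself is not simulated; the sketch only demonstrates that channel coding lowers the post-decoding bit error rate.

```python
import numpy as np

rng = np.random.default_rng(42)
bits = rng.integers(0, 2, 30000)

# Toy FEC: rate-1/3 repetition code (real systems use convolutional/LDPC codes).
coded = np.repeat(bits, 3)

# QPSK (4-QAM): bit 0 -> +1, bit 1 -> -1 on each axis, unit average symbol energy.
i = 1 - 2 * coded[0::2]
q = 1 - 2 * coded[1::2]
symbols = (i + 1j * q) / np.sqrt(2)

# AWGN channel at a chosen Es/N0 (illustrative operating point).
esn0_db = 4.0
n0 = 10 ** (-esn0_db / 10)
noise = np.sqrt(n0 / 2) * (rng.standard_normal(symbols.size)
                           + 1j * rng.standard_normal(symbols.size))
rx = symbols + noise

# Hard-decision demodulation, then majority-vote decoding of each triplet.
rx_bits = np.empty(coded.size, dtype=int)
rx_bits[0::2] = (rx.real < 0).astype(int)
rx_bits[1::2] = (rx.imag < 0).astype(int)
decoded = (rx_bits.reshape(-1, 3).sum(axis=1) >= 2).astype(int)

raw_ber = np.mean(rx_bits != coded)
ber = np.mean(decoded != bits)
print(f"raw BER = {raw_ber:.4f}, decoded BER = {ber:.4f}")
```

Sweeping `esn0_db` reproduces the qualitative finding: below some SNR even the coded stream fails, while FEC buys back several dB of margin.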
Procedia PDF Downloads 149
2097 Virtual Reality Design Platform to Easily Create Virtual Reality Experiences
Authors: J. Casteleiro-Pitrez
Abstract:
The interest in Virtual Reality (VR) keeps increasing among the community of designers. To develop this type of immersive experience, understanding new processes and methodologies is as fundamental as the complex implementation, which usually implies hiring a specialized team. In this paper, we introduce a case study: a platform that allows designers to easily create complex VR experiences. We present its features and its development process, and conclude that the platform provides a complete solution for the design and development of VR experiences, with no code needed.
Keywords: creatives, designers, virtual reality, virtual reality design platform, virtual reality system, no-coding
Procedia PDF Downloads 158
2096 A Method of the Semantic on Image Auto-Annotation
Authors: Lin Huo, Xianwei Liu, Jingxiong Zhou
Abstract:
Recently, due to the semantic gap between image visual features and human concepts, semantic image auto-annotation has become an important topic. First, low-level visual features are extracted from the image and mapped, using a hashing method, into hash codes, which are then transformed into binary strings and stored. Since annotation by search is a popular approach, we use it to design and implement a method of semantic image auto-annotation. Finally, tests on the Corel image set show that the method is effective.
Keywords: image auto-annotation, color correlograms, Hash code, image retrieval
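A minimal sketch of the search-based annotation idea, assuming random-projection (LSH-style) hashing of feature vectors and label transfer from the Hamming-nearest database image; the features, labels, and hash length are synthetic assumptions, not the paper's method details.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical low-level features (e.g., color descriptors) for a labeled database.
db_features = rng.standard_normal((100, 64))
db_labels = [f"label_{k % 5}" for k in range(100)]

# Random-projection hashing: the sign of each projection gives one bit of a 32-bit code.
planes = rng.standard_normal((64, 32))
def hash_code(x):
    return (x @ planes > 0).astype(np.uint8)

db_codes = hash_code(db_features)

def annotate(query_features):
    """Annotate by transferring the label of the Hamming-nearest database image."""
    qc = hash_code(query_features)
    dists = np.count_nonzero(db_codes != qc, axis=1)
    return db_labels[int(np.argmin(dists))]

# A query close to database image 0 should retrieve a matching annotation.
query = db_features[0] + 0.01 * rng.standard_normal(64)
print(annotate(query))
```

Because nearby feature vectors rarely flip projection signs, the query's hash stays within a small Hamming distance of its neighbor's code, which is what makes binary-code search fast and effective.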
Procedia PDF Downloads 497
2095 Identification and Molecular Profiling of A Family I Cystatin Homologue from Sebastes schlegeli Deciphering Its Putative Role in Host Immunity
Authors: Don Anushka Sandaruwan Elvitigala, P. D. S. U. Wickramasinghe, Jehee Lee
Abstract:
Cystatins are a large superfamily of proteins that act as reversible inhibitors of cysteine proteases. Papain proteases and cysteine cathepsins are the predominant substrates of cystatins. The cystatin superfamily can be further clustered into three groups: stefins, cystatins, and kininogens. Among them, the stefins, also known as family 1 cystatins, harbor the cystatin Bs and cystatin As. In this study, a homologue of family 1 cystatins, more closely related to the cystatin Bs, was identified from Korean black rockfish (Sebastes schlegeli) using a previously constructed cDNA (complementary deoxyribonucleic acid) database and designated RfCyt1. The full-length cDNA of RfCyt1 consisted of 573 bp, with a coding region of 294 bp, a 5´-untranslated region (UTR) of 55 bp, and a 3´-UTR of 263 bp. The coding sequence encodes a polypeptide of 97 amino acids with a predicted molecular weight of 11 kDa and a theoretical isoelectric point of 6.3. RfCyt1 shared homology with other teleost and vertebrate species and contained the conserved features of the cystatin family signature, including a single cystatin-like domain, the cysteine protease inhibitory pentapeptide (QXVXG) consensus sequence, and two conserved neighboring N-terminal glycine (⁸GG⁹) residues. As expected, a phylogenetic reconstruction developed using the neighbor-joining method showed that RfCyt1 clusters with the cystatin family 1 members, most closely with its teleostean orthologues. A SYBR Green qPCR (quantitative polymerase chain reaction) assay was performed to quantify RfCyt1 transcripts in different tissues of healthy and immune-stimulated fish. RfCyt1 was ubiquitously expressed in all tissue types of healthy animals, with gill and spleen being the highest. Temporal expression of RfCyt1 displayed significant up-regulation upon infection with Aeromonas salmonicida. Recombinantly expressed RfCyt1 showed concentration-dependent papain inhibitory activity.
Collectively, these findings provide evidence for the detectable protease inhibitory and immunity-relevant roles of RfCyt1 in Sebastes schlegeli.
Keywords: Sebastes schlegeli, family 1 cystatin, immune stimulation, expressional modulation
Procedia PDF Downloads 136
2094 Performance Analysis of SAC-OCDMA System using Different Detectors
Authors: Somaya A. Abd El Mottaleb, Ahmed Abd El Aziz, Heba A. Fayed, Moustafa H. Aly
Abstract:
In this paper, we present the performance of spectral amplitude coding optical code division multiple access (SAC-OCDMA) using different detectors at different transmission distances, with the single photodiode detection technique. Modified double weight codes are used as signature codes. Simulation results show that the system using an avalanche photodetector can reach longer distances than the one using a positive intrinsic negative (PIN) photodetector.
Keywords: avalanche photodiode, modified double weight, multiple access technique, single photodiode
Procedia PDF Downloads 606
2093 A New Categorization of Image Quality Metrics Based on a Model of Human Quality Perception
Authors: Maria Grazia Albanesi, Riccardo Amadeo
Abstract:
This study presents a new model of the human image quality assessment process. The aim is to highlight the foundations of the image quality metrics proposed in the literature by identifying the cognitive/physiological or mathematical principles of their development and their relation to the actual human quality assessment process. The model allows the creation of a novel categorization of objective and subjective image quality metrics. Our work includes an overview of the most used or most effective objective metrics in the literature and, for each of them, we underline its main characteristics with reference to the rationale of the proposed model and categorization. From the results of this operation, we underline a problem that affects all the presented metrics: many aspects of human biases are not taken into account at all. We then propose a possible methodology to address this issue.
Keywords: eye-tracking, image quality assessment metric, MOS, quality of user experience, visual perception
Procedia PDF Downloads 413
2092 Problems in Computational Phylogenetics: The Germano-Italo-Celtic Clade
Authors: Laura Mclean
Abstract:
A recurring point of interest in computational phylogenetic analysis of Indo-European family trees is the inference of a Germano-Italo-Celtic clade in some versions of the trees produced. The presence of this clade in the models is intriguing, as there is little evidence for innovations shared among Germanic, Italic, and Celtic, the evidence generally used in the traditional method to construct a subgroup. One source of this unexpected outcome could be the input to the models. The datasets in the various models used so far, for the most part, take as their basis the Swadesh list, a list compiled by Morris Swadesh and then revised several times, containing up to 207 words that he believed were resistant to change among languages. The judgments made by Swadesh for this list, however, were subjective and based on his intuition rather than rigorous analysis. Some scholars used the Swadesh 200 list as the basis for their Indo-European dataset and made cognacy judgements for each of the words on the list. Another dataset is also largely based on the Swadesh 207 list, although the authors include additional lexical and non-lexical data and implement ‘split coding’ to deal with cases of polymorphic characters. A different team of scholars uses a different dataset, IECoR, which combines several different lists, one of which is the Swadesh 200 list. In fact, the Swadesh list is used in some form in every study surveyed, and each dataset has three words that, when coded as cognates, seemingly contribute to the inference of a Germano-Italo-Celtic clade, since these three branches share the three words among only themselves. These three words are ‘fish’, ‘flower’, and ‘man’ (in the case of ‘man’, one dataset includes Lithuanian in the cognacy coding and removes the word ‘man’ from the screened data).
This collection of cognates shared among Germanic, Italic, and Celtic that were deemed important enough to be included on the Swadesh list, without the ability to account for possible reasons for shared cognates that are not shared innovations, gives an impression of affinity between the Germanic, Celtic, and Italic branches without adequate methodological support. However, by changing how cognacy is defined (i.e., root cognates, borrowings vs. inherited cognates, etc.), we will be able to identify whether these three cognates are significant enough to infer a clade for Germanic, Celtic, and Italic. This paper examines the question of what definition of cognacy should be used for phylogenetic datasets by examining the Germano-Italo-Celtic clade as a case study and offers insights into the reconstruction of a Germano-Italo-Celtic clade.
Keywords: historical, computational, Italo-Celtic, Germanic
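The effect of cognacy coding on inferred affinity can be sketched with a toy distance computation over binary cognate-class codings; the codings below are invented for illustration and are not real judgments from any of the datasets discussed.

```python
# Hypothetical cognate-class codings for four meanings across four branches.
# Each value identifies the cognate class a branch's word belongs to (invented data).
cognacy = {
    "fish":   {"Germanic": 1, "Italic": 1, "Celtic": 1, "Balto-Slavic": 2},
    "flower": {"Germanic": 1, "Italic": 1, "Celtic": 1, "Balto-Slavic": 2},
    "man":    {"Germanic": 1, "Italic": 1, "Celtic": 1, "Balto-Slavic": 2},
    "water":  {"Germanic": 1, "Italic": 2, "Celtic": 3, "Balto-Slavic": 1},
}

def distance(a, b):
    """Fraction of meanings where two branches fall in different cognate classes."""
    shared = sum(1 for m in cognacy if cognacy[m][a] == cognacy[m][b])
    return 1 - shared / len(cognacy)

branches = ["Germanic", "Italic", "Celtic", "Balto-Slavic"]
for i, a in enumerate(branches):
    for b in branches[i + 1:]:
        print(f"{a}-{b}: {distance(a, b):.2f}")
```

Even in this tiny matrix, the three shared codings alone pull Germanic, Italic, and Celtic together, which is exactly the kind of artifact the paper attributes to the Swadesh-based datasets: recoding those three items (e.g., as borrowings rather than inherited cognates) would dissolve the apparent clade.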
Procedia PDF Downloads 51
2091 Understanding the Strategies Underpinning the Marketing of E-Cigarettes: A Content Analysis of Video Advertisements
Authors: Laura Struik, Sarah Dow-Fleisner, Robert Janke
Abstract:
Introduction: The use of e-cigarettes, also known as vaping, has risen exponentially among North American youth and young adults (YYA) in recent years and has become a critical public health concern. The marketing strategies used by e-cigarette companies have been associated with the uptick in use among YYA, with video advertisements on TV and other electronic platforms being the most pervasive strategy. It is unknown if or how these advertisements capitalize on the recently documented multi-faceted influences that contribute to the initiation of vaping among this demographic (e.g., stress, anxiety, gender, peers, etc.), which is examined in this study. Methods: This content analysis is phase one of a two-phased research project that aims to inform meaningful approaches to anti-vaping messaging and campaigns. As part of this first phase, a scoping review has been conducted to identify various influences (environmental, cognitive, contextual, social, and emotional) on e-cigarette uptake among YYA. The results of this scoping review will inform the development of a coding framework to analyze the multiple influences present in vaping advertisements, as seen on two popular television channels (Discovery and AMC). In addition, advertisement characteristics will be incorporated into the coding framework (e.g., the number of people present, demographic details, context, and setting, etc.), and analyzed. Findings: Findings will reveal the types of influences being leveraged in vaping advertisements, and identify the underlying messages that may be particularly attractive to YYA. This will contribute to a more nuanced understanding of how e-cigarette companies market their products and to whom. The results will also inform the next phase of this research project, which will encompass an analysis of anti-vaping advertisements and how the underpinning strategies align with those of the pro-vaping advertisements. 
Conclusions: Findings of this study will bring forward important implications for developing effective anti-vaping messages and assist public health professionals in providing more comprehensive prevention and cessation support as it relates to e-cigarette use. Understanding which marketing strategies e-cigarette companies use is vital to our understanding of how to combat them. Findings will inform recommendations for public health efforts aimed at curbing e-cigarette use among YYA and ultimately contribute to the health and well-being of YYA.
Keywords: e-cigarettes, youth and young adults, advertisements, public health
Procedia PDF Downloads 122
2090 Software Quality Measurement System for Telecommunication Industry in Malaysia
Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan
Abstract:
The evolution of software quality measurement began when McCall introduced his quality model in 1977. Since then, several software quality models and software quality measurement methods have emerged, but none of them focused on the telecommunication industry. In this paper, a software quality measurement system for the telecommunication industry is implemented to accommodate the industry's rapid growth. The quality value of telecommunication-related software can be calculated using this system by entering the required parameters. The system calculates the quality value of the measured software based on predefined quality metrics, aggregated by referring to the quality model, and classifies the quality level of the software based on a Net Satisfaction Index (NSI). Thus, a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.
Keywords: software quality, quality measurement, quality model, quality metric, net satisfaction index
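A minimal sketch of the aggregation step described above, assuming hypothetical metric scores, weights, and NSI bands (the actual quality model and NSI thresholds are not specified in the abstract):

```python
# Hypothetical quality metrics (0-100) and model weights (illustrative only).
metrics = {"reliability": 82, "usability": 74, "efficiency": 90, "maintainability": 68}
weights = {"reliability": 0.35, "usability": 0.25, "efficiency": 0.2, "maintainability": 0.2}

def quality_value(metrics, weights):
    """Aggregate metric scores into one quality value via a weighted average."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(metrics[m] * weights[m] for m in metrics)

def nsi_level(score):
    """Map the aggregated score to an illustrative Net Satisfaction Index band."""
    if score >= 80:
        return "high"
    if score >= 60:
        return "medium"
    return "low"

q = quality_value(metrics, weights)
print(f"quality = {q:.1f} ({nsi_level(q)})")
```

A real quality model would derive the weights from the model's factor/criteria hierarchy rather than assign them ad hoc.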
Procedia PDF Downloads 593
2089 Genome Analyses of Pseudomonas Fluorescens b29b from Coastal Kerala
Authors: Wael Ali Mohammed Hadi
Abstract:
Pseudomonas fluorescens B29B, which has asparaginase enzymatic activity, was isolated from the surface coastal seawater of Trivandrum, India. We report the complete genome of Pseudomonas fluorescens B29B, sequenced, identified, and annotated from a marine source. The genome comprises a single circular chromosome of 7,331,508 bp with a GC content of 62.19% and 6,883 protein-coding genes. Three hundred forty subsystems were identified, including two predicted asparaginases, from the genome analysis of P. fluorescens B29B for further investigation. This genome data will help further industrial biotechnology applications of proteins in general and asparaginase as a target.
Keywords: pseudomonas, marine, asparaginases, Kerala, whole-genome
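As a small illustration of one reported statistic, GC content can be computed directly from a sequence; the sequence below is a toy example, not the 7,331,508 bp B29B chromosome.

```python
def gc_content(seq):
    """Percentage of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return 100 * (seq.count("G") + seq.count("C")) / len(seq)

# Toy sequence; the real chromosome reportedly has 62.19% GC over 7,331,508 bp.
print(f"{gc_content('ATGCGGCCATGC'):.2f}")  # → 66.67
```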
Procedia PDF Downloads 215
2088 Arterial Line Use for Acute Type 2 Respiratory Failure
Authors: C. Scurr, J. Jeans, S. Srivastava
Abstract:
Introduction: Acute type 2 respiratory failure (T2RF) has become a common presentation over the last two decades, primarily due to an increase in the prevalence of chronic lung disease. Acute exacerbations can be managed either medically or in combination with non-invasive ventilation (NIV), which should be monitored with regular arterial blood gas (ABG) samples. Arterial lines allow more frequent arterial blood sampling with less patient discomfort. We present the experience from a teaching hospital emergency department (ED) and a level 2 medical high-dependency unit (HDU) that together form the pathway for management of acute type 2 respiratory failure. Methods: Patients acutely presenting to Charing Cross Hospital, London, with T2RF requiring non-invasive ventilation (NIV) over 14 months (2011 to 2012) were identified from clinical coding. Retrospective data collection included: demographics, co-morbidities, blood gas numbers and timing, whether arterial lines were used, and who performed the insertion. Analysis was undertaken using Microsoft Excel. Results: Coding identified 107 possible patients. 69 notes were available, of which 41 required NIV for type 2 respiratory failure. 53.6% of patients had an arterial line inserted. Patients with arterial lines had on average 22.4 ABGs in total, compared to 8.2 for those without. These patients had a similar average time to normalizing pH (23.7 hours with an arterial line vs. 25.6 hours without), and no statistically significant difference in mortality. Arterial lines were inserted by Foundation year doctors, Core trainees, and Medical registrars, as well as the ICU registrar. 63% of these were performed by the medical registrar rather than ICU, ED, or a junior doctor. This is reflected in the average time until an arterial line was inserted: 462 minutes. The average number of ABGs taken before an arterial line was inserted was 2, with a range of 0 to 6.
The average number of gases taken if no arterial line was ever used was 7.79 (range 2 to 34), on average 4 times as many arterial punctures for each patient. Discussion: Arterial line use was associated with more frequent arterial blood sampling during each inpatient admission. Additionally, patients with an arterial line have fewer individual arterial punctures in total, which is likely more comfortable for the patient. Arterial lines are normally sited by medical registrars; however, this is normally after some delay. ED clinicians could improve patient comfort and monitoring, thus allowing faster titration of NIV, if arterial lines were regularly inserted in the ED. We recommend that ED doctors insert arterial lines when indicated in order to improve the patient experience and facilitate medical management.
Keywords: non invasive ventilation, arterial blood gas, acute type, arterial line
Procedia PDF Downloads 429
2087 Reconceptualizing Bioeconomy: From the Hegemonic Vision to Diverse Economies and Economies-others for Life – Advocating for a Resilient and Just Future in Colombia
Authors: Alexander Rincón Ruiz
Abstract:
This article is based on an exhaustive review and interdisciplinary effort spanning three years. It involved interviews, dialogues, discussion panels, and collective work on various visions of bioeconomy in Colombia. The dialogue included government institutions, universities, local communities, activist groups, research institutes, the productive sector, and politicians, integrating perspectives such as Latin American environmental thought, complexity theory, modern visions, local worldviews (Afro-Colombian, indigenous, peasant), decoloniality, political ecology, ecological economics, and environmental economies. This work highlighted the need to redefine the traditional bioeconomy concept, typically focused on markets and biotechnology, and to revisit the original idea of a bioeconomy as an ‘economy for life’. In a country as diverse as Colombia, both biophysically and in its varied relationships with the territory, this redefinition is crucial. It emphasizes alternative logics of well-being related to resilience, care, and cooperation, reflecting Indigenous, Afro-Colombian, and peasant worldviews. This article is significant for proposing, for the first time, a viable approach to diverse and alternative economies for life tailored to the Colombian context. It represents not only academic work but also a political commitment to inclusion and plurality, aligning with the Colombian context and potentially extendable to other regions.
Keywords: ecological economics, decoloniality, complexity, biodiversity
Procedia PDF Downloads 38
2086 Classification of Cosmological Wormhole Solutions in the Framework of General Relativity
Authors: Usamah Al-Ali
Abstract:
We explore the effect of expanding space on the exoticity of the matter supporting a traversable Lorentzian wormhole of zero radial tide, whose line element is given by ds² = dt² − a²(t)[dr²/(1 − kr² − b(r)/r) + r²dΩ²], in the context of General Relativity. This task is achieved by deriving the Einstein field equations for an anisotropic matter field corresponding to the considered cosmological wormhole metric and performing a classification of their solutions on the basis of a variable equation of state (EoS) of the form p = ω(r)ρ. Explicit forms of the shape function b(r) and the scale factor a(t) arising in the classification are utilized to construct the corresponding energy-momentum tensor, for which the energy conditions in each case are investigated. While the violation of energy conditions is inevitable in the case of static wormholes, the classification we performed leads to interesting solutions in which this violation is either reduced or eliminated.
Keywords: general relativity, Einstein field equations, energy conditions, cosmological wormhole
Procedia PDF Downloads 63
2085 Reducing Later Life Loneliness: A Systematic Literature Review of Loneliness Interventions
Authors: Dhruv Sharma, Lynne Blair, Stephen Clune
Abstract:
Later life loneliness is a social issue that is increasing alongside an upward global population trend. As a society, one way that we have responded to this social challenge is through developing non-pharmacological interventions such as befriending services, activity clubs, and meet-ups. Through a systematic literature review, this paper suggests that there is currently an underrepresentation of radical innovation, and an underutilization of digital technologies, in developing loneliness interventions for older adults. This paper examines intervention studies that were published in English, within peer-reviewed journals, between January 2005 and December 2014, across 4 electronic databases. In addition to academic databases, interventions found in grey literature in the form of websites, blogs, and Twitter were also included in the overall review. This approach yielded 129 interventions that were included in the study. A systematic approach allowed the minimization of any bias dictating the selection of interventions to study. A coding strategy based on a pattern analysis approach was devised to compare and contrast the loneliness interventions. Firstly, interventions were categorized on the basis of their objective, to identify whether they were preventative, supportive, or remedial in nature. Secondly, depending on their scope, they were categorized as one-to-one, community-based, or group-based. It was also ascertained whether interventions represented an improvement, an incremental innovation, a major advance, or a radical departure in comparison to the most basic form of a loneliness intervention. Finally, interventions were also assessed on the basis of the extent to which they utilized digital technologies. Individual visualizations representing the four levels of coding were created for each intervention, followed by an aggregated visual to facilitate analysis.
To keep the inquiry within scope and to present a coherent view of the findings, the analysis was primarily concerned with the level of innovation and the use of digital technologies. This analysis highlights a weak but positive correlation between the level of innovation and the use of digital technologies in designing and deploying loneliness interventions, and also emphasizes how certain existing interventions could be tweaked, for example, to move from incremental to radical innovation. The analysis also points out the value of including grey literature, especially from Twitter, in systematic literature reviews to obtain a contemporary view of the latest work in the area under investigation. Keywords: ageing, loneliness, innovation, digital
Procedia PDF Downloads 123

2084 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain
Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma
Abstract:
In this paper, we propose an implementation of an optimization-based Extreme Learning Machine (ELM) for watermarking the B channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, is based on generalized single-hidden-layer feed-forward neural networks (SLFNs); its hidden-layer parameters, generally called the feature mapping in the context of ELM, need not be tuned every time. This paper shows the embedding and extraction processes of the watermark with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and DWT is applied to each block to transform it into the low-frequency sub-band domain. Essentially, ELM provides a unified learning platform with a feature mapping, that is, a mapping between the hidden layer and the output layer of SLFNs, which is exploited for watermark embedding and extraction in a cover image. ELM has widespread applications ranging from binary and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance at very small complexity. The efficacy of the optimization-based ELM algorithm is measured using quantitative and qualitative parameters on the watermarked image, even when the image is subjected to different types of geometric and conventional attacks. Keywords: BER, DWT, extreme learning machine (ELM), PSNR
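As an illustration of the block-wise DWT embedding described above, the sketch below splits a stand-in blue channel into non-overlapping blocks, takes a one-level Haar DWT of each, and hides one bit per block by quantizing the mean of the low-frequency (LL) sub-band. The Haar filter, block size, and quantization step are assumptions made for the sketch; the paper's actual scheme additionally trains an ELM to learn the embedding and extraction mapping, which is not reproduced here.

```python
import numpy as np

def haar_dwt2(block):
    """One level of the 2-D Haar DWT; returns the LL, LH, HL, HH sub-bands."""
    a = (block[0::2, :] + block[1::2, :]) / 2.0   # vertical averages
    d = (block[0::2, :] - block[1::2, :]) / 2.0   # vertical details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hl = (a[:, 0::2] - a[:, 1::2]) / 2.0
    lh = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def embed_bit(ll, bit, delta=8.0):
    """Shift the LL sub-band so its mean mod delta encodes one watermark bit."""
    m = ll.mean()
    target = np.floor(m / delta) * delta + (0.75 if bit else 0.25) * delta
    return ll + (target - m)

def extract_bit(ll, delta=8.0):
    return 1 if (ll.mean() % delta) / delta > 0.5 else 0

rng = np.random.default_rng(1)
channel = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in B channel
bits = [1, 0, 1, 1, 0, 0, 1, 0]

# split the channel into non-overlapping 16x16 blocks; use one block per bit
blocks = [channel[r:r + 16, c:c + 16]
          for r in range(0, 64, 16) for c in range(0, 64, 16)][:len(bits)]
marked_ll = [embed_bit(haar_dwt2(b)[0], bit) for b, bit in zip(blocks, bits)]
recovered = [extract_bit(ll) for ll in marked_ll]
```

In a full scheme the modified LL band would be inverse-transformed back into the watermarked block before extraction; the sketch only demonstrates the sub-band quantization step.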
Procedia PDF Downloads 313

2083 Developing the Collaboration Model of Physical Education and Sport Sciences Faculties with Service Section of Sport Industrial
Authors: Vahid Saatchian, Seyyed Farideh Hadavi
Abstract:
The main aim of this study was to develop a collaboration model between physical education and sport sciences faculties and the service sector of the sport industry. The research method was qualitative. The researcher identified a priority list of areas of collaboration between colleges and the service sector of the sport industry and, using purposive and snowball sampling, conducted in-depth interviews with 22 experts working in the field of the research topic. The interviews were analyzed through qualitative coding (open, axial, and selective) into five categories: causal conditions, basic conditions, intervening conditions, action/interaction, and strategies. The findings showed that ten labels appeared under causal conditions; because of the heterogeneity of these labels, the researcher grouped them under one overall subject. Under basic conditions, 59 labels were identified in open coding and grouped into 14 general concepts; by combining these categories and the relationships between them, five final internal categories emerged (culture, intelligence, marketing, environment, and ultra-powers). The intervening conditions comprised five overall scopes: social, economic, cultural, legal, and political factors, collectively termed the macro environment. For the strategies, eight labels emerged in open coding that covered the internal and external challenges of managing the relationship between the two sides: knowledge and awareness, an external view, human resources, building an organizational culture, the parties' perspectives, a responsible unit/integrated management, laws and regulations, and marketing.
Finally, the consequences identified as a result of the strategies comprised cultural, governmental, educational, scientific, infrastructural, international, social, economic, technological, and political development, largely consistent with the strategies. The research findings could help sport managers apply scientific collaboration management and realize its consequences in sport institutions. With regard to the above results, an enduring and systematic relationship with long-term cooperation between the two sides requires strategic planning based on the cooperation of all stakeholders. Through this, a sustainable competitive advantage for both university and industry can be obtained in a turbulent, constantly changing environment. There is no doubt that a lack of vision and strategic thinking about cooperation in university and industry planning can turn opportunities into problems. Keywords: university and industry collaboration, sport industry, physical education and sport science college, service section of sport industry
Procedia PDF Downloads 382

2082 Maintenance Optimization for a Multi-Component System Using Factored Partially Observable Markov Decision Processes
Authors: Ipek Kivanc, Demet Ozgur-Unluakin
Abstract:
Over the past years, technological innovations and advancements have played an important role in the industrial world. Due to technological improvements, the degree of complexity of systems has increased. Hence, systems are becoming more uncertain as a result of this increased complexity, which in turn results in higher costs. Coping with this situation is challenging, so efficient planning of maintenance activities in such systems is becoming more essential. Partially Observable Markov Decision Processes (POMDPs) are powerful tools for stochastic sequential decision problems under uncertainty. Although maintenance optimization in a dynamic environment can be modeled as such a sequential decision problem, POMDPs are not widely used for tackling maintenance problems; however, they can be well-suited frameworks for obtaining optimal maintenance policies. In the classical representation of the POMDP framework, the system is denoted by a single node with multiple states. The main drawback of this classical approach is that the state space grows exponentially with the number of state variables. The factored representation of POMDPs, on the other hand, reduces the complexity of the state description by taking advantage of the factored structure already present in the nature of the problem. The main idea of factored POMDPs is that they can be compactly modeled through dynamic Bayesian networks (DBNs), which are graphical representations of stochastic processes, by exploiting the structure of this representation. This study aims to demonstrate how maintenance planning of dynamic systems can be modeled with factored POMDPs. An empirical maintenance planning problem for a dynamic system consisting of four partially observable components deteriorating in time is designed. To solve the empirical model, we resort to the Symbolic Perseus solver, one of the state-of-the-art factored POMDP solvers enabling approximate solutions.
We also generate predefined policies based on corrective or proactive maintenance strategies. We execute the policies on the empirical problem over many replications and compare their performances under various scenarios. The results show that the policies computed from the POMDP model are superior to the others. Acknowledgment: This work is supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK) under grant no: 117M587. Keywords: factored representation, maintenance, multi-component system, partially observable Markov decision processes
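The gain from the factored representation described above can be sketched in a few lines. The component count matches the abstract's four-component system, but the state labels and the deterioration matrix below are illustrative assumptions, and the prediction step assumes components deteriorate independently, which is the kind of conditional structure a DBN encodes:

```python
import numpy as np

n_comp, n_states = 4, 3          # four components; states: good / worn / failed (assumed)
flat_size = n_states ** n_comp   # classical flat POMDP: 3**4 = 81 joint states

# illustrative per-component deterioration matrix (rows: from-state, cols: to-state)
T = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

# factored belief: one 3-vector per component instead of a single 81-vector
beliefs = [np.array([1.0, 0.0, 0.0]) for _ in range(n_comp)]

def predict(beliefs):
    """One time step of belief propagation (prediction only, no observation update)."""
    return [b @ T for b in beliefs]

for _ in range(5):
    beliefs = predict(beliefs)
```

With dependent components or correlated observations the joint belief no longer factors exactly, which is why approximate solvers such as Symbolic Perseus are needed for the full problem.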
Procedia PDF Downloads 136

2081 Synthesis and Characterization of pH-Responsive Nanocarriers Based on POEOMA-b-PDPA Block Copolymers for RNA Delivery
Authors: Bruno Baptista, Andreia S. R. Oliveira, Patricia V. Mendonca, Jorge F. J. Coelho, Fani Sousa
Abstract:
Drug delivery systems are designed to allow adequate protection and controlled delivery of drugs to specific locations. These systems aim to reduce side effects and control the biodistribution profile of drugs, thus improving therapeutic efficacy. This study involved the synthesis of polymeric nanoparticles based on amphiphilic diblock copolymers comprising a biocompatible hydrophilic segment, poly(oligo(ethylene oxide) methyl ether methacrylate) (POEOMA), and a pH-sensitive block, poly(2-(diisopropylamino)ethyl methacrylate) (PDPA). The objective of this work was to develop polymeric pH-responsive nanoparticles to encapsulate and carry small RNAs as a model, in order to further develop non-coding RNA delivery systems with therapeutic value. The pH-responsiveness of PDPA allows the electrostatic interaction of these copolymers with nucleic acids at acidic pH, as a result of the protonation of the tertiary amine groups of this polymer at pH values below its pKa (around 6.2). Initially, the molecular weight parameters and chemical structure of the block copolymers were determined by size exclusion chromatography (SEC) and nuclear magnetic resonance (1H-NMR) spectroscopy, respectively. Then, complexation with small RNAs was verified, generating polyplexes with sizes ranging from 300 to 600 nm and encapsulation efficiencies around 80%, depending on the molecular weight of the polymers, their composition, and the concentration used. The effect of pH on the morphology of the nanoparticles was evaluated by scanning electron microscopy (SEM); it was verified that at higher pH values particles tend to lose their spherical shape. Since this work aims to develop systems for the delivery of non-coding RNAs, studies on RNA protection (contact with RNase, FBS, and trypsin) and cell viability were also carried out. It was found that the nanoparticles provide some protection against constituents of the cellular environment and have no cellular toxicity.
In summary, this research work contributes to the development of pH-sensitive polymers capable of protecting and encapsulating RNA in a relatively simple and efficient manner, to be further applied in drug delivery to specific sites where pH plays a critical role, as occurs in several cancer environments. Keywords: drug delivery systems, pH-responsive polymers, POEOMA-b-PDPA, small RNAs
Procedia PDF Downloads 259

2080 Community Perceptions and Attitudes Regarding Wildlife Crime in South Africa
Authors: Louiza C. Duncker, Duarte Gonçalves
Abstract:
Wildlife crime is a complex problem with many interconnected facets, which are generally responded to in parts or fragments in efforts to "break down" the complexity into manageable components. However, fragmentation increases complexity as coherence and cooperation become diluted. A whole-of-society approach has been developed toward finding a common goal and an integrated approach to preventing wildlife crime. As part of this development, research was conducted in rural communities adjacent to conservation areas in South Africa to define and comprehend the challenges they face and to understand their perceptions of wildlife crime. The results showed that the perceptions of community members varied: most were in favor of conservation and of protecting rhinos, but only if they derived adequate benefit from it. Regardless of gender, income level, education level, or access to services, conservation was perceived as both good and bad by the same people. Even though people in the communities are poor, a willingness to stop rhino poaching does exist among them, but their perception that parks do not care about people triggered an unwillingness to stop, prevent, or report poaching. Understanding the nuances, the history, the interests and values of community members, and the drivers behind poaching mind-sets (intrinsic or driven by transnational organized crime) is imperative to creating sustainable and resilient communities on multiple levels that make a substantial positive impact on people's lives while also conserving wildlife for posterity. Keywords: community perceptions, conservation, rhino poaching, whole-of-society approach, wildlife crime
Procedia PDF Downloads 238

2079 The Friction of Oil Contaminated Granular Soils; Experimental Study
Authors: Miron A., Tadmor R., Pinkert S.
Abstract:
Soil contamination is a pressing environmental concern, drawing considerable focus due to its adverse ecological and health outcomes and the frequent occurrence of contamination incidents in recent years. The interaction between an oil pollutant and the host soil can alter the mechanical properties of the soil in a manner that crucially affects engineering challenges associated with the stability of soil systems. The geotechnical investigation of contaminated soils has gained momentum since the Gulf War in the 1990s, when a massive amount of oil was spilled into the ocean. Over recent years, various types of soil contamination have been studied to understand the impact of pollution type, uncovering the mechanical complexity that arises not just from the pollutant type but also from the properties of the host soil and the interplay between them. This complexity is associated with diametrically opposite effects in different soil types. For instance, while certain oils may enhance the frictional properties of cohesive soils, they can reduce the friction in granular soils. This striking difference can be attributed to the different mechanisms at play: physico-chemical interactions predominate in the former case, whereas lubrication effects are more significant in the latter. This study introduces an empirical law designed to quantify the mechanical effect of oil contamination in granular soils, factoring in the properties of both the contaminating oil and the host soil. The law is obtained through comprehensive experimental research spanning a wide array of oil types and soils with distinct configurations and morphologies. By integrating these diverse data points, the law facilitates accurate predictions of how oil contamination modifies the frictional characteristics of general granular soils. Keywords: contaminated soils, lubrication, friction, granular media
Procedia PDF Downloads 55

2078 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal
Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert
Abstract:
We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification. Keywords: computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving
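A minimal CPU sketch of the core loop described above, assuming a plane-based quadric error per vertex and a greedy batch of vertices with pairwise-disjoint 1-rings (a stand-in for the paper's per-vertex boundaries that make simultaneous removal safe); the octahedron test mesh, the half-edge collapse target, and the selection rule are illustrative assumptions:

```python
import numpy as np

def face_plane(a, b, c):
    """Plane [nx, ny, nz, d] of a triangle, with n a unit normal and n.x + d = 0."""
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)
    return np.append(n, -n @ a)

def vertex_quadrics(verts, faces):
    """Per-vertex quadric: sum of the fundamental error quadrics of incident faces."""
    Q = [np.zeros((4, 4)) for _ in verts]
    for f in faces:
        p = face_plane(*verts[list(f)])
        K = np.outer(p, p)
        for vi in f:
            Q[vi] += K
    return Q

def collapse_errors(verts, faces, Q):
    """Cheapest half-edge collapse per vertex: error of moving v onto a neighbour."""
    nbrs = {i: set() for i in range(len(verts))}
    for f in faces:
        for a in f:
            nbrs[a].update(set(f) - {a})
    errs = np.full(len(verts), np.inf)
    for v, ns in nbrs.items():
        for t in ns:
            h = np.append(verts[t], 1.0)
            errs[v] = min(errs[v], h @ Q[v] @ h)
    return errs, nbrs

# octahedron test mesh: 6 vertices, 8 triangular faces
verts = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                  [0, -1, 0], [0, 0, 1], [0, 0, -1]], float)
faces = [(0, 2, 4), (2, 1, 4), (1, 3, 4), (3, 0, 4),
         (2, 0, 5), (1, 2, 5), (3, 1, 5), (0, 3, 5)]

Q = vertex_quadrics(verts, faces)
errs, nbrs = collapse_errors(verts, faces, Q)

# batch selection: greedily pick low-error vertices whose 1-rings do not overlap,
# so the collapses could run in parallel without foldovers between them
batch, blocked = [], set()
for v in np.argsort(errs):
    if v not in blocked:
        batch.append(int(v))
        blocked.update(nbrs[v] | {v})
```

Each iteration of the paper's method would recompute the errors after removing the batch; the GPU ordering and the foldover tests within a single collapse are not reproduced here.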
Procedia PDF Downloads 367

2077 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods as a promising approach for making clinical predictions. Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions
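The attribute-selection step described above can be sketched with a rank-based (Mann-Whitney) AUC, which is one standard way to compute it; the toy dataset and the rule of preferring the attribute whose AUC is farthest from 0.5 are assumptions for the sketch, and the paper's tree-growing and patient-specific path construction are not reproduced:

```python
import numpy as np

def average_ranks(x):
    """1-based ranks, with tied values given their average rank."""
    x = np.asarray(x)
    order = np.argsort(x, kind="mergesort")
    ranks = np.empty(len(x))
    sx = x[order]
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and sx[j + 1] == sx[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def auc(scores, labels):
    """AUC via the Mann-Whitney statistic divided by n_pos * n_neg."""
    labels = np.asarray(labels)
    r = average_ranks(scores)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (r[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def best_attribute(X, y):
    """Pick the attribute whose values best rank the outcome (AUC farthest from 0.5)."""
    scores = []
    for j in range(X.shape[1]):
        a = auc(X[:, j], y)
        scores.append(max(a, 1.0 - a))
    return int(np.argmax(scores))

# toy data: attribute 0 tracks the outcome perfectly, attribute 1 is noise
X = np.array([[1, 0], [1, 1], [0, 0], [0, 1], [1, 1], [0, 0]])
y = np.array([1, 1, 0, 0, 1, 0])
```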
Procedia PDF Downloads 479

2076 Benthic Cover in Coral Reef Environments under Influence of Submarine Groundwater Discharges
Authors: Arlett A. Rosado-Torres, Ismael Marino-Tapia
Abstract:
Shifts in the benthic cover of coral-dominated systems toward macroalgae dominance are widely studied worldwide. Watershed pollutants are potentially as important as overfishing in causing such phase shifts. In certain regions of the world, most continental inputs arrive through submarine groundwater discharges (SGD), which can play a significant ecological role because their nutrient concentrations are usually greater than those found in surface seawater. These stressors have adversely affected coral reefs, particularly in the Caribbean. Measurements of benthic cover (video tracing with a GoPro camera), reef roughness (acoustic estimates with an acoustic Doppler current profiler and a differential GPS), thermohaline conditions (a conductivity-temperature-depth (CTD) instrument), and nutrients were taken at different sites in the reef lagoon of Puerto Morelos, Q. Roo, Mexico, including sites with and without the influence of SGD. The results suggest a link between SGD, macroalgae cover, and structural complexity. Point water samples and data series from a CTD-Diver confirm the presence of the SGD. At the site with SGD, macroalgae cover is larger than at the other sites. To establish a causal link between this phase shift and SGD, the DELFT 3D hydrodynamic model (FLOW and WAVE modules) was run under different environmental conditions and discharge magnitudes. The model was validated using measurements from oceanographic instruments anchored in the lagoon and forereef. The SGD consistently favors macroalgae populations and affects the structural complexity of the reef. Keywords: hydrodynamic model, macroalgae, nutrients, phase shift
Procedia PDF Downloads 153

2075 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases
Authors: Mohammad A. Bani-Khaled
Abstract:
In this work, we use the Discrete Proper Orthogonal Decomposition transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical databases obtained from finite element simulations. The outcomes of this work will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, both in large-scale structures (aeronautical structures) and in nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and of the cross-sections in particular, it is necessary to use computational mechanics to simulate the dynamics numerically. When using numerical computational techniques, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The Time-Proper Orthogonal Decomposition transform is a powerful tool for processing databases of the dynamics. It will be used to study the coupled dynamics of basic thin-walled structures, which are ideal as a basis for a systematic study of coupled dynamics in structures of complex geometry. Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams
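The snapshot form of POD applied above can be sketched as an SVD of a (time x space) data matrix. The two-mode synthetic signal below is an assumed stand-in for a finite element database; with two underlying spatial fields, the first two POD modes capture essentially all of the fluctuation energy:

```python
import numpy as np

# synthetic "numerical database": snapshots of a response built from two coupled
# spatial modes (stand-ins for, e.g., a bending field and a torsional field)
x = np.linspace(0.0, 1.0, 200)            # axial coordinate along the beam
t = np.linspace(0.0, 10.0, 80)            # time instants of the simulation
mode1 = np.sin(np.pi * x)
mode2 = np.sin(2.0 * np.pi * x)
snapshots = (np.outer(np.cos(2.0 * t), mode1)
             + 0.3 * np.outer(np.sin(5.0 * t), mode2))

# snapshot POD: SVD of the mean-subtracted (time x space) data matrix
fluct = snapshots - snapshots.mean(axis=0)
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)

pod_modes = Vt                    # rows: spatial POD modes
energy = s**2 / np.sum(s**2)      # fraction of fluctuation energy per mode
```

The relative energies of the leading modes give a direct measure of how strongly the fields are coupled in the database, which is the kind of diagnostic the study extracts from its simulations.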
Procedia PDF Downloads 420

2074 Sexualization of Women in Nigerian Magazine Advertisements
Authors: Kehinde Augustina Odukoya
Abstract:
This study examines the portrayal of women in Nigerian magazine advertisements, with the aim of investigating whether women are sexualized in these advertisements. To achieve this aim, content analyses of 61 magazine advertisements from 5 different categories of magazines were carried out: a general interest magazine (Genevieve), a fashion magazine (Hints Complete Fashion), a men's magazine (Mode), a women's magazine (Totally Whole), and a relationship magazine (Forever). Erving Goffman's 1979 frame analysis and Kang's two additional coding categories were used to investigate the sexualization of women. Findings show that women are used for decorative purposes and objectified in over 70 percent of the advertisements analyzed, and that there is sexualization of women in magazine advertisements, with women appearing nude in 57.4 percent of the advertisements. Keywords: advertisements, magazine, sexualization, women
Procedia PDF Downloads 365