Search results for: coding complexity metric mccabe
2192 The Regulation of the Cancer Epigenetic Landscape Lies in the Realm of the Long Non-coding RNAs
Authors: Ricardo Alberto Chiong Zevallos, Eduardo Moraes Rego Reis
Abstract:
Pancreatic adenocarcinoma (PDAC) patients have a less than 10% 5-year survival rate. PDAC has no defined diagnostic and prognostic biomarkers. Gemcitabine is the first-line drug in PDAC and several other cancers. Long non-coding RNAs (lncRNAs) contribute to the tumorigenesis and are potential biomarkers for PDAC. Although lncRNAs aren’t translated into proteins, they have important functions. LncRNAs can decoy or recruit proteins from the epigenetic machinery, act as microRNA sponges, participate in protein translocation through different cellular compartments, and even promote chemoresistance. The chromatin remodeling enzyme EZH2 is a histone methyltransferase that catalyzes the methylation of histone 3 at lysine 27, silencing local expression. EZH2 is ambivalent, it can also activate gene expression independently of its histone methyltransferase activity. EZH2 is overexpressed in several cancers and interacts with lncRNAs, being recruited to a specific locus. EZH2 can be recruited to activate an oncogene or silence a tumor suppressor. The lncRNAs misregulation in cancer can result in the differential recruitment of EZH2 and in a distinct epigenetic landscape, promoting chemoresistance. The relevance of the EZH2-lncRNAs interaction to chemoresistant PDAC was assessed by Real Time quantitative PCR (RT-qPCR) and RNA Immunoprecipitation (RIP) experiments with naïve and gemcitabine-resistant PDAC cells. The expression of several lncRNAs and EZH2 gene targets was evaluated contrasting naïve and resistant cells. Selection of candidate genes was made by bioinformatic analysis and literature curation. Indeed, the resistant cell line showed higher expression of chemoresistant-associated lncRNAs and protein coding genes. RIP detected lncRNAs interacting with EZH2 with varying intensity levels in the cell lines. During RIP, the nuclear fraction of the cells was incubated with an antibody for EZH2 and with magnetic beads. The RNA precipitated with the beads-antibody-EZH2 complex was isolated and reverse transcribed. The presence of candidate lncRNAs was detected by RT-qPCR, and the enrichment was calculated relative to INPUT (total lysate control sample collected before RIP). The enrichment levels varied across the several lncRNAs and cell lines. The EZH2-lncRNA interaction might be responsible for the regulation of chemoresistance-associated genes in multiple cancers. The relevance of the lncRNA-EZH2 interaction to PDAC was assessed by siRNA knockdown of a lncRNA, followed by the analysis of the EZH2 target expression by RT-qPCR. The chromatin immunoprecipitation (ChIP) of EZH2 and H3K27me3 followed by RT-qPCR with primers for EZH2 targets also assess the specificity of the EZH2 recruitment by the lncRNA. This is the first report of the interaction of EZH2 and lncRNAs HOTTIP and PVT1 in chemoresistant PDAC. HOTTIP and PVT1 were described as promoting chemoresistance in several cancers, but the role of EZH2 is not clarified. For the first time, the lncRNA LINC01133 was detected in a chemoresistant cancer. The interaction of EZH2 with LINC02577, LINC00920, LINC00941, and LINC01559 have never been reported in any context. The novel lncRNAs-EZH2 interactions regulate chemoresistant-associated genes in PDAC and might be relevant to other cancers. 
Therapies targeting EZH2 alone weren’t successful, and a combinatorial approach also targeting the lncRNAs interacting with it might be key to overcoming chemoresistance in several cancers.
Keywords: epigenetics, chemoresistance, long non-coding RNAs, pancreatic cancer, histone modification
Procedia PDF Downloads 96
2191 Generalized Chaplygin Gas and Varying Bulk Viscosity in Lyra Geometry
Authors: A. K. Sethi, R. N. Patra, B. Nayak
Abstract:
In this paper, we have considered the Friedmann-Robertson-Walker (FRW) metric with a viscous generalized Chaplygin gas in the context of Lyra geometry. The viscosity is considered in two different ways (i.e., zero viscosity and a non-constant, ρ (rho)-dependent bulk viscosity), using a constant deceleration parameter. It is concluded that, for a special case, the viscous generalized Chaplygin gas reduces to the modified Chaplygin gas. The presented model indicates the presence of Chaplygin gas in the Universe. Observational constraints are applied, and the physical and geometrical nature of the Universe is discussed.
Keywords: bulk viscosity, Lyra geometry, generalized Chaplygin gas, cosmology
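For reference, the quantities named above can be written out as follows; the notation and sign conventions are assumed here and are not transcribed from the paper:
\[
ds^{2} = dt^{2} - a^{2}(t)\left[\frac{dr^{2}}{1-kr^{2}} + r^{2}\left(d\theta^{2}+\sin^{2}\theta\,d\phi^{2}\right)\right],
\]
\[
p_{\mathrm{GCG}} = -\frac{A}{\rho^{\alpha}}\;\;(0<\alpha\le 1),\qquad
p_{\mathrm{MCG}} = B\rho - \frac{A}{\rho^{\alpha}},\qquad
\bar{p} = p - 3\zeta H,\quad H=\frac{\dot a}{a},
\]
i.e., the FRW line element, the generalized and modified Chaplygin gas equations of state, and the effective pressure with bulk-viscosity coefficient \(\zeta\). When \(\zeta\) is allowed to depend on \(\rho\), the viscous term contributes an additional density-dependent piece to the pressure, which is how a reduction of the viscous generalized Chaplygin gas to a modified Chaplygin gas can arise in special cases.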
Procedia PDF Downloads 177
2190 Variations in Spatial Learning and Memory across Natural Populations of Zebrafish, Danio rerio
Authors: Tamal Roy, Anuradha Bhat
Abstract:
Cognitive abilities aid fishes in foraging, avoiding predators & locating mates. Factors like predation pressure & habitat complexity govern learning & memory in fishes. This study aims to compare spatial learning & memory across four natural populations of zebrafish. Zebrafish, a small cyprinid inhabits a diverse range of freshwater habitats & this makes it amenable to studies investigating role of native environment in spatial cognitive abilities. Four populations were collected across India from waterbodies with contrasting ecological conditions. Habitat complexity of the water-bodies was evaluated as a combination of channel substrate diversity and diversity of vegetation. Experiments were conducted on populations under controlled laboratory conditions. A square shaped spatial testing arena (maze) was constructed for testing the performance of adult zebrafish. The square tank consisted of an inner square shaped layer with the edges connected to the diagonal ends of the tank-walls by connections thereby forming four separate chambers. Each of the four chambers had a main door in the centre. Each chamber had three sections separated by two windows. A removable coloured window-pane (red, yellow, green or blue) identified each main door. A food reward associated with an artificial plant was always placed inside the left-hand section of the red-door chamber. The position of food-reward and plant within the red-door chamber was fixed. A test fish would have to explore the maze by taking turns and locate the food inside the right-side section of the red-door chamber. Fishes were sorted from each population stock and kept individually in separate containers for identification. At a time, a test fish was released into the arena and allowed 20 minutes to explore in order to find the food-reward. In this way, individual fishes were trained through the maze to locate the food reward for eight consecutive days. The position of red door, with the plant and the reward, was shuffled every day. Following training, an intermission of four days was given during which the fishes were not subjected to trials. Post-intermission, the fishes were re-tested on the 13th day following the same protocol for their ability to remember the learnt task. Exploratory tendencies and latency of individuals to explore on 1st day of training, performance time across trials, and number of mistakes made each day were recorded. Additionally, mechanism used by individuals to solve the maze each day was analyzed across populations. Fishes could be expected to use algorithm (sequence of turns) or associative cues in locating the food reward. Individuals of populations did not differ significantly in latencies and tendencies to explore. No relationship was found between exploration and learning across populations. High habitat-complexity populations had higher rates of learning & stronger memory while low habitat-complexity populations had lower rates of learning and much reduced abilities to remember. High habitat-complexity populations used associative cues more than algorithm for learning and remembering while low habitat-complexity populations used both equally. The study, therefore, helped understand the role of natural ecology in explaining variations in spatial learning abilities across populations.Keywords: algorithm, associative cue, habitat complexity, population, spatial learning
Procedia PDF Downloads 290
2189 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection
Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye
Abstract:
Text line segmentation is an important step in document image processing. It is a labeling process that assigns the same label, based on a distance-metric probability, to spatially aligned units. Text line detection techniques have been implemented successfully mainly for printed documents. However, processing handwritten texts, especially unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed: the spaces between text lines may not be obvious, complicated by the nature of handwriting and the overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation represent a leading challenge in handwritten document image processing. Text line detection methods that rely on the traditional global projection profile of the document cannot efficiently cope with variable skew angles between different text lines, so formulating a single horizontal line as a separator is often not effective. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the text image across its width into vertical strips of about 5% each. Within each 5% strip, the histogram of horizontal runs is projected. We work under the assumption that text lines appearing within a single strip are almost parallel to one another. The algorithm slides a window through the first vertical strip on the left side of the page and identifies each new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, and its ending point is the corresponding minimum on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing connected components that cross strips by associating each such component with either the line above or the line below; this association decision is made using the probability obtained from a distance metric. The technique outperforms the global projection profile for text line segmentation and is robust for skewed documents and for documents with lines running into each other.
Keywords: connected-component, projection-profile, segmentation, text-line
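As a rough illustration of the strip-wise projection idea described above, the Python sketch below splits a binarized page into roughly 5% vertical strips and finds valleys in each strip's horizontal projection. It is not the authors' implementation; the binarization convention, the smoothing-free valley test, and the parameter names are assumptions.

```python
import numpy as np

def strip_projection_valleys(binary_img, strip_frac=0.05, min_gap=5):
    """Estimate text-line separators per vertical strip of a binarized page.

    binary_img: 2D array with 1 for ink pixels and 0 for background.
    Returns a list with, for each strip, the row indices of projection valleys.
    """
    h, w = binary_img.shape
    strip_w = max(1, int(round(w * strip_frac)))   # ~5% of the page width
    valleys_per_strip = []
    for x0 in range(0, w, strip_w):
        strip = binary_img[:, x0:x0 + strip_w]
        profile = strip.sum(axis=1)                # horizontal-run histogram of the strip
        valleys = []
        for y in range(1, h - 1):
            # a valley = local minimum of the profile, i.e. a gap between text lines
            if profile[y] <= profile[y - 1] and profile[y] < profile[y + 1]:
                if not valleys or y - valleys[-1] >= min_gap:
                    valleys.append(y)
        valleys_per_strip.append(valleys)
    return valleys_per_strip

# Valleys in consecutive strips can then be linked into piecewise-linear separators,
# letting each text line follow its own local skew instead of one global baseline.
```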
Procedia PDF Downloads 124
2188 Impact of Two Xenobiotics in Mosquitofish: Gambusia affinis: Several Approaches
Authors: Chouahda Salima, Soltani Noureddine
Abstract:
The present study is a part of biological control against mosquitoes. It aims to assess the impact of two xenobiotics (a selective insect growth regulator: halofenozide and heavy metals: cadmium, more toxic and widespread in the region) in mosquitofish: Gambusia affinis. Several approaches were examined: Acute toxicity of cadmium and halofenozide: The acute toxicity of cadmium and halofenozide was examined in juvenile and adult males and females of G. affinis at different concentrations, cadmium causes mortality of the species studied with a relation dose-response. In laboratory conditions, the impact of cadmium was determined on two biomarkers of environmental stress: glutathione and acetylcholinesterase. The results show that the juvenile followed by adult males are more susceptible than adult females, while the halofenozide does not have any effect on the mortality of juvenile and adult males and females of G.affinis. Chronic toxicity of cadmium and halofenozide: both xenobiotics were added to the water fish raising at different doses tested in juveniles and adults males and females during two months of experience. Growth and metric indices; results show that halofenozide added to the water juveniles of G. affinis has no effect on their growth (length and weight). On the other side, the cadmium at the dose 5 µg/L shows a higher toxicity against juvenile, where he appears to reduce significantly their linear growth and weight. In females, the both xenobiotics have significant effects on metric indices, but these effects are more important on the hepatosomatic index that the gonadosomatic index and the coefficient of condition. Biomarkers; acetylcholinesterase (AChE), glutathione S-transferase (GST) and glutathione (GSH) used in assessing of environmental stress were measured in juveniles and adults males and females. The response of these biomarkers reveals an inhibition of AChE specific activity, an induction of GST activity, and decrease of GSH rates in juveniles in the end of experiment and during chronic treatment adult males and females. The effect of these biomarkers is more pronounced in females compared to males and juveniles. These different biomarkers have a similar profile for the duration of exposure.Keywords: gambusia affinis, insecticide, heavy metal, morphology, biomarkers, chronic toxicity, acute toxicity, pollution
Procedia PDF Downloads 314
2187 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought
Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan
Abstract:
Artificial intelligence (AI) can be held accountable for its detrimental impacts. This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects it generates. Accountability comprises two integral aspects: adherence to legal and ethical standards and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability," facing the challenges of the complexity of artificial intelligence's system and its effects. Then, this article proposes to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of A.I., with fractional accountability between a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and are multiplied in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems as fully ethically non-neutral actors is put forward by a revealing ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility is induced by the multiplicity and distancing between the actors. Thus, a dilution of responsibility is induced by a split in decision-making between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Accountability is confronted with the challenge of transparency of complex and scalable algorithmic systems, non-human actors self-learning via big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging the non-ethical neutrality of algorithmic systems inherently imbued with the values and biases of their creators and society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. Aligning with the principle of organizing recursiveness, akin to the "transparency" of the system, it promotes a systemic analysis to account for the induced effects and guides the incorporation of modifications into the system to rectify deviations and reintroduce modifications into the system to rectify its drifts. In conclusion, this contribution serves as an inception for contemplating the accountability of "artificial intelligence" systems despite the evident ethical implications and potential deviations. 
Edgar Morin's principles, providing a lens to contemplate this complexity, offer valuable perspectives to address these challenges concerning accountability.
Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin
Procedia PDF Downloads 63
2186 Educational Leadership Preparation Program Review of Employer Satisfaction
Authors: Glenn Koonce
Abstract:
There is a need to address the improvement of university educational leadership preparation programs through the processes of accreditation and continuous improvement. The program faculty in a university in the eastern part of the United States has incorporated an employer satisfaction focus group to address their national accreditation standard so that employers are satisfied with completers' preparation for the position of principal or assistant principal. Using the Council for the Accreditation of Educator Preparation (CAEP) required proficiencies, the following research questions are investigated: 1) what proficiencies do completers perform the strongest? 2) what proficiencies need to be strengthened? 3) what other strengths beyond the required proficiencies do completers demonstrate? 4) what other areas of responsibility beyond the required proficiencies do completers demonstrate? and 5) how can the program improve in preparing candidates for their positions? This study focuses on employers of one public school district that has a large number of educational leadership completers employed as principals and assistant principals. Central office directors who evaluate principals and principals who evaluate assistant principals are focus group participants. Construction of the focus group questions is a result of recommendations from an accreditation regulatory specialist, reviewed by an expert panel, and piloted by an experienced focus group leader. The focus group session was audio recorded, transcribed, and analyzed using the NVivo Version 14 software. After constructing folders in NVivo, the focus group transcript was loaded and skimmed by diagnosing significant statements and assessing core ideas for developing primary themes. These themes were aligned to address the research questions. From the transcript, codes were assigned to the themes and NVivo provided a coding hierarchy chart or graphical illustration for framing the coding. A final report of the coding process was designed using the primary themes and pertinent codes that were supported in excerpts from the transcript. The outcome of this study is to identify themes that can provide evidence that the educational leadership program is meeting its mission to improve PreK-12 student achievement through well-prepared completers who have achieved the position of principal or assistant principal. The considerations will be used to derive a composite profile of employers' satisfaction with program completers with the capacity to serve, influence, and thrive as educational leaders. Analysis of the idealized themes will result in identifying issues that may challenge university educational leadership programs to improve. Results, conclusions, and recommendations are used for continuous improvement, which is another national accreditation standard required for the program.Keywords: educational leadership preparation, CAEP accreditation, principal & assistant principal evaluations, continuous improvement
Procedia PDF Downloads 30
2185 From Mobility to Complexity: French Language Use among Algerian Doctoral Postgraduates in Scotland
Authors: Hadjer Chellia
Abstract:
The study explores the phenomenon of second language use in a migratory setting and uses the case of Algerian international students in Scotland, United Kingdom. The linguistic history of Algeria reveals that French language has a high status among the Algerians’ verbal repertoires and Algerian English students consider it as a language of prestige. With mobility of some of these students towards Scotland -in the guise of internationalization of higher education, mobility and exchange programs, the transition was deemed to bring more complexity to their pre-migratory linguistic repertoires and resulted into their French language- being endangered and threatened by a potential shift to English. The study employed semi-structured interviews among six Ph.D. ethnically related students, and the main aim behind that is to explore their current experiences with regards to French language use and to provide an account of the factors which assist in shifting to English as a second language instead. The six participants identified in interviews were further invited to focus group sessions based on an in-group interaction fashion to discuss different topics using heritage languages. This latter was opted for as part of the methodology as a means to observe their real linguistic practice and to investigate the link between behaviors and previous perceptions. The findings detect a variety of social, individual and socio-psychological factors that would contribute in refining the concept of language shift among newly established émigré communities with short stay vis a vis the linguistic outcomes of immigrants with long stay, across generational basis that was –to some extent-the focus of previous research on language shift. The results further reveal a mismatch between students' perceptions and observed behaviors. The research is then largely relevant to international students’ sociolinguistic experience of study abroad.Keywords: complexity, mobility, potential shift, sociolinguistic experience
Procedia PDF Downloads 167
2184 Complex Decision Rules in the Form of Decision Trees
Authors: Avinash S. Jagtap, Sharad D. Gore, Rajendra G. Gurao
Abstract:
Decision rules become more and more complex as the number of conditions increases. As a consequence, the complexity of the decision rule also influences the time complexity of its computer implementation. Consider, for example, a decision that depends on four conditions A, B, C and D. For simplicity, suppose each of these four conditions is binary. Even then the decision rule will consist of 16 lines, where each line will be of the form: if A and B and C and D, then action 1; if A and B and C but not D, then action 2; and so on. While executing this decision rule, the four conditions are checked line by line until all the conditions in a line are satisfied, so the number of logical comparisons ranges from a minimum of 4 to a maximum of 64. This paper proposes to represent a complex decision rule in the form of a decision tree. A decision tree splits the cases into branches every time a condition is checked, and each branching discards the half of the remaining cases that does not satisfy the condition just checked. As a result, every path through the decision tree involves only four logical comparisons and is therefore significantly simpler than the corresponding flat decision rule. The conclusion of this paper is that every complex decision rule can be represented as a decision tree, and the decision tree is mathematically equivalent to, but computationally much simpler than, the original complex decision rule.
Keywords: strategic, tactical, operational, adaptive, innovative
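A small Python sketch contrasting the two representations discussed above: the flat 16-line rule, which re-tests all four conditions on every line, versus an equivalent tree-shaped evaluation that resolves any case after exactly four comparisons. The action numbering is illustrative only.

```python
from itertools import product

def action_flat(a, b, c, d):
    """Flat rule: each 'line' re-tests all four conditions (up to 16 x 4 = 64 comparisons)."""
    for i, (pa, pb, pc, pd) in enumerate(product([True, False], repeat=4), start=1):
        if a == pa and b == pb and c == pc and d == pd:
            return i                      # "action i"
    raise ValueError("unreachable: the 16 lines are exhaustive")

def action_tree(a, b, c, d):
    """Equivalent decision tree: any input is resolved after exactly 4 comparisons."""
    i = 1
    for bit in (a, b, c, d):              # each branching halves the remaining cases
        i = 2 * i + (0 if bit else 1)
    return i - 15                         # map leaf index to the same action numbering 1..16

# Both functions agree on every input, but the tree never revisits a condition.
for case in product([True, False], repeat=4):
    assert action_flat(*case) == action_tree(*case)
```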
Procedia PDF Downloads 288
2183 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook
Authors: Chien-Jen Liu, Shu Ching Yang
Abstract:
Using the technology acceptance model (TAM), this study examined the external variables of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, evincing a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures that are proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness
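For readers unfamiliar with the model, the TAM-style path structure being estimated here, with technological complexity (TC) as the external variable, can be sketched as the following system of structural relations; this is the generic TAM specification in assumed notation, not the exact equations or coefficients reported in the study:
\[
\mathrm{PEOU} = \gamma_{1}\,\mathrm{TC} + \varepsilon_{1}, \qquad
\mathrm{PU} = \gamma_{2}\,\mathrm{TC} + \beta_{1}\,\mathrm{PEOU} + \varepsilon_{2},
\]
\[
\mathrm{ATT} = \beta_{2}\,\mathrm{PU} + \beta_{3}\,\mathrm{PEOU} + \varepsilon_{3}, \qquad
\mathrm{BI} = \beta_{4}\,\mathrm{ATT} + \beta_{5}\,\mathrm{PU} + \varepsilon_{4}.
\]
Partial least squares estimates these paths and the associated explained-variance (R²) values for each endogenous construct simultaneously.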
Procedia PDF Downloads 346
2182 Optimizing the Capacity of a Convolutional Neural Network for Image Segmentation and Pattern Recognition
Authors: Yalong Jiang, Zheru Chi
Abstract:
In this paper, we study the factors that determine the capacity of a Convolutional Neural Network (CNN) model and propose ways to evaluate and adjust the capacity of a CNN model so that it best matches a specific pattern recognition task. Firstly, a scheme is proposed to adjust the number of independent functional units within a CNN model to make it better fitted to a task. Secondly, the number of independent functional units in the capsule network is adjusted to fit it to the training dataset. Thirdly, a method based on Bayesian GAN is proposed to enrich the variance in the current dataset and thereby increase its complexity. Experimental results on the PASCAL VOC 2010 Person Part dataset and the MNIST dataset show that, in both conventional CNN models and capsule networks, the number of independent functional units is an important factor that determines the capacity of a network model. By adjusting the number of functional units, the capacity of a model can better match the complexity of a dataset.
Keywords: CNN, convolutional neural network, capsule network, capacity optimization, character recognition, data augmentation, semantic segmentation
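A minimal PyTorch-style sketch of the general idea of exposing the number of functional units (here, convolutional filters) as a single capacity knob that can be swept to match a dataset; the architecture, layer sizes, and the name `width` are illustrative assumptions, not the authors' network.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy classifier whose capacity scales with a single `width` parameter."""
    def __init__(self, width: int = 16, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, width, kernel_size=3, padding=1),       # `width` functional units
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(width, width * 2, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(width * 2, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Sweeping `width` (e.g. 8, 16, 32) and comparing validation accuracy is one way to match
# model capacity to dataset complexity, in the spirit of the abstract above.
for width in (8, 16, 32):
    model = SmallCNN(width)
    n_params = sum(p.numel() for p in model.parameters())
    print(width, n_params)
```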
Procedia PDF Downloads 155
2181 Multi-Sensory Coding as Intervention Therapy for ESL Spellers with Auditory Processing Delays: A South African Case-Study
Authors: A. Van Staden, N. Purcell
Abstract:
Spelling development is complex and multifaceted and relies on several cognitive-linguistic processes. This paper explored the spelling difficulties of English second language learners with auditory processing delays. This empirical study aims to address these issues by means of an intervention design. Specifically, the objectives are: (a) to develop and implement a multi-sensory spelling program for second language learners with auditory processing difficulties (APD) for a period of 6 months; (b) to assess the efficacy of the multi-sensory spelling program and whether this intervention could significantly improve experimental learners' spelling, phonological awareness, and processing (PA), rapid automatized naming (RAN), working memory (WM), word reading and reading comprehension; and (c) to determine the relationship (or interplay) between these cognitive and linguistic skills (mentioned above), and how they influence spelling development. Forty-four English, second language learners with APD were sampled from one primary school in the Free State province. The learners were randomly assigned to either an experimental (n=22) or control group (n=22). During the implementation of the spelling program, several visual, tactile and kinesthetic exercises, including the utilization of fingerspelling were introduced to support the experimental learners’ (N = 22) spelling development. Post-test results showed the efficacy of the multi-sensory spelling program, with the experimental group who were trained in utilising multi-sensory coding and fingerspelling outperforming learners from the control group on the cognitive-linguistic, spelling and reading measures. The results and efficacy of this multi-sensory spelling program and the utilisation of fingerspelling for hearing second language learners with APD open up innovative perspectives for the prevention and targeted remediation of spelling difficulties.Keywords: English second language spellers, auditory processing delays, spelling difficulties, multi-sensory intervention program
Procedia PDF Downloads 137
2180 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes
Authors: Igor A. Krichtafovitch
Abstract:
The evolutionary processes are not linear. Long periods of quiet and slow development turn to rather rapid emergences of new species and even phyla. During Cambrian explosion, 22 new phyla were added to the previously existed 3 phyla. Contrary to the common credence the natural selection or a survival of the fittest cannot be accounted for the dominant evolution vector which is steady and accelerated advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts including panspermia and intelligent design propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing the information in digital and analog forms. Such supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of intelligent creative action of the biosphere supercomputer. The biological evolution is driven by growing amount of information stored in the living organisms and increasing complexity of the biosphere as a single organism. Main evolutionary vector is not a survival of the fittest but an accelerated growth of the computational complexity of the living organisms. The following postulates may summarize the proposed hypothesis: biological evolution as a natural life origin and development is a reality. Evolution is a coordinated and controlled process. One of evolution’s main development vectors is a growing computational complexity of the living organisms and the biosphere’s intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information is acting like a software stored in and controlled by the biosphere. Random mutations trigger this software, as is stipulated by Darwinian Evolution Theories, and it is further stimulated by the growing demand for the Biosphere’s global memory storage and computational complexity. Greater memory volume requires a greater number and more intellectually advanced organisms for storing and handling it. More intricate organisms require the greater computational complexity of biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerated evolutionary dynamic. New species emerge when two conditions are met: a) crucial environmental changes occur and/or global memory storage volume comes to its limit and b) biosphere computational complexity reaches critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. The hypothesis logically resolves many puzzling problems with the current state evolution theory such as speciation, as a result of GM purposeful design, evolution development vector, as a need for growing global intelligence, punctuated equilibrium, happening when two above conditions a) and b) are met, the Cambrian explosion, mass extinctions, happening when more intelligent species should replace outdated creatures.Keywords: supercomputer, biological evolution, Darwinism, speciation
Procedia PDF Downloads 166
2179 Environment Problems of Energy Exploitation and Utilization in Nigeria
Authors: Aliyu Mohammed Lawal
Abstract:
The problems placed on the environment as a result of energy generation and usage in Nigeria are: potential damage to environmental health by CO, CO2, SOx and NOx, effluent gas emissions, and global warming. For instance, in 2004 Nigeria's energy consumption was 58% oil and 34% natural gas, yet about 94 million metric tons of CO2 were emitted, of which 64% came from fossil fuels and about 35% from fuel wood. The findings of this research on how to alleviate these problems are that long-term sustainable development solutions should be pursued globally, energy should be used more rationally, renewable energy resources should be exploited, and existing emissions should be controlled to within tolerable limits, because the increase in energy demand in Nigeria places enormous strain on current energy facilities.
Keywords: effluent gas, emissions, NOx, SOx
Procedia PDF Downloads 382
2178 Examining the Effects of College Education on Democratic Attitudes in China: A Regression Discontinuity Analysis
Authors: Gang Wang
Abstract:
Education is widely believed to be a prerequisite for democracy and civil society, but the causal link between education and outcome variables is usually hard to identify. This study applies a fuzzy regression discontinuity design to examine the effects of college education on democratic attitudes in the Chinese context. In the analysis, treatment assignment is determined by students’ college entry years and is thus naturally selected by subjects’ ages. Using a sample of Chinese college students collected in Beijing in 2009, this study finds that college education actually reduces undergraduates’ motivation for political development in China but promotes political loyalty to the authoritarian government. Further hypothesis tests explain these interesting findings from two perspectives. The first is related to the complexity of politics: as college students progress over time, they increasingly realize the complexity of political reform in China’s authoritarian regime and prefer to stay away from politics. The second is related to students’ career opportunities: as students approach graduation, they are immersed in job hunting and have a reduced interest in political freedom.
Keywords: China, college education, democratic attitudes, regression discontinuity
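As a rough illustration of how a fuzzy regression discontinuity effect is typically estimated, the sketch below implements two-stage least squares by hand, instrumenting college exposure with the entry-year cutoff; the simulated data, variable names, and functional form are assumptions for demonstration and are not the study's data or code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Running variable: age relative to the entry-year cutoff (centered at 0).
age = rng.uniform(-3, 3, n)
above_cutoff = (age >= 0).astype(float)            # instrument Z
# Fuzzy design: crossing the cutoff raises, but does not fully determine, treatment.
college = (rng.uniform(size=n) < 0.2 + 0.6 * above_cutoff).astype(float)
attitude = 1.0 - 0.5 * college + 0.1 * age + rng.normal(0, 1, n)   # outcome Y

def two_stage_least_squares(y, d, z, x):
    """Manual 2SLS: first stage D ~ Z + X, second stage Y ~ D_hat + X."""
    X1 = np.column_stack([np.ones_like(z), z, x])          # first-stage regressors
    d_hat = X1 @ np.linalg.lstsq(X1, d, rcond=None)[0]
    X2 = np.column_stack([np.ones_like(d_hat), d_hat, x])  # second-stage regressors
    beta = np.linalg.lstsq(X2, y, rcond=None)[0]
    return beta[1]                                          # local treatment effect

print("estimated effect of college:", two_stage_least_squares(attitude, college, above_cutoff, age))
```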
Procedia PDF Downloads 351
2177 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep each user's rate-outage probability below a given threshold level. Such rate-outage constraints present significant analytical challenges. Many probabilistic methods have been used to solve the transmit optimization problem under imperfect CSI. Here, a decomposition-based large deviation inequality method and a Bernstein-type inequality convex restriction method are used to handle the optimization problem under imperfect CSI. These methods are used to achieve improved output quality and lower complexity, and they provide safe, tractable approximations of the original rate-outage constraints. Based on these implementations, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information
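For context, one widely used form of the Bernstein-type inequality restriction can be stated as follows, under the assumption that the CSI error vector is standard circularly symmetric complex Gaussian; this is quoted from the general literature rather than from the paper, so the exact formulation used by the authors may differ. The chance constraint
\[
\Pr\big\{\mathbf{e}^{H}\mathbf{Q}\mathbf{e} + 2\,\mathrm{Re}\{\mathbf{e}^{H}\mathbf{r}\} + s \ge 0\big\} \ge 1-\rho,
\qquad \mathbf{e}\sim\mathcal{CN}(\mathbf{0},\mathbf{I}),
\]
is implied by the convex system
\[
\mathrm{Tr}(\mathbf{Q}) - \sqrt{2\ln(1/\rho)}\,x - \ln(1/\rho)\,y + s \ge 0,\qquad
\sqrt{\|\mathbf{Q}\|_{F}^{2} + 2\|\mathbf{r}\|^{2}} \le x,\qquad
y\,\mathbf{I} + \mathbf{Q} \succeq \mathbf{0},\quad y \ge 0,
\]
in the slack variables \(x, y\); replacing each user's rate-outage constraint by such a system is what yields the "safe tractable approximation" mentioned above.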
Procedia PDF Downloads 814
2176 Capturing Public Voices: The Role of Social Media in Heritage Management
Authors: Mahda Foroughi, Bruno de Anderade, Ana Pereira Roders
Abstract:
Social media platforms have been increasingly used by locals and tourists to express their opinions about buildings, cities, and built heritage in particular. Most recently, scholars have been using social media to conduct innovative research on built heritage and heritage management. Still, the application of artificial intelligence (AI) methods to analyze social media data for heritage management is seldom explored. This paper investigates the potential of short texts (sentences and hashtags) shared through social media as a data source and artificial intelligence methods for data analysis for revealing the cultural significance (values and attributes) of built heritage. The city of Yazd, Iran, was taken as a case study, with a particular focus on windcatchers, key attributes conveying outstanding universal values, as inscribed on the UNESCO World Heritage List. This paper has three subsequent phases: 1) state of the art on the intersection of public participation in heritage management and social media research; 2) methodology of data collection and data analysis related to coding people's voices from Instagram and Twitter into values of windcatchers over the last ten-years; 3) preliminary findings on the comparison between opinions of locals and tourists, sentiment analysis, and its association with the values and attributes of windcatchers. Results indicate that the age value is recognized as the most important value by all interest groups, while the political value is the least acknowledged. Besides, the negative sentiments are scarcely reflected (e.g., critiques) in social media. Results confirm the potential of social media for heritage management in terms of (de)coding and measuring the cultural significance of built heritage for windcatchers in Yazd. The methodology developed in this paper can be applied to other attributes in Yazd and also to other case studies.Keywords: social media, artificial intelligence, public participation, cultural significance, heritage, sentiment analysis
Procedia PDF Downloads 117
2175 Collaboration During Planning and Reviewing in Writing: Effects on L2 Writing
Authors: Amal Sellami, Ahlem Ammar
Abstract:
Writing is acknowledged to be a cognitively demanding and complex task. Indeed, the writing process is composed of three iterative sub-processes, namely planning, translating (writing), and reviewing. Not only do second or foreign language learners need to write according to this process, but they also need to respect the norms and rules of language and writing in the text to be produced. Accordingly, researchers have suggested approaching writing as a collaborative task in order to alleviate its complexity. Consequently, collaboration has been implemented during the whole writing process or only during planning or reviewing. Researchers report that implementing collaboration during the whole process might be demanding in terms of time in comparison to individual writing tasks. Consequently, because of time constraints, teachers may avoid it. For this reason, it might be pedagogically more realistic to limit collaboration to one of the writing sub-processes (i.e., planning or reviewing). However, previous research implementing collaboration in planning or reviewing is limited and fails to explore the effects of these conditions on the written text. Consequently, the present study examines the effects of collaboration in planning and collaboration in reviewing on the written text. To reach this objective, quantitative as well as qualitative methods are deployed to examine the written texts holistically and in terms of fluency, complexity, and accuracy. Participants of the study include 4 pairs in each group (n=8). They participated in two experimental conditions, which are: (1) collaborative planning followed by individual writing and individual reviewing, and (2) individual planning followed by individual writing and collaborative reviewing. The comparative research findings indicate that while collaborative planning resulted in better overall text quality (specifically, better content and organization ratings), better fluency, better complexity, and fewer lexical errors, collaborative reviewing produced better accuracy and fewer syntactic and mechanical errors. The discussion of the findings suggests the need to conduct more comparative research in order to further explore the effects of collaboration in planning or in reviewing. Pedagogical implications of the current study include advising teachers to choose between implementing collaboration in planning or in reviewing depending on their students' needs and what they need to improve.
Keywords: collaboration, writing, collaborative planning, collaborative reviewing
Procedia PDF Downloads 99
2174 Simulation of Hamming Coding and Decoding for Microcontroller Radiation Hardening
Authors: Rehab I. Abdul Rahman, Mazhar B. Tayel
Abstract:
This paper presents a method of hardening the 8051 microcontroller that is able to assure reliable operation in the presence of bit flips caused by radiation. Aiming to avoid such faults in the 8051 microcontroller, Hamming code protection was applied to its SRAM memory and registers. VHDL code and its simulation were used to implement and verify this Hamming code protection.
Keywords: radiation, hardening, bitflip, hamming
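The paper's protection is written in VHDL; as a language-neutral illustration of the underlying single-error-correcting idea, here is a short Python sketch of a Hamming(12,8) code applied to one 8-bit memory word. The bit layout is one standard convention and not necessarily the one used in the paper.

```python
def hamming_encode(byte):
    """Encode an 8-bit value into a 12-bit Hamming codeword (even parity)."""
    assert 0 <= byte <= 0xFF
    data_positions = [3, 5, 6, 7, 9, 10, 11, 12]
    code = [0] * 13                                  # positions 1..12 used, index 0 ignored
    for i, pos in enumerate(data_positions):
        code[pos] = (byte >> i) & 1
    for p in (1, 2, 4, 8):                           # parity bit p covers positions with bit p set
        code[p] = sum(code[i] for i in range(1, 13) if i & p) % 2
    return code[1:]

def hamming_decode(bits):
    """Correct up to one flipped bit and return the original byte."""
    code = [0] + list(bits)
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(code[i] for i in range(1, 13) if i & p) % 2:
            syndrome += p
    if syndrome:                                     # non-zero syndrome = position of the flipped bit
        code[syndrome] ^= 1
    data_positions = [3, 5, 6, 7, 9, 10, 11, 12]
    return sum(code[pos] << i for i, pos in enumerate(data_positions))

word = 0xA7
encoded = hamming_encode(word)
encoded[4] ^= 1                                      # simulate a radiation-induced bit flip
assert hamming_decode(encoded) == word
```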
Procedia PDF Downloads 502
2173 Sensitive Detection of Nano-Scale Vibrations by the Metal-Coated Fiber Tip at the Liquid-Air Interface
Authors: A. J. Babajanyan, T. A. Abrahamyan, H. A. Minasyan, K. V. Nerkararyan
Abstract:
Optical radiation emitted from a metal-coated fiber tip apex at a liquid-air interface was measured. The intensity of the output radiation depended strongly on the position of the tip relative to the liquid-air interface and varied with surface fluctuations. This phenomenon permits in-situ, real-time investigation of nanometric vibrations of the liquid surface and provides a basis for the development of ultrasensitive detectors of vibrations of various origins. The described method can be used for the detection of weak seismic vibrations.
Keywords: fiber-tip, liquid-air interface, nano vibration, opto-mechanical sensor
Procedia PDF Downloads 484
2172 L1-Convergence of Modified Trigonometric Sums
Authors: Sandeep Kaur Chouhan, Jatinderdeep Kaur, S. S. Bhatia
Abstract:
The existence of sine and cosine series as Fourier series, and their L1-convergence, are among the difficult questions in the theory of convergence of trigonometric series in the L1-metric norm. In the literature available so far, various authors have studied the L1-convergence of cosine and sine trigonometric series with special coefficients. In this paper, we present modified cosine and sine sums, and a criterion for the L1-convergence of these modified sums is obtained. Also, necessary and sufficient conditions for the L1-convergence of the cosine and sine series are deduced as corollaries.
Keywords: conjugate Dirichlet kernel, Dirichlet kernel, L1-convergence, modified sums
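For orientation, the standard objects named in the keywords can be written out as follows; these are textbook definitions, not the paper's specific modified sums:
\[
f(x) \sim \frac{a_0}{2} + \sum_{k=1}^{\infty} a_k \cos kx,\qquad
g(x) \sim \sum_{k=1}^{\infty} b_k \sin kx,
\]
\[
D_n(x) = \frac{1}{2} + \sum_{k=1}^{n}\cos kx,\qquad
\widetilde{D}_n(x) = \sum_{k=1}^{n}\sin kx,
\]
and L1-convergence of the partial sums \(S_n\) means \(\int_{0}^{\pi}\lvert f(x)-S_n(x)\rvert\,dx \to 0\) as \(n\to\infty\). Modified sums are typically obtained by perturbing \(S_n\) with terms involving the Dirichlet or conjugate Dirichlet kernel so that convergence in the \(L^1\)-metric holds under weaker conditions on the coefficients.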
Procedia PDF Downloads 355
2171 Second-Order Complex Systems: Case Studies of Autonomy and Free Will
Authors: Eric Sanchis
Abstract:
Although there does not exist a definitive consensus on a precise definition of a complex system, it is generally considered that a system is complex by nature. The presented work illustrates a different point of view: a system becomes complex only with regard to the question posed to it, i.e., with regard to the problem which has to be solved. A complex system is a couple (question, object). Because the number of questions posed to a given object can be potentially substantial, complexity does not present a uniform face. Two types of complex systems are clearly identified: first-order complex systems and second-order complex systems. First-order complex systems physically exist. They are well-known because they have been studied by the scientific community for a long time. In second-order complex systems, complexity results from the system composition and its articulation that are partially unknown. For some of these systems, there is no evidence of their existence. Vagueness is the keyword characterizing this kind of systems. Autonomy and free will, two mental productions of the human cognitive system, can be identified as second-order complex systems. A classification based on the properties structure makes it possible to discriminate complex properties from the others and to model this kind of second order complex systems. The final outcome is an implementable synthetic property that distinguishes the solid aspects of the actual property from those that are uncertain.Keywords: autonomy, free will, synthetic property, vaporous complex systems
Procedia PDF Downloads 205
2170 The Impact of Social Emotional Learning and Conflict Resolution Skills
Authors: Paula Smith
Abstract:
During adolescence, many students engage in maladaptive behaviors that may reflect a lack of knowledge in social-emotional skills. Oftentimes these behaviors lead to conflicts and school-related disciplinary actions. Therefore, conflict resolution skills are vital for academic and social success. Conflict resolution is one component of a social-emotional learning (SEL) pedagogy that can effectively reduce discipline referrals and build students' social-emotional capacity. This action research study utilized a researcher-developed virtual SEL curriculum to provide instruction to eight adolescent students in an urban school in New York City with the goal of fostering their emotional intelligence (EI), reducing aggressive behaviors, and supporting instruction beyond the core academic content areas. Adolescent development, EI, and SEL frameworks were used to formulate this curriculum. Using a qualitative approach, this study inquired into how effectively participants responded to SEL instruction offered in virtual, Zoom-based workshops. Data included recorded workshop sessions, researcher field notes, and Zoom transcripts. Descriptive analysis involved manual coding/re-coding of transcripts to understand participants’ lived experience with conflict and the ideas presented in the workshops. Findings highlighted several themes and cultural norms that provided insight into adolescents' lived experiences and helped explain their past ideas about conflict. Findings also revealed participants' perspectives about the importance of SEL skills. This study illustrates one example of how evidence-based SEL programs might offer adolescents an opportunity to share their lived experiences. Programs such as this also address both individual and group needs, enabling practitioners to help students develop practical conflict resolution skills.Keywords: social, emotional, learning, conflict, resolution
Procedia PDF Downloads 17
2169 Prediction of Mental Health: Heuristic Subjective Well-Being Model on Perceived Stress Scale
Authors: Ahmet Karakuş, Akif Can Kilic, Emre Alptekin
Abstract:
A growing number of studies have been conducted to determine how well-being may be predicted using well-designed models. It is necessary to investigate the backgrounds of features in order to construct a viable Subjective Well-Being (SWB) model. We have picked the suitable variables from the literature on SWB that are acceptable for real-world data instructions. The goal of this work is to evaluate the model by feeding it with SWB characteristics and then categorizing the stress levels using machine learning methods to see how well it performs on a real dataset. Despite the fact that it is a multiclass classification issue, we have achieved significant metric scores, which may be taken into account for a specific task.Keywords: machine learning, multiclassification problem, subjective well-being, perceived stress scale
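A minimal sketch of the kind of multiclass pipeline described above, using scikit-learn; the features, the three-level stress coding, and the classifier choice are assumptions for illustration, not the study's actual variables or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 600

# Hypothetical SWB-style features (e.g., life satisfaction, positive/negative affect, social support).
X = rng.normal(size=(n, 4))
# Hypothetical Perceived Stress Scale levels coded as 0 = low, 1 = moderate, 2 = high.
y = np.digitize(-X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 1, n), bins=[-0.5, 0.5])

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Per-class precision/recall/F1 is the natural way to report a multiclass problem like this.
print(classification_report(y_test, clf.predict(X_test)))
```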
Procedia PDF Downloads 133
2168 Electron Density Discrepancy Analysis of Energy Metabolism Coenzymes
Authors: Alan Luo, Hunter N. B. Moseley
Abstract:
Many macromolecular structure entries in the Protein Data Bank (PDB) have a range of regional (localized) quality issues, be it derived from x-ray crystallography, Nuclear Magnetic Resonance (NMR) spectroscopy, or other experimental approaches. However, most PDB entries are judged by global quality metrics like R-factor, R-free, and resolution for x-ray crystallography or backbone phi-psi distribution statistics and average restraint violations for NMR. Regional quality is often ignored when PDB entries are re-used for a variety of structurally based analyses. The binding of ligands, especially ligands involved in energy metabolism, is of particular interest in many structurally focused protein studies. Using a regional quality metric that provides chemically interpretable information from electron density maps, a significant number of outliers in regional structural quality was detected across x-ray crystallographic PDB entries for proteins bound to biochemically critical ligands. In this study, a series of analyses was performed to evaluate both specific and general potential factors that could promote these outliers. In particular, these potential factors were the minimum distance to a metal ion, the minimum distance to a crystal contact, and the isotropic atomic b-factor. To evaluate these potential factors, Fisher’s exact tests were performed, using regional quality criteria of outlier (top 1%, 2.5%, 5%, or 10%) versus non-outlier compared to a potential factor metric above versus below a certain outlier cutoff. The results revealed a consistent general effect from region-specific normalized b-factors but no specific effect from metal ion contact distances and only a very weak effect from crystal contact distance as compared to the b-factor results. These findings indicate that no single specific potential factor explains a majority of the outlier ligand-bound regions, implying that human error is likely as important as these other factors. Thus, all factors, including human error, should be considered when regions of low structural quality are detected. Also, the downstream re-use of protein structures for studying ligand-bound conformations should screen the regional quality of the binding sites. Doing so prevents misinterpretation due to the presence of structural uncertainty or flaws in regions of interest.Keywords: biomacromolecular structure, coenzyme, electron density discrepancy analysis, x-ray crystallography
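As an illustration of the test described above, the sketch below runs Fisher's exact test on a 2×2 table of regional-quality outliers versus a dichotomized potential factor (e.g., a normalized b-factor cutoff); the counts are invented for demonstration and are not the study's data.

```python
from scipy.stats import fisher_exact

# Rows: outlier vs non-outlier ligand-bound region (e.g., worst 5% by the regional quality metric).
# Columns: potential factor above vs below its cutoff (e.g., region-specific normalized b-factor).
table = [[40, 10],    # outlier regions:      40 above cutoff, 10 below
         [300, 650]]  # non-outlier regions: 300 above cutoff, 650 below

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
# A small p-value here would indicate an association between the factor and regional outliers,
# which is how the b-factor effect described in the abstract would show up.
```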
Procedia PDF Downloads 131
2167 Analysis of NMDA Receptor 2B Subunit Gene (GRIN2B) mRNA Expression in the Peripheral Blood Mononuclear Cells of Alzheimer's Disease Patients
Authors: Ali̇ Bayram, Semih Dalkilic, Remzi Yigiter
Abstract:
The N-methyl-D-aspartate (NMDA) receptor is a subtype of glutamate receptor and plays a pivotal role in learning, memory, neuronal plasticity, neurotoxicity and synaptic mechanisms. Animal experiments have suggested that glutamate-induced excitotoxic injury and NMDA receptor blockade lead to amnesia and other neurodegenerative diseases, including Alzheimer’s disease (AD), Huntington’s disease and amyotrophic lateral sclerosis. The aim of this study is to investigate the association between the expression level of the NMDA receptor coding gene GRIN2B and Alzheimer’s disease. The study was approved by the local ethics committees and was conducted according to the principles of the Declaration of Helsinki and the guidelines for Good Clinical Practice. Peripheral blood was collected from 50 patients diagnosed with AD and 49 healthy control individuals. Total RNA was isolated with the RNeasy midi kit (Qiagen) according to the manufacturer’s instructions. After checking RNA quality and quantity with a spectrophotometer, GRIN2B expression levels were detected by quantitative real-time PCR (qRT-PCR). Statistical analyses were performed; the difference between the two groups was compared with the Mann-Whitney U test in GraphPad InStat with a 95% confidence interval and p < 0.05. After statistical analyses, we determined that GRIN2B expression levels were downregulated in the AD patient group with respect to the control group, although the expression level of this gene showed high variability within each group. In this study, we determined that the expression level of the NMDA receptor coding gene GRIN2B was downregulated in AD patients when compared with healthy control individuals. According to our results, we speculate that the GRIN2B expression level is associated with AD, but it is necessary to validate these results with a bigger sample size.
Keywords: Alzheimer’s disease, N-methyl-d-aspartate receptor, NR2B, GRIN2B, mRNA expression, RT-PCR
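A brief sketch of one common way qPCR expression data of this kind are quantified and compared (the 2^-ΔΔCt convention plus a Mann-Whitney U test); the Ct values below are fabricated for illustration, and the abstract does not state which relative-quantification formula the authors used.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical Ct values (target = GRIN2B, ref = a housekeeping gene) for a few samples.
ct_target_ad   = np.array([27.1, 26.8, 27.5, 27.9])
ct_ref_ad      = np.array([18.2, 18.0, 18.4, 18.3])
ct_target_ctrl = np.array([25.9, 26.2, 25.7, 26.0])
ct_ref_ctrl    = np.array([18.1, 18.3, 18.0, 18.2])

# Delta-Ct per sample, then fold change relative to the mean control delta-Ct (2^-ddCt).
d_ad, d_ctrl = ct_target_ad - ct_ref_ad, ct_target_ctrl - ct_ref_ctrl
fold_change_ad = 2.0 ** -(d_ad - d_ctrl.mean())

# Non-parametric comparison of the two groups, as in the abstract.
stat, p = mannwhitneyu(d_ad, d_ctrl, alternative="two-sided")
print(f"mean fold change (AD vs control) = {fold_change_ad.mean():.2f}, Mann-Whitney p = {p:.3f}")
```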
Procedia PDF Downloads 394
2166 DNA Nano Wires: A Charge Transfer Approach
Authors: S. Behnia, S. Fathizadeh, A. Akhshani
Abstract:
In recent decades, DNA has attracted increasing interest for potential technological applications that are not directly related to its coding for functional proteins, i.e., to the expression of genetic information. One of the most interesting applications of DNA is the construction of nanostructures of high complexity and the design of functional nanostructures for nanoelectronic devices, nanosensors and nanocircuits. In this field, DNA is of fundamental interest for the development of DNA-based molecular technologies, as it possesses ideal structural and molecular recognition properties for use in self-assembling nanodevices with a definite molecular architecture. Also, the robust, one-dimensional, flexible structure of DNA can be used to design electronic devices, serving as a wire, transistor switch, or rectifier depending on its electronic properties. In order to understand the mechanism of charge transport along DNA sequences, numerous studies have been carried out. In this regard, the conductivity properties of the DNA molecule can be investigated in a simple but chemically specific approach that is intimately related to the Su-Schrieffer-Heeger (SSH) model. In the SSH model, the dependence of the non-diagonal matrix elements on the intersite displacements is taken into account. In this approach, the coupling between the charge and the lattice deformation is along the helix. This model is a tight-binding linear nanoscale chain originally established to describe conductivity phenomena in doped polyacetylene. It is based on the assumption of a classical harmonic interaction between sites, which is linearly coupled to a tight-binding Hamiltonian. In this work, the Hamiltonian and the corresponding equations of motion are nonlinear and highly sensitive to initial conditions. We have therefore moved toward nonlinear dynamics and phase-space analysis. Nonlinear dynamics and chaos theory, regardless of any approximation, could open new horizons for understanding the conductivity mechanism in DNA. For a detailed study, we have examined the current flowing in DNA and investigated the characteristic I-V diagram. As a result, it is shown that there are (quasi-)ohmic regions in the I-V diagram. On the other hand, regions with negative differential resistance (NDR) are also detectable in the diagram.
Keywords: DNA conductivity, Landauer resistance, negative differential resistance, Chaos theory, mean Lyapunov exponent
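For orientation, the SSH-type Hamiltonian referred to above has the standard textbook form below, written for a single charge on a chain of N sites; the symbols are chosen here and are not transcribed from the paper:
\[
H=\sum_{n}\left[\frac{p_{n}^{2}}{2M}+\frac{K}{2}\,(u_{n+1}-u_{n})^{2}\right]
-\sum_{n}\big[t_{0}-\alpha\,(u_{n+1}-u_{n})\big]\left(c_{n+1}^{\dagger}c_{n}+c_{n}^{\dagger}c_{n+1}\right),
\]
where \(u_n\) and \(p_n\) are the displacement and momentum of site \(n\), \(K\) is the harmonic intersite coupling, \(t_0\) is the bare hopping integral, and \(\alpha\) is the electron-lattice coupling that makes the off-diagonal (hopping) matrix elements depend on the intersite displacements, which is the feature of the model emphasized in the abstract.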
Procedia PDF Downloads 426
2165 Verb Bias in Mandarin: The Corpus Based Study of Children
Authors: Jou-An Chung
Abstract:
The purpose of this study is to investigate the verb bias of Mandarin verbs in children’s reading materials and to provide criteria for categorization. Verb bias varies cross-linguistically. As Mandarin and English are typologically different, this study hopes to shed light on Mandarin verb bias with the use of a corpus and to provide thorough and detailed criteria for analysis. Moreover, this study focuses on children’s reading materials, since these are significant for understanding children’s sentence processing. Therefore, investigating the verb bias of Mandarin verbs in children’s reading materials is also an important issue and can provide further insights into children’s sentence processing. A small corpus was built for this study, consisting of a collection of school textbooks and the Mandarin Daily News for children. The files were then segmented and POS tagged with jiebaR (Chinese segmentation with R). For ease of analysis, one-character verbs and intransitive verbs were excluded beforehand. A total of 20 high-frequency verbs were hand-coded and further categorized into one of three types, namely the DO type, the SC type, and the other category. If the frequency of the other category exceeds the threshold of 25%, the verb is excluded from the study. The results show that 10 verbs are direct-object-bias verbs and six verbs are sentential-complement-bias verbs. A paired t-test was done to assess statistical significance (p = 0.0001062 for DO-bias verbs, p = 0.001149 for SC-bias verbs). The results show that in children’s reading materials the DO-biased verbs are used more than the SC-biased verbs, since the simplest sentence structures are easier for children’s sentence comprehension and processing. In sum, this study not only discussed verb bias in children’s reading materials but also provided basic coding criteria for verb bias analysis in Mandarin and underscored the role of context.
Keywords: corpus linguistics, verb bias, child language, psycholinguistics
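The coding criteria described above (DO vs. SC frame counts with a 25% exclusion threshold for the other category) can be expressed compactly as in the sketch below; the verb names and counts are invented, and the simple majority rule used to assign bias is an assumption, since the abstract does not spell out its decision rule beyond the 25% threshold.

```python
def classify_verb_bias(do_count, sc_count, other_count, other_threshold=0.25):
    """Label a verb as DO-biased, SC-biased, or excluded, following the stated criteria."""
    total = do_count + sc_count + other_count
    if total == 0 or other_count / total > other_threshold:
        return "excluded"                      # too many 'other' frames (over 25%)
    return "DO-bias" if do_count > sc_count else "SC-bias"

# Hypothetical frame counts per verb extracted from a segmented, POS-tagged corpus.
verbs = {"shuo": (12, 85, 10), "kanjian": (90, 8, 6), "xiangxin": (30, 28, 40)}
for verb, counts in verbs.items():
    print(verb, classify_verb_bias(*counts))
```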
Procedia PDF Downloads 293
2164 Factors Influencing the Adoption of Social Media as a Medium of Public Service Broadcasting
Authors: Seyed Mohammadbagher Jafari, Izmeera Shiham, Masoud Arianfar
Abstract:
The increased usage of Social media for different uses in turn makes it important to develop an understanding of users and their attitudes toward these sites, and moreover, the uses of such sites in a broader perspective such as broadcasting. This quantitative study addressed the problem of factors influencing the adoption of social media as a medium of public service broadcasting in the Republic of Maldives. These powerful and increasingly usable tools, accompanied by large public social media datasets, are bringing in a golden age of social science by empowering researchers to measure social behavior on a scale never before possible. This was conducted by exploring social responses on the use of social media. Research model was developed based on the previous models such as TAM, DOI and Trust combined model. It evaluates the influence of perceived ease of use, perceived usefulness, trust, complexity, compatibility and relative advantage influence on the adoption of social Media. The model was tested on a sample of 365 Maldivian people using survey method via questionnaire. The result showed that perceived usefulness, trust, relative advantage and complexity would highly influence the adoption of social media.Keywords: adoption, broadcasting, maldives, social media
Procedia PDF Downloads 484
2163 Tip-Apex Distance as a Long-Term Risk Factor for Hospital Readmission Following Intramedullary Fixation of Intertrochanteric Fractures
Authors: Brandon Knopp, Matthew Harris
Abstract:
Purpose: Tip-apex distance (TAD) has long been discussed as a metric for determining risk of failure in the fixation of peritrochanteric fractures. TAD measurements over 25 millimeters (mm) have been associated with higher rates of screw cut out and other complications in the first several months after surgery. However, there is limited evidence for the efficacy of this measurement in predicting the long-term risk of negative outcomes following hip fixation surgery. The purpose of our study was to investigate risk factors including TAD for hospital readmission, loss of pre-injury ambulation and development of complications within 1 year after hip fixation surgery. Methods: A retrospective review of proximal hip fractures treated with single screw intramedullary devices between 2016 and 2020 was performed at a 327-bed regional medical center. Patients included had a postoperative follow-up of at least 12 months or surgery-related complications developing within that time. Results: 44 of the 67 patients in this study met the inclusion criteria with adequate follow-up post-surgery. There was a total of 10 males (22.7%) and 34 females (77.3%) meeting inclusion criteria with a mean age of 82.1 (± 12.3) at the time of surgery. The average TAD in our study population was 19.57mm and the average 1-year readmission rate was 15.9%. 3 out of 6 patients (50%) with a TAD > 25mm were readmitted within one year due to surgery-related complications. In contrast, 3 out of 38 patients (7.9%) with a TAD < 25mm were readmitted within one year due to surgery-related complications (p=0.0254). Individual TAD measurements, averaging 22.05mm in patients readmitted within 1 year of surgery and 19.18mm in patients not readmitted within 1 year of surgery, were not significantly different between the two groups (p=0.2113). Conclusions: Our data indicate a significant improvement in hospital readmission rates up to one year after hip fixation surgery in patients with a TAD < 25mm with a decrease in readmissions of over 40% (50% vs 7.9%). This result builds upon past investigations by extending the follow-up time to 1 year after surgery and utilizing hospital readmissions as a metric for surgical success. With the well-documented physical and financial costs of hospital readmission after hip surgery, our study highlights a reduction of TAD < 25mm as an effective method of improving patient outcomes and reducing financial costs to patients and medical institutions. No relationship was found between TAD measurements and secondary outcomes, including loss of pre-injury ambulation and development of complications.Keywords: hip fractures, hip reductions, readmission rates, open reduction internal fixation
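For readers unfamiliar with the metric, the tip-apex distance is conventionally computed from the anteroposterior (AP) and lateral radiographs with a magnification correction based on the known diameter of the lag screw; the formula below is the standard (Baumgaertner-style) definition and is not reproduced from this study:
\[
\mathrm{TAD} \;=\; X_{\mathrm{ap}}\,\frac{D_{\mathrm{true}}}{D_{\mathrm{ap}}} \;+\; X_{\mathrm{lat}}\,\frac{D_{\mathrm{true}}}{D_{\mathrm{lat}}},
\]
where \(X_{\mathrm{ap}}\) and \(X_{\mathrm{lat}}\) are the measured distances from the screw tip to the apex of the femoral head on the two views, \(D_{\mathrm{true}}\) is the actual lag-screw diameter, and \(D_{\mathrm{ap}}\), \(D_{\mathrm{lat}}\) are the screw diameters measured on the respective radiographs. The 25 mm figure used throughout the abstract is the usual cut-off applied to this combined value.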
Procedia PDF Downloads 145