Search results for: classical reasoning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1283

1043 Geospatial Land Suitability Modeling for Biofuel Crop Using AHP

Authors: Naruemon Phongaksorn

Abstract:

Biofuel consumption has increased significantly over the past decade, resulting in growing demand for agricultural land for biofuel feedstocks. However, biofuel feedstocks already suffer from low productivity owing to inappropriate agricultural practices that disregard the suitability of crop land. This research evaluates land suitability for a biofuel crop, cassava, in Chachoengsao province, Thailand, using GIS-integrated Analytic Hierarchy Processing (AHP), a method widely accepted for land use planning. The objective of this study is to compare the AHP method with the most limiting group of land characteristics method (the classical approach). The reliability of the land evaluation was tested against crop performance assessed by field investigation in 2015. In addition to the socio-economic land suitability, the expected availability of raw materials for biofuel production to meet local biofuel demand was also estimated. The results showed that AHP could classify and map physical land suitability with 10% higher overall accuracy than the classical approach. Chachoengsao province showed high and moderate socio-economic land suitability for cassava. Conditions in the province were also favorable for cassava plantation, as the expected supply of raw material matched the ethanol plant capacity of the province. GIS-integrated AHP for biofuel crop land suitability evaluation appears to be a practical way of sustainably meeting biofuel production demand.
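As a minimal sketch of the AHP weighting step described above: a pairwise comparison matrix is reduced to a priority vector and checked for consistency. The 3x3 matrix and the three criteria are hypothetical illustrations, not values from the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three land-suitability
# criteria (say soil, slope, rainfall); entries use Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector via the geometric-mean (row) approximation.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio CR = CI / RI, with CI = (lambda_max - n) / (n - 1).
n = A.shape[0]
lam_max = (A @ weights / weights).mean()
CI = (lam_max - n) / (n - 1)
RI = 0.58  # Saaty's random index for n = 3
CR = CI / RI

print(weights, CR)  # CR < 0.1 means the judgments are acceptably consistent
```

In a GIS-integrated workflow, these weights would then multiply the rasterized criterion scores cell by cell to produce the suitability map.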

Keywords: Analytic Hierarchy Processing (AHP), Cassava, Geographic Information Systems, Land suitability

Procedia PDF Downloads 185
1042 An Alternative to Problem-Based Learning in a Post-Graduate Healthcare Professional Programme

Authors: Brogan Guest, Amy Donaldson-Perrott

Abstract:

The Master’s of Physician Associate Studies (MPAS) programme at St George’s, University of London (SGUL), is an intensive two-year course that trains students to become physician associates (PAs). PAs are generalist healthcare providers who work in primary and secondary care across the UK. PA programmes face the difficult task of preparing students to become safe medical providers in two short years. Our goal is to teach students to develop clinical reasoning early in their studies, and historically this has been done predominantly through problem-based learning (PBL). We have had increasing concern about student engagement in PBL and difficulty recruiting facilitators to maintain the low student-to-facilitator ratio PBL requires. To address this issue, we created ‘Clinical Application of Anatomy and Physiology (CAAP)’. These peer-led, interactive, problem-based, small-group sessions were designed to develop students’ clinical reasoning skills, drawing on the concept of Team-Based Learning (TBL). Students were divided into small groups, and each group completed a pre-session quiz of difficult questions devised to assess students’ application of medical knowledge. The quiz was completed in small groups without access to external resources. After the quiz, students worked through a series of open-ended clinical tasks using all available resources. They worked at their own pace, and the sessions were peer-led rather than facilitator-driven: for a group of 35 students, two facilitators observed. The sessions used infinite-canvas whiteboard software. Each group member was encouraged to participate actively and to work with the group to complete the 15-20 tasks. Each session ran for 2 hours and concluded with a post-session quiz identical to the pre-session quiz.
We obtained subjective feedback from students on their experience with CAAP and evaluated the objective benefit of the sessions through the quiz results. Qualitative feedback was generally positive: students felt the sessions increased engagement, clinical understanding, and confidence. They found the small-group aspect beneficial and the technology intuitive and easy to use. They also valued building a resource for their future revision, something unique to CAAP compared to PBL, which our students participate in weekly. Preliminary quiz results showed improvement from pre- to post-session; further statistical analysis will occur once all sessions are complete (the final session runs in December 2022) to determine significance. As a post-graduate healthcare professional programme, we have a strong focus on self-directed learning. Whilst PBL has been a mainstay of our curriculum since its inception, there are limitations and concerns about its future in view of student engagement and facilitator availability. Whilst CAAP is not TBL, it draws on the benefits of peer-led, small-group work with pre- and post-session team-based quizzes. The pilot sessions have shown that students are engaged by CAAP and can make significant progress in clinical reasoning in a short amount of time, and that this can be achieved with a high student-to-facilitator ratio.

Keywords: problem based learning, team based learning, active learning, peer-to-peer teaching, engagement

Procedia PDF Downloads 71
1041 Blade Runner and Slavery in the 21st Century

Authors: Bülent Diken

Abstract:

This paper sets Ridley Scott’s original film Blade Runner (1982) against Denis Villeneuve’s Blade Runner 2049 (2017) in order to analyse both films with respect to the new configurations of slavery in the 21st century. Both Blade Runner films present a de-politicized society that oscillates between two extremes: the spectral (the eye, optics, digital communications) and the biopolitical (the body, haptics). On the one hand, recognizing the subject only as a sign, the society of the spectacle registers, identifies, produces and reproduces the subject as a code. At the same time, the subject is constantly reduced to a naked body, to bare life, for biometric technologies to scan as a biological body or body parts. Being simultaneously a pure code (word without body) and an instrument-slave (body without word), the replicants are thus the paradigmatic subjects of this society. The paper focuses first on the similarity: both films depict a relationship between masters and slaves, that is, a despotic relationship, in which the master uses the (body of the) slave as an instrument, as an extension of his own body. The first Blade Runner (set in 2019) frames the despotic relation in this classical way through its triangulation with the economy (the Tyrell Corporation) and the slave-replicants’ dissent (their rejection of being reduced to mere instruments). In a counter-classical approach, Blade Runner 2049 shifts the focus to another triangulation: despotism, the economy (the Wallace Corporation) and consent (of replicants who no longer perceive themselves as slaves).

Keywords: Blade Runner, the spectacle, bio-politics, slavery, instrumentalisation

Procedia PDF Downloads 59
1040 Induced Chemistry for Dissociative Electron Attachment to Focused Electron Beam Induced Deposition Precursors Based on Ti, Si and Fe Metal Elements

Authors: Maria Pintea, Nigel Mason

Abstract:

Induced chemistry is one of the newest pathways in the nanotechnology field, with applications in focused electron beam induced processes for the deposition of nm-scale structures. Si(OPr)₄ and Ti(OEt)₄ are two precursors that have not been extensively researched, though they are highly sought after for semiconductor and medical applications; both compounds are good candidates for FEBIP and are the subject of velocity slice map imaging analysis for deposition purposes, offering information on kinetic energies, fragmentation channels, and angular distributions. Velocity slice map imaging is a technique used to characterize the molecular dynamics of the molecule and the fragmentation channels that result from induced chemistry. To support the gas-phase analysis, Meso-Bio-Nano simulations of irradiation dynamics are employed, with final results on Fe(CO)₅ deposited on various substrates. The software can run large-scale simulations of complex biomolecular, nano- and mesoscopic systems, with applications to thermo-mechanical DNA damage, complex materials, gases, nanoparticles for cancer research, and deposition for nanotechnology, using a large library of classical potentials, many-body force fields, and molecular force fields from classical molecular dynamics.

Keywords: focused electron beam induced deposition, FEBID, induced chemistry, molecular dynamics, velocity slice map imaging

Procedia PDF Downloads 99
1039 Composite Laminate and Thin-Walled Beam Correlations for Aircraft Wing Box Design

Authors: S. J. M. Mohd Saleh, S. Guo

Abstract:

Composite materials have become an important option for the primary structure of aircraft due to their design flexibility and ability to improve overall performance. At present, the choice of composites in aircraft components is largely based on experience, knowledge, benchmarking, and in part market drivers. The inevitable design iterations during the design stage and validation process increase development time and cost. This paper presents the correlation between laminate and composite thin-walled beam structures, covering theoretical and numerical investigations of stiffness estimation for composite aerostructures with applications to aircraft wings. Classical laminate theory and thin-walled beam theory were applied to define the correlation between 1-dimensional composite laminates and 2-dimensional composite beam structures, respectively. Then, an FE model was created to represent the 3-dimensional structure. A detailed study of the stiffness matrix of composite laminates has been carried out to understand the effects of stacking sequence on the coupling between extension, shear, bending and torsional deformation of wing box structures at the 1-dimensional, 2-dimensional and 3-dimensional levels. Relationships among composite laminates and composite wing box structures of the same material have been developed in this study. These correlations serve as guidelines for design engineers to predict the stiffness of the wing box structure during the material selection process and laminate design stage.
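A minimal sketch of the classical-laminate-theory step referred to above: assembling the in-plane stiffness (A) matrix of a symmetric laminate from per-ply transformed reduced stiffnesses. The material properties and stacking sequence below are hypothetical carbon/epoxy values, not data from the paper.

```python
import numpy as np

# Hypothetical unidirectional ply properties (Pa) and geometry.
E1, E2, G12, nu12 = 140e9, 10e9, 5e9, 0.3
t_ply = 0.125e-3                             # ply thickness, m
layup = [0, 45, -45, 90, 90, -45, 45, 0]     # symmetric stacking, degrees

# Reduced stiffnesses of the ply in its material axes.
nu21 = nu12 * E2 / E1
d = 1 - nu12 * nu21
Q11, Q22, Q12, Q66 = E1 / d, E2 / d, nu12 * E2 / d, G12

def Qbar(theta_deg):
    """Transformed reduced stiffness of one ply rotated by theta."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q11*c**4 + 2*(Q12 + 2*Q66)*s**2*c**2 + Q22*s**4
    Qb[1, 1] = Q11*s**4 + 2*(Q12 + 2*Q66)*s**2*c**2 + Q22*c**4
    Qb[0, 1] = Qb[1, 0] = (Q11 + Q22 - 4*Q66)*s**2*c**2 + Q12*(s**4 + c**4)
    Qb[0, 2] = Qb[2, 0] = (Q11 - Q12 - 2*Q66)*s*c**3 + (Q12 - Q22 + 2*Q66)*s**3*c
    Qb[1, 2] = Qb[2, 1] = (Q11 - Q12 - 2*Q66)*s**3*c + (Q12 - Q22 + 2*Q66)*s*c**3
    Qb[2, 2] = (Q11 + Q22 - 2*Q12 - 2*Q66)*s**2*c**2 + Q66*(s**4 + c**4)
    return Qb

# In-plane stiffness: sum of each ply's Qbar weighted by its thickness.
A = sum(Qbar(th) * t_ply for th in layup)
print(A / 1e6)  # N/m expressed in MN/m
```

For this balanced, quasi-isotropic stack the extension-shear coupling terms A16 and A26 vanish, which is exactly the kind of stacking-sequence effect on coupling that the abstract discusses.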

Keywords: aircraft design, aircraft structures, classical lamination theory, composite structures, laminate theory, structural design, thin-walled beam theory, wing box design

Procedia PDF Downloads 218
1038 Noncommutative Differential Structure on Finite Groups

Authors: Ibtisam Masmali, Edwin Beggs

Abstract:

In this paper, we take as an example a differential calculus on the finite group A4. We then apply methods of non-commutative differential geometry to this example and see how similar the results are to those of classical differential geometry.

Keywords: differential calculi, finite group A4, Christoffel symbols, covariant derivative, torsion compatible

Procedia PDF Downloads 238
1037 Interdisciplinary Approach in Vocational Training for Orthopaedic Surgery

Authors: Mihail Nagea, Olivera Lupescu, Elena Taina Avramescu, Cristina Patru

Abstract:

The classical education of orthopedic surgeons involves lectures, self-study, workshops, cadaver dissections, and sometimes supervised practical training within surgery, which quite often leaves young surgeons feeling unable to apply what they have learned, especially in surgical practice. The purpose of this paper is to present a different approach from the classical one, which enhances the practical skills of orthopedic trainees and prepares them for future practice. The paper presents the content of the research project 2015-1-RO01-KA202-015230, ERASMUS+ VET ‘Collaborative learning for enhancing practical skills for patient-focused interventions in gait rehabilitation after orthopedic surgery’, which, using e-learning as a basic tool, delivers to trainees not only courses but above all practical information, through videos and case scenarios including gait analysis, in order to build patient-focused therapeutic plans adapted to the characteristics of each patient. The aim of this project is to enhance practical skills in orthopedic surgery; the results are evaluated through the answers to questionnaires and especially through the reactions within the case scenarios. Participants will thus internalize the idea that any mistake in solving the cases might represent a failure in treating a real patient. This modern approach, besides using interactivity to evaluate the theoretical and practical knowledge of the trainee, increases the sense of responsibility as well as the ability to react properly in real cases.

Keywords: interdisciplinary approach, gait analysis, orthopedic surgery, vocational training

Procedia PDF Downloads 242
1036 Corrosion Behavior of CS1018 in Various CO2 Capture Solvents

Authors: Aida Rafat, Ramazan Kahraman, Mert Atilhan

Abstract:

The aggressive corrosion behavior of conventional amine solvents is one of the main barriers against large-scale commercialization of the amine absorption process for carbon capture. Novel CO2 absorbents that exhibit minimal corrosivity under operating conditions are essential to lower corrosion damage and ensure more robustness in the capture plant. This work investigated the corrosion behavior of carbon steel CS1018 in various CO2 absorbent solvents. The tested solvents included the classical amines MEA, DEA and MDEA; the piperazine-activated solvents MEA/PZ, MDEA/PZ and MEA/MDEA/PZ; and mixtures of MEA and room-temperature ionic liquids (RTILs), namely MEA/[C4MIM][BF4] and MEA/[C4MIM][Otf]. The electrochemical polarization technique was used to determine the system corrosiveness in terms of corrosion rate and polarization behavior. The process parameters of interest were CO2 loading and solution temperature. Electrochemical results showed that the corrosivity order of the classical amines at 40°C is MDEA > MEA > DEA, whereas at 80°C the ranking changes to MEA > DEA > MDEA. Corrosivity rankings were mainly governed by CO2 absorption capacity at the test temperature. The corrosivity ranking for the activated amines at 80°C was MEA/PZ > MDEA/PZ > MEA/MDEA/PZ. Piperazine addition thus appears to offer a dual advantage, enhancing CO2 absorption capacity while mitigating corrosion. For the MEA/RTIL mixtures, preliminary results showed that partial replacement of the aqueous phase in the MEA solution by the more stable, nonvolatile RTIL solvents reduced corrosion rates considerably.

Keywords: corrosion, amines, CO2 capture, piperazine, ionic liquids

Procedia PDF Downloads 449
1035 Economic Decision Making under Cognitive Load: The Role of Numeracy and Financial Literacy

Authors: Vânia Costa, Nuno De Sá Teixeira, Ana C. Santos, Eduardo Santos

Abstract:

Financial literacy and numeracy have been regarded as paramount for rational household decision making amid the increasing complexity of financial markets. However, financial decisions are often made under sub-optimal circumstances, including cognitive overload. The present study aims to clarify how financial literacy and numeracy, taken as relevant expert knowledge for financial decision-making, modulate possible effects of cognitive load. Participants were required to choose between a sure loss and a gamble pertaining to a financial investment, either with or without a competing memory task. Two experiments were conducted, varying only the content of the competing task. In the first, the financial choice task was performed while maintaining a list of five random letters in working memory. In the second, cognitive load was based on the retention of six random digits. In both experiments, one of the items in the list had to be recalled given its serial position. Outcomes of the first experiment revealed no significant main effect or interactions involving the cognitive load manipulation and numeracy or financial literacy skills, strongly suggesting that retaining a list of random letters did not interfere with the cognitive abilities required for financial decision making. Conversely, in the second experiment, a significant interaction between the competing memory task and level of financial literacy (but not numeracy) was found for the frequency of choosing the gambling option. Overall, in the control condition, participants with high financial literacy and those with high numeracy were more prone to choose the gambling option. However, when under cognitive load, participants with high financial literacy were as likely as their less literate counterparts to choose the gambling option.
This outcome is interpreted as evidence that financial literacy prevents intuitive risk-averse reasoning only under highly favourable conditions, as when no other task competes for cognitive resources. In contrast, participants with higher levels of numeracy were consistently more prone to choose the gambling option in both experimental conditions. These results are discussed in the light of the opposition between classical dual-process theories and fuzzy-trace theories of intuitive decision making, suggesting that while some instances of expertise (such as numeracy) readily support easily accessible gist representations, other expert skills (such as financial literacy) depend upon deliberative processes. It is further suggested that this dissociation between types of expert knowledge might depend on the degree to which they generalize across disparate settings. Finally, applied implications of the present study are discussed, with a focus on how it informs financial regulators and on the importance and limits of promoting financial literacy and general numeracy.

Keywords: decision making, cognitive load, financial literacy, numeracy

Procedia PDF Downloads 167
1034 Need, Relevancy and Impact of Ethics Education in Accounting Profession

Authors: Mrigakshi Das

Abstract:

Business ethics is currently a high-profile issue owing to sensational corporate scandals in many countries that have caused extensive damage to the economy and society. These corporate scandals question the morality of businesspeople in general and accountants in particular; it is argued that accountants have been the main contributors to the decline in ethical standards in business. This paper reviews the need for, and impact of, ethics education in the accounting profession. Despite ethical interventions, accounting scandals keep increasing, leaving the public to question whether the profession has really become less ethical.

Keywords: accounting, ethics education and intervention in accounting, accounting education, accounting profession, moral reasoning and development, ethics education

Procedia PDF Downloads 533
1033 Comparison of Parametric and Bayesian Survival Regression Models in Simulated and HIV Patient Antiretroviral Therapy Data: Case Study of Alamata Hospital, North Ethiopia

Authors: Zeytu G. Asfaw, Serkalem K. Abrha, Demisew G. Degefu

Abstract:

Background: HIV/AIDS remains a major public health problem in Ethiopia, heavily affecting people of productive and reproductive age. We aimed to compare the performance of parametric survival analysis and Bayesian survival analysis using simulations and a real-data application focused on determining predictors of HIV patient survival. Methods: Parametric survival models with Exponential, Weibull, Log-normal, Log-logistic, Gompertz and Generalized gamma distributions were considered. A simulation study was carried out with two different algorithms, using informative and noninformative priors. A retrospective cohort study was implemented for HIV-infected patients under Highly Active Antiretroviral Therapy in Alamata General Hospital, North Ethiopia. Results: A total of 320 HIV patients were included in the study, of whom 52.19% were female and 47.81% male. According to Kaplan-Meier survival estimates for the two sex groups, females showed better survival time than their male counterparts. The median survival time of HIV patients was 79 months. During the follow-up period, 89 (27.81%) deaths and 231 (72.19%) censored individuals were registered. The average baseline cluster of differentiation 4 (CD4) cell count for HIV/AIDS patients was 126.01, but after three years of antiretroviral therapy follow-up the average CD4 cell count was 305.74, which is quite encouraging. Age, functional status, tuberculosis screening, past opportunistic infection, baseline CD4 cell count, World Health Organization clinical stage, sex, marital status, employment status, occupation type, and baseline weight were found to be statistically significant factors for longer survival of HIV patients. The standard error of every covariate in the Bayesian log-normal survival model is smaller than in the classical one.
Hence, Bayesian survival analysis showed better performance than classical parametric survival analysis when subjective data analysis was performed by incorporating expert opinion and historical knowledge about the parameters. Conclusions: HIV/AIDS patient mortality could be reduced through timely antiretroviral therapy with special attention to the identified factors. Moreover, the Bayesian log-normal survival model was preferable to the classical log-normal survival model for determining predictors of HIV patient survival.
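The Kaplan-Meier estimates mentioned above, the nonparametric baseline against which both the classical and Bayesian parametric fits are judged, can be sketched by hand in a few lines. The follow-up times (months) and event indicators below are invented for illustration, not patient data from the study.

```python
# Toy right-censored survival data: 1 = death observed, 0 = censored.
times  = [5, 8, 12, 12, 20, 25, 30, 42, 60, 79]
events = [1, 0, 1, 1, 0, 1, 0, 1, 0, 1]

def kaplan_meier(times, events):
    """Return (t, S(t)) at each distinct event time via the product-limit
    estimator: S(t) = prod over event times t_i <= t of (1 - d_i / n_i)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        ties = sum(1 for tt, _ in data if tt == t)       # subjects leaving at t
        deaths = sum(e for tt, e in data if tt == t)     # events at t
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= ties
        i += ties
    return curve

print(kaplan_meier(times, events))
```

Censored subjects reduce the risk set without triggering a step in the curve, which is why the estimator handles the 72.19% censoring reported in the study without discarding those records.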

Keywords: antiretroviral therapy (ART), Bayesian analysis, HIV, log-normal, parametric survival models

Procedia PDF Downloads 177
1032 On the Problems of Human Concept Learning within Terminological Systems

Authors: Farshad Badie

Abstract:

The central focus of this article is the fact that knowledge is constructed from an interaction between humans’ experiences and their conceptions of constructed concepts. The main contributions of this research are a logical characterisation of human inductive learning over constructed concepts within terminological systems and a logical background for theorising over the Human Concept Learning Problem (HCLP) in terminological systems. This research connects with the topics ‘human learning’, ‘epistemology’, ‘cognitive modelling’, ‘knowledge representation’ and ‘ontological reasoning’.

Keywords: human concept learning, concept construction, knowledge construction, terminological systems

Procedia PDF Downloads 317
1031 From Colonial Outpost to Cultural India: Folk Epics of India

Authors: Jyoti Brahma

Abstract:

Folk epics of India are found in various Indian languages. The study of folk epics and their importance for folkloristics in India came into prominence only during the nineteenth century, when British administrators and missionaries collected and documented folk epics from various parts of the country. The paper is an attempt to investigate how the colonial outpost came to penetrate the interiors of Indian land and society and triggered the Indian Renaissance. It takes into account the composition of the epics of India and the attention they received during the nineteenth century, which in turn gave rise to the national consciousness shaping the culture of India. Composed as oral traditions, these folk epics are now seen as repositories of historical consciousness, whereas in earlier times societies without literacy were said to be without history. There is thus an urgent need to re-examine the British impact on Indian literary traditions. The Bhakti poets, through their nuanced efforts to change the behaviour of Indian society, give us a perfect example of the deferment of any clear-cut distinction between the folk and the classical in the context of India: their work evades pure categorization as classical and forms part of the folk traditions of the cultural heritage of India. Therefore, the ethical question of what is ontologically known as ordinary discourse in the case of “folk” forms, metaphors and folk language gains importance once more. The paper also seeks to outline the significant factors that shaped the destiny of folklore in South India, particularly in the four states of the Indian Union that could be termed the South Indian “cultural zones”: Andhra Pradesh, Karnataka, Kerala and Tamil Nadu.

Keywords: colonial, folk, folklore, tradition

Procedia PDF Downloads 300
1030 Math Rally Proposal for the Teaching-Learning of Algebra

Authors: Liliana O. Martínez, Juan E. González, Manuel Ramírez-Aranda, Ana Cervantes-Herrera

Abstract:

In this work, the use of a collection of mathematical challenges and puzzles aimed at students who are starting algebra is proposed. The selected challenges and puzzles are intended to arouse students' interest in this area of mathematics, in addition to facilitating the teaching-learning process through challenges such as riddles, crossword puzzles, and board games, all set in everyday situations that allow students to construct their own learning. To this end, we propose a "Math Rally: Algebra" divided into four sections: mathematical reasoning, hierarchy of operations, fractions, and algebraic equations.

Keywords: algebra, algebraic challenge, algebraic puzzle, math rally

Procedia PDF Downloads 149
1029 Mathematical and Fuzzy Logic in the Interpretation of the Quran

Authors: Morteza Khorrami

Abstract:

Logic, as an intellectual infrastructure, plays an essential role in the Islamic sciences. Indeed, there are verses of the Holy Quran whose interpretation is not possible without proper logic. In many verses, the Quran argues and requests responses from its audience, which shows that rules of logic are present in the Quran. This paper, which uses a descriptive and analytic method, tries to show the role of logic in understanding the Quran's methods of reasoning, displays some Quranic statements with mathematical symbols, and points out that these symbols can help in interpretation and in answering some questions and doubts. The paper also notes that the Quran did not use two-valued (Aristotelian) logic in all cases; fuzzy logic can also be found in the Quran.
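The contrast the paper draws between two-valued and fuzzy logic can be illustrated with a minimal membership function: classical logic assigns a proposition exactly 0 or 1, while fuzzy logic admits any degree in between. The example set and values below are ours, not drawn from the Quranic analysis.

```python
def triangular(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set (a, b, c):
    0 outside [a, c], rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degree to which a temperature of 28 belongs to the fuzzy set "warm",
# modelled here as the triangle (20, 30, 40):
mu = triangular(28, 20, 30, 40)
print(mu)  # 0.8 -- partially "warm", unlike the crisp 0/1 of two-valued logic
```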

Keywords: aristotelian logic, fuzzy logic, interpretation, Holy Quran

Procedia PDF Downloads 652
1028 Modeling the Impact of Controls on Information System Risks

Authors: M. Ndaw, G. Mendy, S. Ouya

Abstract:

Information system risk management helps to reduce or eliminate risk by implementing appropriate controls. In this paper, we propose a model quantifying the impact of controls on information system risks by automating the residual criticality estimation step of FMECA, which is based on inductive reasoning. For this, we defined three equations based on the type and maturity of controls. For testing, the values obtained with the model were compared to values estimated by interlocutors during different working sessions, and the results are satisfactory. This model allows an optimal assessment of control maturity and facilitates risk analysis of information systems.
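The paper's three equations are not reproduced in the abstract, so the sketch below only illustrates the general idea: an FMECA criticality score (severity x occurrence x detectability) is scaled down by a control-effectiveness factor derived from the control's type and maturity. The type weights and the 0-5 maturity scale are hypothetical stand-ins, not the paper's actual coefficients.

```python
# Hypothetical effectiveness weights per control type.
TYPE_WEIGHT = {"preventive": 0.5, "detective": 0.3, "corrective": 0.2}

def residual_criticality(severity, occurrence, detectability,
                         control_type, maturity):
    """Initial criticality (S * O * D, each on a 1-10 scale) reduced by a
    control-effectiveness factor; maturity is on an assumed 0-5 scale."""
    initial = severity * occurrence * detectability
    effectiveness = TYPE_WEIGHT[control_type] * (maturity / 5.0)
    return initial * (1 - effectiveness)

# A mature preventive control cuts the initial criticality of 60 by 40%.
print(residual_criticality(4, 3, 5, "preventive", 4))
```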

Keywords: information system, risk, control, FMECA method

Procedia PDF Downloads 344
1027 Linguistic Analysis of Argumentation Structures in Georgian Political Speeches

Authors: Mariam Matiashvili

Abstract:

Argumentation is an integral part of our daily communication, formal or informal. Argumentative reasoning, techniques, and language tools are used both in personal conversations and in the business environment. Verbalizing opinions requires particular syntactic-pragmatic structures, arguments, that add credibility to a statement. The study of argumentative structures allows us to identify the linguistic features that make a text argumentative, and knowing which elements make up an argumentative text in a particular language helps users of that language improve their skills. Natural language processing (NLP) has also become especially relevant recently; in this context, one of the main emphases is on the computational processing of argumentative texts, which will enable the automatic recognition and analysis of large volumes of textual data. The research deals with the linguistic analysis of the argumentative structures of Georgian political speeches, in particular the linguistic structure, characteristics, and functions of the parts of an argumentative text: claims, support, and attack statements. The research aims to describe the linguistic cues that give a sentence a judgmental/controversial character and help to identify the reasoning parts of an argumentative text. The empirical data come from the Georgian Political Corpus, particularly TV debates; consequently, the texts are dialogical in nature, representing a discussion between two or more people (most often between a journalist and a politician).
The research uses the following approaches to identify and analyze the argumentative structures: Lexical Classification and Analysis, which identifies lexical items relevant to the construction of argumentative texts and builds a lexicon of argumentation (groups of words gathered from a semantic point of view); Grammatical Analysis and Classification, meaning grammatical analysis of the words and phrases identified on the basis of the argumentation lexicon; and Argumentation Schemas, which describes and identifies the argumentation schemes most likely used in Georgian political speeches. As a final step, we analyzed the relations between these components. For example, if an identified argument scheme is "Argument from Analogy", the identified lexical items semantically express analogy too, and in Georgian they are most likely adverbs. As a result, we created a lexicon of the words that play a significant role in creating Georgian argumentative structures. Linguistic analysis has shown that verbs play a crucial role in creating argumentative structures.
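The lexicon-based first step can be sketched as a toy cue matcher; the English cue words below merely stand in for the Georgian argumentation lexicon built in the study, and the three roles mirror the claim/support/attack parts it analyzes.

```python
# Toy argumentation lexicon: cue words grouped by the argumentative role
# they signal. These English cues are illustrative placeholders.
CUE_LEXICON = {
    "claim":   {"must", "should", "clearly", "obviously"},
    "support": {"because", "since", "therefore", "consequently"},
    "attack":  {"however", "but", "although", "nevertheless"},
}

def label_sentence(sentence):
    """Return the set of argumentative roles cued in a sentence."""
    tokens = {w.strip(".,!?;:").lower() for w in sentence.split()}
    return {role for role, cues in CUE_LEXICON.items() if tokens & cues}

print(label_sentence("We must act now, because the budget is exhausted."))
```

A real pipeline would follow this with the grammatical analysis and scheme identification stages the abstract describes; the lexicon match only flags candidate argumentative sentences.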

Keywords: georgian, argumentation schemas, argumentation structures, argumentation lexicon

Procedia PDF Downloads 61
1026 Indenyl and Allyl Palladates: Synthesis, Bonding, and Anticancer Activity

Authors: T. Scattolin, E. Cavarzerani, F. Visentin, F. Rizzolio

Abstract:

Organopalladium compounds have recently attracted attention for their high stability even under physiological conditions and, above all, for their remarkable in vitro cytotoxicity towards cisplatin-resistant cell lines. Among organopalladium derivatives, those bearing at least one N-heterocyclic carbene (NHC) ligand and the Pd(II)-η³-allyl fragment have exhibited IC₅₀ values in the micro- and sub-micromolar range towards several cancer cell lines in vitro and, in some cases, selectivity for cancerous over non-tumorigenic cells. Herein, a selection of allyl and indenyl palladates was synthesized using a solvent-free method consisting of grinding the corresponding palladium precursors with different saturated and unsaturated azolium salts. All compounds have been fully characterized by NMR, XRD and elemental analyses. The intramolecular H, Cl interaction has been elucidated and quantified using the Voronoi Deformation Density scheme. Most of the complexes showed excellent cytotoxicity towards ovarian cancer cell lines, with IC₅₀ values comparable to or even lower than cisplatin. Interestingly, the potent anticancer activity was also confirmed in a high-grade serous ovarian cancer (HGSOC) patient-derived tumoroid, with a clear superiority of this class of compounds over classical platinum-based agents. Finally, preliminary enzyme inhibition studies of the synthesized palladate complexes against the model enzyme TrxR show that the compounds have activity comparable to or even higher than auranofin and classical Au(I) NHC complexes. Based on such promising data, further in vitro and in vivo experiments and in-depth mechanistic studies are ongoing in our laboratories.

Keywords: anticancer activity, palladium complexes, organoids, indenyl and allyl ligands

Procedia PDF Downloads 83
1025 The Development, Composition, and Implementation of Vocalises as a Method of Technical Training for the Adult Musical Theatre Singer

Authors: Casey Keenan Joiner, Shayna Tayloe

Abstract:

Classical voice training for the novice singer has long relied on the guidance and instruction of vocalise collections, such as those written and compiled by Marchesi, Lütgen, Vaccai, and Lamperti. These vocalise collections purport to encourage healthy vocal habits and instill technical longevity in both aspiring and established singers, though their scope has long been somewhat confined to the classical idiom. For pedagogues and students specializing in other vocal genres, such as musical theatre and CCM (contemporary commercial music), low-impact and pertinent vocal training aids are in short supply, and much of the suggested literature derives from classical methodology. While the tenets of healthy vocal production remain ubiquitous, specific stylistic needs and technical emphases differ from genre to genre and may require a specified extension of vocal acuity. As musical theatre continues to grow in popularity at both the professional and collegiate levels, the need for specialized training grows as well. Pedagogical literature geared specifically towards musical theatre (MT) singing and vocal production, while relatively uncommon, is readily accessible to the contemporary educator. Practitioners such as Norman Spivey, Mary Saunders Barton, Claudia Friedlander, Wendy LeBorgne, and Marci Rosenberg continue to publish relevant research in the field of musical theatre voice pedagogy and have successfully identified many common MT vocal faults, their subsequent diagnoses, and their eventual corrections. Where classical methodology would suggest specific vocalises or training exercises to maintain corrected vocal posture following successful fault diagnosis, musical theatre finds itself without a relevant body of work towards which to transition. 
By analyzing the existing vocalise literature by means of a specialized set of parameters, including but not limited to melodic variation, rhythmic complexity, vowel utilization, and technical targeting, we have composed a set of vocalises meant specifically to address the training and conditioning of adult musical theatre voices. These vocalises target many pedagogical tenets of the musical theatre genre, including but not limited to thyroarytenoid-dominant production, twang resonance, lateral vowel formation, and “belt-mix.” By implementing these vocalises in the musical theatre voice studio, pedagogues can efficiently communicate proper musical theatre vocal posture and kinesthetic connection to their students, regardless of age or level of experience. The composition of these vocalises serves MT pedagogues on both a technical and a sociological level. MT is a relative newcomer on the collegiate stage, and the academization of musical theatre methodologies has been a slow and arduous process. The conflation of classical and MT techniques and training methods has long plagued the world of voice pedagogy, and teachers often find themselves in positions of “cross-training,” that is, teaching students of both genres in one combined voice studio. As MT continues to establish itself on academic platforms worldwide, genre-specific literature and focused studies are both rare and invaluable. To ensure that modern students receive exacting and definitive training in their chosen fields, it becomes increasingly necessary for genres such as musical theatre to boast specified literature, and a collection of musical theatre-specific vocalises only aids in this effort. This collection of musical theatre vocalises is the first of its kind and provides genre-specific studios with a basis upon which to grow healthy, balanced voices built for the harsh conditions of the modern theatre stage.

Keywords: voice pedagogy, targeted methodology, musical theatre, singing

Procedia PDF Downloads 149
1024 A Study of Algebraic Structure Involving Banach Space through Q-Analogue

Authors: Abdul Hakim Khan

Abstract:

The aim of the present paper is to study the Banach space and combinatorial algebraic structure of R. It is further aimed to study the algebraic structure of the set of all q-extensions of classical formulas and functions for 0 < q < 1.
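As background, the q-extensions referred to are built on the standard q-analogue of a natural number; these are the classical definitions, not results of the paper:

```latex
[n]_q = \frac{1-q^n}{1-q} = 1 + q + q^2 + \cdots + q^{n-1}, \qquad 0 < q < 1,
\qquad \lim_{q \to 1^-} [n]_q = n,
\qquad [n]_q! = \prod_{k=1}^{n} [k]_q .
```

A q-extension of a classical formula replaces ordinary numbers and factorials by these q-analogues, so that the classical formula is recovered in the limit q → 1⁻.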

Keywords: integral functions, q-extensions, q-numbers of metric space, algebraic structure of R and Banach space

Procedia PDF Downloads 570
1023 Building a Hierarchical, Granular Knowledge Cube

Authors: Alexander Denzler, Marcel Wehrle, Andreas Meier

Abstract:

A knowledge base stores facts and rules about the world that applications can use for the purpose of reasoning. By applying the concept of granular computing to a knowledge base, several advantages emerge. These can be harnessed by applications to improve their capabilities and performance. In this paper, the concept behind such a construct, called a granular knowledge cube, is defined, and its intended use as an instrument that manages to cope with different data types and detect knowledge domains is elaborated. Furthermore, the underlying architecture, consisting of the three layers of the storing, representing, and structuring of knowledge, is described. Finally, benefits as well as challenges of deploying it are listed alongside application types that could profit from having such an enhanced knowledge base.
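The drill-down idea behind a hierarchical, granular knowledge base can be sketched as a toy granule tree; the class and method names here are illustrative, not the authors' architecture.

```python
# Toy sketch of a hierarchical granular structure: each granule stores
# facts at its own level of granularity (storing layer) and holds
# finer-grained child granules (structuring layer). Names illustrative.
class Granule:
    def __init__(self, label, facts=None):
        self.label = label
        self.facts = list(facts or [])   # knowledge stored at this level
        self.children = []               # finer-grained sub-granules

    def add_child(self, child):
        self.children.append(child)
        return child

    def all_facts(self):
        """Drill down: collect facts from this granule and all finer ones."""
        facts = list(self.facts)
        for child in self.children:
            facts.extend(child.all_facts())
        return facts

root = Granule("knowledge")
animals = root.add_child(Granule("animals", ["birds can fly"]))
animals.add_child(Granule("penguins", ["penguins cannot fly"]))
```

Querying at a coarse granule aggregates everything below it, while querying a leaf returns only the most specific facts, which is one way applications can exploit granularity.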

Keywords: granular computing, granular knowledge, hierarchical structuring, knowledge bases

Procedia PDF Downloads 486
1022 Save Balance of Power: Can We?

Authors: Swati Arun

Abstract:

The present paper argues that balance of power (BOP) theory needs to be conjoined with certain contingencies, like geography. It is evident that sea powers (‘insular’, for better clarity) are not balanced (if at all) in the same way as land powers. It is apparent that the artificial insularity that the US has achieved reduces the chances of balancing (constant) and helps it maintain preponderance (variable). But how precise is this approach in assessing the dynamics between China’s rise, the reaction of other powers, and the US? The ‘evolved’ theory can be validated by putting China and the US in the equation. Systemic relations between nations were explained through balance of power theory well before systems theory was propounded. The BOP is the crux of the functionality of ‘power relation’ dynamics, which has played its role in the most astounding ways, leading to situations of war and peace. Whimsical but true: the BOP has remained a complicated and indefinable concept from Hans Morgenthau to Kenneth Waltz. A challenge of the BOP, however, remains: “that it has too many meanings”. In recent times it has become evident that the myriad of expectations generated by the BOP has not met the practicality of current world politics. It is for this reason that the BOP has been replaced by preponderance theory (PT) to explain the prevailing power situation. PT does provide empirical reasoning for its success but fails in the abstract logical reasoning required to make a theory universal. Unipolarity clarifies the current system as one where balance of power has become redundant. It seems to reach beyond the contours of the BOP, where a superpower does what it must to remain one. The centrality of this argument pivots on an exception: every time the BOP fails to operate, preponderance of power emerges. PT does not sit well with the primary logic of a theory because it works on an exception. 
The evolution of a pattern and system in which the BOP fails and preponderance emerges is absent. The puzzle here is whether the BOP really has become redundant or whether it needs polishing. The international power structure changed from multipolar to bipolar to unipolar. The BOP was looked to for the inevitable logic behind such changes and to answer the dilemma we see today: why is the US unchecked and unbalanced? But why was Britain unchecked in the 19th century, and why was China unbalanced in the 13th century? It is the insularity of the state that makes the BOP reproduce an “imbalance of power”, going a level up from the off-shore balancer. This luxury of a state to maintain imbalance in the region of competition or threat is the causal relation between the BOP and geography. America has applied imbalancing, meaning disequilibrium (in its favor), to maintain the regional balance so that over time the weaker does not get stronger and pose a competition. It could do that due to the significant disparity present between the US and the rest.

Keywords: balance of power, China, preponderance of power, US

Procedia PDF Downloads 266
1021 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technology, and specific procedures used in the digital investigation are not keeping up with criminal developments. Therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence is invaluable in identifying crime. It has been observed that an algorithm based on artificial intelligence (AI) is highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict a person. This research paper aims to develop a multiagent framework for digital investigations using specific intelligent software agents (ISA). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent are dependent on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework (JADE) within Eclipse, with a Postgres repository and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISA and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. 
As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510 files, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 seconds), whereas the MADIK framework accomplished this in 16 minutes (960 seconds). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
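The core task of a Hash Set Agent can be sketched as a membership test against a set of known digests; the "known-bad" set below is a made-up placeholder, not real case data, and the real agent would of course operate on files within a forensic image.

```python
import hashlib

# Sketch of a Hash Set Agent's core check: flag data items whose
# SHA-256 digest appears in a known-bad hash set. The hash set below
# is a made-up placeholder, not real threat data.
KNOWN_BAD_SHA256 = {
    hashlib.sha256(b"malicious payload").hexdigest(),
}

def flag_known_bad(blobs):
    """Return the indices of byte blobs whose digest is known-bad."""
    return [i for i, blob in enumerate(blobs)
            if hashlib.sha256(blob).hexdigest() in KNOWN_BAD_SHA256]
```

Because each digest check is independent, this is exactly the kind of workload that parallelizes well across agents or VMs, which is consistent with the speedup reported above.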

Keywords: artificial intelligence, computer science, criminal investigation, digital forensics

Procedia PDF Downloads 197
1020 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data

Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa

Abstract:

A generalized log-logistic distribution with variable shapes of the hazard rate was introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and leading to greater flexibility in analysing and modeling various data types. The proposed distribution has a large number of well-known lifetime special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties were derived. The method of maximum likelihood was adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study was carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub-shaped, or reversed “bathtub”-shaped) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared to its sub-models (the Weibull, log-logistic, and Burr XII distributions) and to other 3-parameter parametric survival distributions, such as the exponentiated Weibull distribution, the 3-parameter lognormal distribution, the 3-parameter gamma distribution, the 3-parameter Weibull distribution, and the 3-parameter log-logistic (also known as shifted log-logistic) distribution. The proposed distribution provided a better fit than all of the competitive distributions based on the goodness-of-fit tests, the log-likelihood, and information criterion values. Finally, Bayesian analysis and an assessment of the performance of Gibbs sampling for the data set are also carried out.
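Although the paper's generalized density is not reproduced here, the shape behavior it extends can already be seen in the classical log-logistic hazard, which is decreasing for shape β ≤ 1 and unimodal for β > 1; this sketch uses only those standard definitions.

```python
def loglogistic_hazard(t, alpha=1.0, beta=2.0):
    """Hazard rate of the classical log-logistic distribution with
    scale alpha and shape beta:
        h(t) = (beta/alpha) * (t/alpha)**(beta-1) / (1 + (t/alpha)**beta)
    Decreasing for beta <= 1, unimodal (rise then fall) for beta > 1."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1 + z)
```

The extra parameter of the generalized family is what additionally allows bathtub and reversed-bathtub shapes that the two-parameter hazard above cannot produce.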

Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation

Procedia PDF Downloads 188
1019 On Lie-Central Derivations and Almost Inner Lie-Derivations of Leibniz Algebras

Authors: Natalia Pacheco Rego

Abstract:

The Liezation functor is a map from the category of Leibniz algebras to the category of Lie algebras, which assigns to a Leibniz algebra the Lie algebra given by the quotient of the Leibniz algebra by the ideal spanned by its square elements. This functor is left adjoint to the inclusion functor that considers a Lie algebra as a Leibniz algebra. This environment fits in the framework of central extensions and commutators in semi-abelian categories with respect to a Birkhoff subcategory, where classical or absolute notions are relative to the abelianization functor. Classical properties of Leibniz algebras (properties relative to the abelianization functor) were adapted to the relative setting (with respect to the Liezation functor); in general, absolute properties have corresponding relative ones, but not all absolute properties immediately hold in the relative case, so new requirements are needed. Following this line of research, an analysis was conducted of central derivations of Leibniz algebras relative to the Liezation functor, called Lie-derivations, and a characterization of Lie-stem Leibniz algebras by their Lie-central derivations was obtained. In this paper, we present an overview of these results, and we analyze some new properties concerning Lie-central derivations and almost inner Lie-derivations. Namely, a Leibniz algebra is a vector space equipped with a bilinear bracket operation satisfying the Leibniz identity. We define the Lie-bracket by [x, y]Lie = [x, y] + [y, x], for all x, y. The Lie-center of a Leibniz algebra is the two-sided ideal of elements that annihilate all the elements of the Leibniz algebra through the Lie-bracket. A Lie-derivation is a linear map which acts as a derivation with respect to the Lie-bracket. Obviously, usual derivations are Lie-derivations, but the converse is not true in general. A Lie-derivation is called a Lie-central derivation if its image is contained in the Lie-center. 
A Lie-derivation is called an almost inner Lie-derivation if the image of an element x is contained in the Lie-commutator of x and the Leibniz algebra. The main results we present refer to the conditions under which Lie-central derivations and almost inner Lie-derivations coincide.
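The definitions above can be written compactly (the symbol 𝔮 for the Leibniz algebra is our notation; [·,·] denotes its Leibniz bracket):

```latex
[x,y]_{\mathrm{Lie}} = [x,y] + [y,x], \qquad x, y \in \mathfrak{q},
\qquad
Z_{\mathrm{Lie}}(\mathfrak{q}) = \{\, z \in \mathfrak{q} \mid [z,x]_{\mathrm{Lie}} = 0
\ \text{for all } x \in \mathfrak{q} \,\},
```

so a Lie-derivation d : 𝔮 → 𝔮 is Lie-central precisely when d(𝔮) ⊆ Z_Lie(𝔮), and almost inner when d(x) ∈ [x, 𝔮]_Lie for every x ∈ 𝔮.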

Keywords: almost inner Lie-derivation, Lie-center, Lie-central derivation, Lie-derivation

Procedia PDF Downloads 127
1018 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm for Max-Cut. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover's or Shor's, highlights to the world the potential that quantum computing holds. It also presents the prospect of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate in creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. 
The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOA with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear approximation based method, and in some instances it can even find a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
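The gradient-free, population-based search idea can be illustrated with a purely classical stand-in: a tiny EA searching bipartitions of a graph for a maximum cut. In the paper the individuals would instead be QAOA angle parameters and the fitness a quantum expectation value, so this sketch shows only the optimizer side, not the quantum circuit.

```python
import random

def cut_value(bits, edges):
    """Number of edges whose endpoints fall on opposite sides of the cut."""
    return sum(1 for u, v in edges if bits[u] != bits[v])

def ea_maxcut(edges, n_nodes, pop_size=20, generations=50, seed=0):
    """Tiny elitist EA over bipartitions: truncation selection plus
    one-bit-flip mutation. No gradients are used anywhere."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_nodes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda b: cut_value(b, edges), reverse=True)
        survivors = pop[: pop_size // 2]        # keep the fittest half
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(n_nodes)] ^= 1  # one-bit flip mutation
            children.append(child)
        pop = survivors + children
    best = max(pop, key=lambda b: cut_value(b, edges))
    return best, cut_value(best, edges)
```

Each individual's fitness evaluation is independent, so the population loop is the part that parallelizes naturally, which is the speedup avenue the abstract mentions.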

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 49
1017 Agency Beyond Metaphysics of Subjectivity

Authors: Erik Kuravsky

Abstract:

One of the problems with a post-structuralist account of agency is that it appears to reject the freedom of an acting subject, thus seeming to deny the very phenomenon of agency. However, this is only a problem if we think that human beings can be agents exclusively in terms of being subjects, that is, if we think agency subjectively. Indeed, we tend to understand traditional theories of human freedom (e.g., Plato’s or Kant’s) in terms of a peculiar ability of the subject. The paper suggests de-subjectivizing agency with the help of Heidegger’s later thought. To do so, it argues that classical theories of agency may indeed be interpreted as subject-oriented (sometimes even by their authors), but do not have to be read as such. Namely, the claim is that what makes agency what it is, what is essential in agency, is not its belonging to a subject, but its ontological configuration. We may say that agency “happens,” and that there are very specific ontological characteristics to this happening. The argument of the paper is that we can find these characteristics in the classical accounts of agency and that these characteristics are sufficient to distinguish human freedom from other natural phenomena. In particular, it proposes thinking of agency not as one of the human characteristics, but as an ontological event in which human beings take part. Namely, agency is a (non-human) characteristic of the different modes in which the experienceable existence of beings is determined by Being. To be an agent, then, is to participate in such ontological determination. What enables this participation is the way human beings non-thematically understand the ontological difference. For example, for Plato, one acts freely only if one is led by an idea of the good, while for Kant the imperative for free action is categorical. 
The agency of an agent is thus dependent on the differentiation between ideas/categories and beings met in experience – one is “free” from contingent sensibility in terms of what is different from it ontologically. In this light, modern dependence on subjectivity is evident in the fact that the ontological difference is thought as belonging to one’s thinking, consciousness etc. That is, it is taken subjectively. A non-subjective account of agency, on the other hand, requires thinking this difference as belonging to Being itself, and thinking human beings as a medium within which occurs the non-human force of ontological differentiation.

Keywords: Heidegger, freedom, agency, poststructuralism

Procedia PDF Downloads 189
1016 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism

Authors: Lubos Rojka

Abstract:

The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events, but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person’s morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories on moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. The compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities and a real possibility to choose otherwise, at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability of theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among which are an effective free will together with first- and second-order desires. 
Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent’s effective will and conscious reflectivity is the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction of the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense because it would exclude the emergence of and supervenience by the essential complex properties of human consciousness.

Keywords: consciousness, free will, determinism, emergence, moral responsibility

Procedia PDF Downloads 154
1015 Compensatory Neuro-Fuzzy Inference (CNFI) Controller for Bilateral Teleoperation

Authors: R. Mellah, R. Toumi

Abstract:

This paper presents a new adaptive neuro-fuzzy controller equipped with compensatory fuzzy control (CNFI), which not only adjusts membership functions but also optimizes the adaptive reasoning by using a compensatory learning algorithm. The proposed control structure includes two CNFI controllers: one controls the master robot in force, and the second controls the slave robot in position. The experimental results obtained show fairly high accuracy in terms of position and force tracking under free-space motion and hard-contact motion, which highlights the effectiveness of the proposed controllers.
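The "compensatory" idea can be illustrated with the classical Zimmermann–Zysno γ-operator, which blends a strict AND (product) with a soft OR (algebraic sum); this is a generic sketch of compensatory fuzzy aggregation, not the authors' controller, and the parameter values are arbitrary.

```python
import math

def compensatory_and(memberships, gamma=0.5):
    """Zimmermann-Zysno compensatory operator: blends a conjunctive
    aggregation (product) with a disjunctive one (algebraic sum) via
    a compensation degree gamma in [0, 1]. gamma = 0 gives pure AND,
    gamma = 1 gives pure OR; intermediate values compensate."""
    prod = math.prod(memberships)                        # strict AND part
    alg_sum = 1 - math.prod(1 - m for m in memberships)  # soft OR part
    return prod ** (1 - gamma) * alg_sum ** gamma
```

In a compensatory learning scheme, γ itself is adapted alongside the membership functions, which is the sense in which the reasoning, and not just the fuzzification, is optimized.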

Keywords: compensatory fuzzy, neuro-fuzzy, adaptive control, teleoperation

Procedia PDF Downloads 313
1014 Historical Geography of Lykaonia Region

Authors: Asuman Baldiran, Erdener Pehlivan

Abstract:

In this study, the root of the name Lykaonia and the geographical area defined as the Lykaonia Region are discussed. In this context, information concerning the settlements of the Paleolithic, Neolithic and Chalcolithic Ages is presented. In particular, the settlements belonging to the Classical Age are localized, and brief information about the history of these settlements is provided. In light of this information, the roads of the ancient period in the region are evaluated.

Keywords: ancient cities, Central Anatolia, historical geography, Lykaonia region

Procedia PDF Downloads 370