Search results for: claim verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 859

769 Beware the Trolldom: Speculative Interests and Policy Implications behind the Circulation of Damage Claims

Authors: Antonio Davola

Abstract:

Starting from the evaluations made by Richard Posner in his judgment in Carhart v. Halaska, the paper seeks to analyse the so-called ‘litigation troll’ phenomenon and the development of a damage claims market, i.e. a market in which the right to bring claims is voluntarily exchangeable for money and can be asserted by private buyers. The aim of our study is to assess whether the implementation of a ‘damage claims market’ might represent a resource for victims or whether, on the contrary, it might operate solely as a speculation tool for private investors. The analysis will start from the US experience and will then focus on the EU framework. Firstly, the paper will analyse the relation between the litigation troll phenomenon and patent troll activity: even though these activities are considered similar by Posner, a comparative study shows how these practices differ significantly in their impact on the market and on consumer protection, even though they start from similar economic premises. The second part of the paper will focus on the main specific concerns related to litigation trolling activity. The main issues that will be addressed are the risk that the circulation of damage claims might spur non-meritorious litigation and the implications of the misalignment between the victim of a tort and the actual plaintiff in court arising from the sale of a claim. In its third part, the paper will then focus on the opportunities and benefits that the introduction and regulation of a claims market might imply both for potential claim sellers and buyers, in order to ultimately assess whether such a solution might actually increase individuals’ legal empowerment. Through a damage claims market, compensation would be granted more quickly and easily to consumers who have suffered harm: tort victims would, in fact, be compensated instantly upon the sale of their claims, without any burden of proof. On the other hand, claim buyers would profit from the gap between the amount that a consumer would accept for an immediate refund and the compensation awarded in court. In the fourth part of the paper, the analysis will focus on the legal legitimacy of litigation trolling activity in the US and EU frameworks. Even though there is no express provision that forbids the sale of the right to pursue a claim in court – or that deems such a right to be non-transferable – the procedural laws of individual States (especially in the EU panorama) must be taken into account in evaluating this aspect. The fifth and final part of the paper will summarize the data collected in order to evaluate whether, and through which normative solutions, litigation trolling might benefit competition, and what its overall effect on consumer protection would be.

Keywords: competition, claims, consumer protection, litigation

Procedia PDF Downloads 212
768 The Bayesian Premium Under Entropy Loss

Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita

Abstract:

Credibility theory is an experience rating technique in actuarial science which can be seen as one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers’ compensation premiums, and IBNR (incurred but not reported) claims, where credibility theory can be used to estimate the claim size. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under entropy loss, which is asymmetric, and under squared error loss, which is symmetric, with both informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer’s belief about the insured’s risk level, which is updated once the insured’s data are collected at the end of the period. However, the explicit form of the Bayesian premium in the case when the prior is not a member of the exponential family can be quite difficult to obtain, as it involves a number of integrations which are not analytically solvable. The paper solves this problem by deriving the estimator using a numerical approximation (the Lindley approximation), one of the suitable approximation methods for such problems: it approximates the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and the mean squared error criterion is used to compare the Bayesian premium estimator under the above loss functions.
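
To make the two premiums concrete, the following is a minimal numerical sketch, not the authors' derivation: it grid-approximates the posterior of the Lindley parameter under an assumed Gamma prior and evaluates the Bayes premium under both loss functions (under squared error loss, the posterior mean of the risk premium; under entropy loss, the inverse posterior mean of its reciprocal). All parameter values and the prior choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lindley_logpdf(x, theta):
    # Lindley density: f(x; theta) = theta^2/(theta+1) * (1+x) * exp(-theta*x)
    return 2*np.log(theta) - np.log(theta + 1) + np.log1p(x) - theta*x

def lindley_sample(theta, n):
    # Lindley is a mixture: Exp(theta) w.p. theta/(theta+1), else Gamma(2, 1/theta)
    mix = rng.random(n) < theta/(theta + 1)
    return np.where(mix, rng.exponential(1/theta, n), rng.gamma(2, 1/theta, n))

def risk_premium(theta):
    # Mean of the Lindley distribution, used as the individual risk premium
    return (theta + 2) / (theta * (theta + 1))

theta_true = 1.5                       # illustrative "true" risk level
claims = lindley_sample(theta_true, 50)  # simulated past claims experience

# Grid approximation of the posterior with an assumed Gamma(a, b) prior on theta
a, b = 2.0, 1.0
grid = np.linspace(0.01, 10.0, 2000)
dx = grid[1] - grid[0]
log_post = (a - 1)*np.log(grid) - b*grid + \
           np.array([lindley_logpdf(claims, t).sum() for t in grid])
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx

# Bayesian premium under squared error loss: posterior mean of the risk premium
prem_se = (risk_premium(grid) * post).sum() * dx
# Bayesian premium under entropy loss: inverse posterior mean of its reciprocal
prem_ent = 1.0 / ((post / risk_premium(grid)).sum() * dx)
print(f"squared error: {prem_se:.4f}  entropy: {prem_ent:.4f}")
```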

Keywords: bayesian estimator, credibility theory, entropy loss, monte carlo simulation

Procedia PDF Downloads 298
767 Use of Psychiatric Services and Psychotropics in Children with Atopic Dermatitis

Authors: Mia Schneeweiss, Joseph Merola

Abstract:

Atopic dermatitis (AD) is a chronic inflammatory skin condition affecting 9.6 million children under the age of 18 in the US, 3.2 million of whom suffer from severe AD. AD has significant effects on quality of life and on psychiatric comorbidity in affected patients. We sought to quantify the use of psychotropic medications and mental health services in children. We used longitudinal claims data from commercially insured patients in the US between 2003 and 2016 to identify children aged 18 or younger with a diagnosis of AD associated with an outpatient or inpatient encounter. A 180-day enrollment period was required before the first diagnosis of AD. Among those diagnosed, we computed the use of psychiatric services and the dispensing of psychotropic medications during the following 6 months. Among 1.6 million children <18 years with a diagnosis of AD, most were infants (0-1 years: 17.6%), toddlers (1-2 years: 12.2%), and young children (2-4 years: 15.4%); 5.1% were in the 16-18 years age group. Among younger children, 50% of patients were female; after the age of 14, about 60% were female. Among 16-18 year olds, 6.4% had at least one claim with a recorded psychopathology during the 6-month baseline period: 4.6% had depression, 3.3% anxiety, 0.3% panic disorder, 0.6% psychotic disorder, and 0.1% anorexia. During the 6 months following the physician diagnosis of AD, 66% used high-potency topical corticosteroids, 3.5% used an SSRI, 0.3% used an SNRI, 1.2% used a tricyclic antidepressant, 1.4% used an antipsychotic medication, and 5.2% used an anxiolytic agent; 4.4% had an outpatient visit with a psychiatrist, and 0.1% had been hospitalized with a psychiatric diagnosis. Among 14-16 year olds, 4.7% had at least one claim with a recorded psychopathology during the 6-month baseline period: 3.3% had depression, 2.5% anxiety, 0.2% panic disorder, 0.5% psychotic disorder, and 0.1% anorexia. During the 6 months following the diagnosis, 68% used high-potency topical corticosteroids, 4.6% used an SSRI, 0.6% used an SNRI, 1.5% used a tricyclic antidepressant, 1.4% used an antipsychotic medication, and 4.6% used an anxiolytic agent; 4.7% had an outpatient visit with a psychiatrist, and 0.1% had been hospitalized with a psychiatric diagnosis. Among 12-14 year olds, 3.3% had at least one claim with a recorded psychopathology during the 6-month baseline period: 1.9% had depression, 2.2% anxiety, 0.1% panic disorder, 0.7% psychotic disorder, and 0.0% anorexia. During the 6 months following the diagnosis, 67% used high-potency topical corticosteroids, 2.1% used an SSRI, 0.1% used an SNRI, 0.7% used a tricyclic antidepressant, 0.9% used an antipsychotic medication, and 4.1% used an anxiolytic agent; 3.8% had an outpatient visit with a psychiatrist, and 0.05% had been hospitalized with a psychiatric diagnosis. In younger children, psychopathologies were decreasingly common (10-12 years: 2.8%; 8-10: 2.3%; 6-8: 1.3%; 4-6: 0.6%). In conclusion, there is substantial psychiatric comorbidity among children (<18 years old) with diagnosed atopic dermatitis in a US commercially insured population. Meaningful psychotropic medication use (>3%) starts as early as age 12.
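
A minimal sketch of the cohort logic described above (first AD diagnosis as index date, a 180-day enrollment lookback, and 6-month baseline psychopathology prevalence), assuming a hypothetical claims schema; the column names and toy rows are illustrative, not the actual database layout.

```python
import pandas as pd

# Hypothetical claims extract: one row per diagnosis claim
claims = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3],
    "dx": ["AD", "depression", "AD", "anxiety", "AD"],
    "service_date": pd.to_datetime(
        ["2010-06-01", "2010-04-15", "2011-03-10", "2011-06-01", "2012-01-05"]),
    "enrollment_start": pd.to_datetime(
        ["2009-01-01", "2009-01-01", "2011-01-15", "2011-01-15", "2011-01-01"]),
})

# First AD diagnosis per patient (the index date)
index_dates = (claims[claims.dx == "AD"]
               .groupby("patient_id").service_date.min().rename("index_date"))
df = claims.merge(index_dates, on="patient_id")

# Require >= 180 days of continuous enrollment before the index date
eligible = df[(df.index_date - df.enrollment_start).dt.days >= 180]

# 6-month baseline psychopathology: any psych dx in the 180 days before index
psych = {"depression", "anxiety", "panic", "psychosis", "anorexia"}
baseline = eligible[eligible.dx.isin(psych) &
                    (eligible.service_date < eligible.index_date) &
                    ((eligible.index_date - eligible.service_date).dt.days <= 180)]
rate = baseline.patient_id.nunique() / eligible.patient_id.nunique()
print(f"baseline psychopathology prevalence: {rate:.1%}")
```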

Keywords: pediatric atopic dermatitis, psychotropic medication use, psychiatric comorbidity, claims database

Procedia PDF Downloads 152
766 The Verification Study of Computational Fluid Dynamics Model of the Aircraft Piston Engine

Authors: Lukasz Grabowski, Konrad Pietrykowski, Michal Bialy

Abstract:

This paper presents the results of research carried out to verify the CFD model of combustion in the Asz62-IR aircraft piston engine. This engine was modernized, and a new type of ignition system was developed for it. Due to the high cost of experiments on a nine-cylinder, 1,000 hp aircraft engine, a simulation technique had to be applied. Therefore, using computational fluid dynamics (CFD) to simulate the combustion process is a reasonable solution. Accordingly, tests for varied ignition advance angles were carried out, and the optimal value to be tested on a real engine was specified. The CFD model was created with the AVL Fire software. The engine in the research had two spark plugs for each cylinder, and the ignition advance angles had to be set up separately for each spark plug. The results of the simulation were verified by comparing the pressure in the cylinder: the courses of the indicated pressure of the engine mounted on a test stand were compared. The real course of pressure was measured with an optical sensor mounted in a specially drilled hole between the valves. It was the OPTRAND pressure sensor, which was designed especially for engine combustion research. The indicated pressure was measured in cylinder no. 3. The engine was running at take-off power and was loaded by a propeller at a special test bench. The verification of the CFD simulation results was based on the results of these test bench studies. The course of the simulated pressure obtained is within the measurement error of the optical sensor; this error is 1% and reflects the hysteresis and nonlinearity of the sensor. The real indicated pressure measured in the cylinder and the pressure taken from the simulation were compared, and it can be claimed that the verification of the CFD simulation based on in-cylinder pressure was successful. The next step was to investigate the impact of changing the ignition advance timing of spark plugs 1 and 2 on the combustion process. Offsetting the ignition timing between spark plugs 1 and 2 results in a longer and uneven burning of the mixture. The optimum in terms of indicated power occurs when ignition is simultaneous for both spark plugs, but slightly separated ignition timings are used to ensure that ignition occurs at all engine speeds and loads. This should be confirmed by a bench experiment on the engine. Nevertheless, this simulation research enabled us to determine the optimal ignition advance angle to be implemented in the ignition control system. This knowledge allows us to set up the ignition point with two spark plugs so as to achieve as much power as possible.
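
The pressure-based acceptance criterion described above can be illustrated with a short sketch that checks whether a simulated trace stays within the sensor's 1% error band; the traces below are synthetic stand-ins, not the measured or AVL Fire data.

```python
import numpy as np

# Illustrative crank-angle-resolved pressure traces (bar); real data would come
# from the optical sensor and from the CFD simulation, respectively
crank = np.linspace(-180, 180, 721)             # deg, one working cycle
p_meas = 20 + 40*np.exp(-((crank - 10)/40)**2)  # stand-in measured trace
p_sim = p_meas * (1 + 0.004*np.sin(crank/30))   # stand-in simulated trace

# Sensor error band: 1% of reading (hysteresis + nonlinearity)
band = 0.01 * p_meas
within = np.abs(p_sim - p_meas) <= band
print(f"samples inside the 1% band: {within.mean():.1%}")
print(f"peak pressure error: {abs(p_sim.max() - p_meas.max())/p_meas.max():.2%}")
```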

Keywords: CFD model, combustion, engine, simulation

Procedia PDF Downloads 331
765 Development and Verification of the Idom Shielding Optimization Tool

Authors: Omar Bouhassoun, Cristian Garrido, César Hueso

Abstract:

The radiation shielding design is an optimization problem with multiple (constrained) objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's previous experience. The result is therefore an empirical but not optimal solution, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files for two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of more radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.
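
ISOT itself wraps MCNP/Serpent runs; as a hedged sketch of the overall loop, the toy genetic algorithm below optimizes layer materials and thicknesses against a stand-in exponential-attenuation model. The material data, mutation scheme, and scalarized fitness are illustrative assumptions; a real tool would evaluate candidates with a transport code and a true multi-objective scheme.

```python
import math
import random

random.seed(1)

# (density g/cm^3, stand-in attenuation coefficient 1/cm) - illustrative values
MATERIALS = {"steel": (7.85, 0.65), "polyethylene": (0.95, 0.10),
             "lead": (11.3, 0.80)}

def evaluate(layers):
    """Stand-in for a transport-code run: (relative dose, areal weight)."""
    dose, weight = 1.0, 0.0
    for mat, t in layers:
        rho, mu = MATERIALS[mat]
        dose *= math.exp(-mu * t)   # exponential attenuation per layer
        weight += rho * t           # g/cm^2 per unit area
    return dose, weight

def fitness(layers, w_dose=10.0, w_weight=0.05):
    dose, weight = evaluate(layers)
    return w_dose * dose + w_weight * weight  # scalarized objective (minimize)

def random_individual(n_layers=3):
    return [(random.choice(list(MATERIALS)), random.uniform(0.5, 10.0))
            for _ in range(n_layers)]

def mutate(ind):
    child = ind.copy()
    i = random.randrange(len(child))
    _, t = child[i]
    child[i] = (random.choice(list(MATERIALS)),
                max(0.5, t + random.gauss(0.0, 1.0)))
    return child

population = [random_individual() for _ in range(40)]
for generation in range(100):
    population.sort(key=fitness)                  # elitist selection
    population = population[:20] + [mutate(random.choice(population[:20]))
                                    for _ in range(20)]
best = min(population, key=fitness)
print("best stack:", best, "-> (dose, weight):", evaluate(best))
```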

Keywords: optimization, shielding, nuclear, genetic algorithm

Procedia PDF Downloads 80
764 Treating On-Demand Bonds as Cash-In-Hand: Analyzing the Use of “Unconscionability” as a Ground for Challenging Claims for Payment under On-Demand Bonds

Authors: Asanga Gunawansa, Shenella Fonseka

Abstract:

On-demand bonds, also known as unconditional bonds, are commonplace in the construction industry as a means of safeguarding the employer from potential non-performance by a contractor. On-demand bonds may be obtained from commercial banks, and they serve as an undertaking by the issuing bank to honour payment on demand without questioning and/or considering any dispute between the employer and the contractor in relation to the underlying contract. Thus, whether or not a breach has occurred under the underlying contract that triggers the demand for encashment by the employer is not a question the bank needs to be concerned with. As a result, an unconditional bond allows the beneficiary to claim the money almost without any condition; an unconditional bond is as good as cash in hand. In the past, establishing fraud on the part of the employer, of which the bank had knowledge, was the only ground on which a bank could dishonour a claim made under an on-demand bond. However, recent jurisprudence in common law countries shows that courts are beginning to consider unconscionable conduct on the part of the employer in claiming under an on-demand bond as a ground that contractors could rely on to prevent the banks from honouring such claims. This has created uncertainty in connection with on-demand bonds and their liquidity. This paper analyzes recent judicial decisions in four common law jurisdictions, namely England, Singapore, Hong Kong, and Sri Lanka, to identify the scope of using the concept of “unconscionability” as a ground for preventing unreasonable claims for encashment of on-demand bonds. The objective of this paper is to argue that on-demand bonds have lost their effectiveness as “cash in hand” and that this is, in fact, an advantage and not an impediment to international commerce, as the purpose of such bonds should not be to provide for illegal and unconscionable conduct by the beneficiaries.

Keywords: fraud, performance guarantees, on-demand bonds, unconscionability

Procedia PDF Downloads 68
763 Opacity Synthesis with Orwellian Observers

Authors: Moez Yeddes

Abstract:

The property of opacity is widely used in the formal verification of security in computer systems and protocols. Opacity is a general language-theoretic scheme that provides a parametrized framework in which several security properties of a system can be expressed. A secret behaviour of a system is opaque if a passive attacker can never deduce its occurrence from the system observation. Instead of considering the case of static observability, where the set of observable events is fixed off-line, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we introduce Orwellian partial observability, where unobservable events are not revealed provided that downgrading events never occur in the future of the trace. Orwellian partial observability is needed to model intransitive information flow; this Orwellian observability is known as the ipurge function. In previous work, we showed that verifying whether a regular secret is opaque for a regular language L w.r.t. an Orwellian projection is PSPACE-complete, while the problem has been proved undecidable for a regular language L w.r.t. a general Orwellian observation function. In this paper, we address two problems of opacification of a regular secret ϕ for a regular language L w.r.t. an Orwellian projection: given L and a secret ϕ ∈ L, the first problem consists of computing a minimal regular super-language M of L, if it exists, such that ϕ is opaque for M, and the second consists of computing the supremal sub-language M′ of L such that ϕ is opaque for M′. We derive both language-theoretic characterizations and algorithms to solve these two dual problems.
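
For readers new to opacity, here is a minimal sketch of the definition in the simpler static-projection case only (not the Orwellian/ipurge setting of the paper): a secret is opaque iff every secret word shares its observation with some non-secret word of the language. The bounded enumeration and the toy language/secret predicates are illustrative assumptions.

```python
from itertools import product

def observe(word, observable):
    """Static projection: erase unobservable events."""
    return tuple(e for e in word if e in observable)

def words_upto(alphabet, n):
    for k in range(n + 1):
        yield from product(alphabet, repeat=k)

def is_opaque(language, secret, alphabet, observable, max_len):
    """A secret sublanguage is opaque iff every secret word shares its
    observation with at least one non-secret word of the language."""
    lang = {w for w in words_upto(alphabet, max_len) if language(w)}
    for w in lang:
        if secret(w):
            obs = observe(w, observable)
            if not any(observe(v, observable) == obs and not secret(v)
                       for v in lang):
                return False, w  # the attacker can deduce the secret from obs
    return True, None

# Toy example: language = all words over {a,u,b}; secret = words containing 'u'
alphabet = ("a", "u", "b")
language = lambda w: True
secret = lambda w: "u" in w
# Opaque when 'u' is unobservable; not opaque if 'u' is made observable
print(is_opaque(language, secret, alphabet, observable={"a", "b"}, max_len=3))
```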

Keywords: security policies, opacity, formal verification, orwellian observation

Procedia PDF Downloads 201
762 Formal Ontology of Quality Space. Location, Subordination and Determination

Authors: Claudio Calosi, Damiano Costa, Paolo Natali

Abstract:

Determination is the relation that holds between certain kinds of properties, determinables, such as “being colored”, and others, determinates, such as “being red”. Subordination is the relation that holds between genus properties, such as “being an animal”, and others, species properties, such as “being human”. It is widely held that Determination and Subordination share important similarities, yet also crucial differences. But what grounds such similarities and differences? This question is hardly ever addressed. The present paper provides a first step towards filling this gap in the literature. It argues that a locational theory of instantiation, roughly the view that to have a property is to occupy a location in quality space, holds the key to such an answer. More precisely, it argues that the principles of both Determination and Subordination are just examples of more general principles of location. Consider Determination. The principle that everything that has a determinate has a determinable boils down to the claim that everything that has a precise location in quality space is in quality space, an eminently reasonable principle. The principle that nothing can have two determinates (at the same level of determination) boils down to the principle that nothing can be “multilocated” in quality space. In effect, the following “translation table” pairs each principle of location with its counterpart for determination: Functionality with At Most One Determination; Focus with At Most One Determination & Requisite Determination*; Exactness with Requisite Determination*; Super-Exactness with Requisite Determination; Exactitude with Requisite Determination; and Converse-Exactness with Determinable Inheritance. This grounds the similarity between Determination and Subordination. What about the differences? The paper argues that the differences boil down to the mereological structure of the regions that are occupied in quality space, in particular whether they are simple or complex. The key technical detail is that Determination and Subordination induce a set-theoretic rooted tree structure over the domain of properties. Interestingly, the analysis also provides a possible justification for the Aristotelian claim that being is not a genus property, an argument that the paper develops in some detail.

Keywords: determinables/determinates, genus/species, location, Aristotle's claim that being is not a genus

Procedia PDF Downloads 51
761 Comparative Safety Performance Evaluation of Profiled Deck Composite Slab from the Use of Slope-Intercept and Partial Shear Methods

Authors: Izian Abd. Karim, Kachalla Mohammed, Nora Farah Abd Aznieta Aziz, Law Teik Hua

Abstract:

The economic use and ease of construction of profiled deck composite slabs are marred by the complex and uneconomic strength verification required for serviceability and general safety considerations. Besides, although factors such as shear span length, deck geometry, and mechanical friction greatly influence the longitudinal shear strength that determines the ultimate strength of a profiled deck composite slab, and although several methods are available for its determination, partial shear and slope-intercept (m-k) are the two methods provided by Eurocode 4. However, owing to the complexity associated with the shear behaviour of profiled deck composite slabs, the use of these methods in determining the load carrying capacity of such slabs yields different and conflicting values. This, coupled with the time and cost constraints associated with strength verification, is a growing source of concern; the issue is critical. Treating some of these known shear strength influencing factors as random variables, the load carrying capacity violation of profiled deck composite slabs under the two methods defined in Eurocode 4 is determined using a reliability approach and comparatively studied. The study reveals that safety values from the use of the m-k method show good standing compared with those from the partial shear method.
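
As a hedged illustration of the reliability approach (a simple Monte Carlo evaluation of a limit state rather than the study's full first order reliability method), the sketch below treats a few shear-influencing factors as random variables; the resistance model, distributions, and parameter values are illustrative assumptions, not the study's data.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
n = 200_000

# Illustrative random variables (distributions and moments are assumptions)
tau_u = rng.lognormal(np.log(0.25), 0.10, n)  # longitudinal shear strength, MPa
span = rng.normal(450.0, 25.0, n)             # shear span, mm
load = rng.gumbel(12.0, 1.5, n)               # applied load effect, kN/m^2

def capacity(tau, L_s):
    # Stand-in resistance model: a stronger shear interface and a shorter
    # shear span both raise the load carrying capacity (illustrative scaling)
    return 60.0 * tau * (450.0 / L_s)         # kN/m^2

g = capacity(tau_u, span) - load              # limit state: g < 0 is violation
pf = float((g < 0).mean())
beta = -NormalDist().inv_cdf(pf)              # reliability (safety) index
print(f"P(violation) = {pf:.4f}, beta = {beta:.2f}")
```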

Keywords: composite slab, first order reliability method, longitudinal shear, partial shear connection, slope-intercept

Procedia PDF Downloads 329
760 The Double Standard: Ethical Issues and Gender Discrimination in Traditional Western Ethics

Authors: Merina Islam

Abstract:

Feminists have identified the traditional Western ethical theories as basically male-centered. They are committed to developing a critique showing how traditional Western ethics, together with traditional philosophy, has remained gender-biased throughout, irrespective of its claim to gender neutrality. The exclusion of women's experiences from moral discourse is justified on the ground that women cannot be moral agents, since they are not rational. By way of entailment, we are thus led to the position that the virtues of traditional ethics, so viewed, can be nothing but rational and hence male. The ears of traditional Western ethicists have been attuned to male rather than female ethical voices. Right from Plato, Aristotle, Augustine, Aquinas, Rousseau, Kant, and Hegel, and even in philosophers like Freud, Schopenhauer, and Nietzsche, the dualism between reason and passion, or mind and body, gained prominence. These thinkers either intentionally excluded women or else used a certain male moral experience as the standard for all moral experience, thereby once again excluding women's experiences. Men are identified with rationality and hence contrasted with women, whose sphere is believed to be that of emotion and feeling. This act of exclusion has given birth to a tradition that emphasizes reason over emotion, the universal over the particular, and justice over caring. That patriarchy's use of gender distinctions in the realm of ethics has resulted in gender discrimination is an undeniable fact. Hence, women's moral agency has often been denied, not simply by the exclusion of women from moral debate or sheer ignorance of their contributions, but through philosophical claims to the effect that women lack moral reason. Through the association of masculine values with reason (and the feminine with the irrational), the standard prototype of moral virtue was created. Traditional or mainstream ethics therefore cannot justify its claims to the universality, objectivity, and gender neutrality from which its various moral maxims and principles drew their legitimacy. The feminist critique of traditional mainstream ethics rests on the charge that, because of its inherent gender bias, ethics has in the name of gender distinctions been justifying discrimination. This paper examines the gender-biasedness of traditional ethics and tries to show to what extent traditional ethics is male-centered and consequently fails to justify its claims to universality and gender neutrality.

Keywords: ethics, gender, male-centered, traditional

Procedia PDF Downloads 399
759 Orbit Determination Modeling with Graphical Demonstration

Authors: Assem M. F. Sallam, Ah. El-S. Makled

Abstract:

This paper presents the implementation, verification, and graphical demonstration of a software application that allows swift use of different preliminary orbit determination methods. A passive orbit determination method is used in this study to determine the location of a satellite or a flying body; it is called passive because it depends on observation alone, without the use of any aids (radio or laser) installed on the satellite. The models built help in understanding how these methods work, which inputs each method uses, and how accurate their outputs are when compared with available verification data. Outputs from the different orbit determination methods (Gibbs, Lambert, and Gauss) are compared with each other and verified against data obtained from the Satellite Tool Kit (STK) application. A modified model including all of the orbit determination methods is introduced to investigate the different methods' outputs (orbital parameters) for the same input (azimuth, elevation, and time). The simulation software is implemented using MATLAB, and a Graphical User Interface (GUI) application named OrDet is produced using the GUI facilities of MATLAB. It accepts all the inputs used and outputs the current Classical Orbital Elements (COE) of the satellite under observation. The produced COE are then propagated for a complete revolution and plotted in a 3-D view. The modified model uses an adapter that passes the same input parameters to each of the preliminary orbit determination methods under study. Results from all orbit determination methods yield exactly the same COE output, which shows that the methods agree on the satellite's location while arriving at it through different numerical routes.
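
As an illustration of one of the methods compared above, here is a sketch of the classical Gibbs method in its standard textbook form (velocity at the middle of three coplanar position vectors); this is the generic formulation, not necessarily the paper's OrDet implementation, and the observation triplet is illustrative. The paper's toolchain is MATLAB, but the sketch uses Python/NumPy for self-containment.

```python
import numpy as np

MU = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def gibbs(r1, r2, r3):
    """Classical Gibbs method: velocity at r2 from three coplanar
    geocentric position vectors (km), per the standard textbook formulation."""
    r1, r2, r3 = map(np.asarray, (r1, r2, r3))
    n1, n2, n3 = np.linalg.norm(r1), np.linalg.norm(r2), np.linalg.norm(r3)
    N = n1*np.cross(r2, r3) + n2*np.cross(r3, r1) + n3*np.cross(r1, r2)
    D = np.cross(r1, r2) + np.cross(r2, r3) + np.cross(r3, r1)
    S = (n2 - n3)*r1 + (n3 - n1)*r2 + (n1 - n2)*r3
    return np.sqrt(MU / (np.linalg.norm(N)*np.linalg.norm(D))) * \
           (np.cross(D, r2)/n2 + S)

# Illustrative observation triplet (km), e.g. from a processed radar track
r1 = [-294.32, 4265.10, 5986.70]
r2 = [-1365.50, 3637.60, 6346.80]
r3 = [-2940.30, 2473.70, 6555.80]
print("v2 [km/s]:", gibbs(r1, r2, r3))  # COE then follow from (r2, v2)
```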

Keywords: orbit determination, STK, Matlab-GUI, satellite tracking

Procedia PDF Downloads 241
758 Epistemology in African Philosophy: A Critique of African Concept of Knowledge

Authors: Ovett Nwosimiri

Abstract:

African tradition and what it entails form the content of African concepts of knowledge. The study of African concepts of knowledge is also known as African epistemology; in other words, African epistemology is the branch of African philosophy that deals with knowledge. This branch engages with the nature and concept of knowledge, the ways in which knowledge can be gained, the ways in which one can justify an epistemic claim or validate a knowledge claim, the limits of human knowledge, etc. The protagonists of African epistemology base their argument for a distinctive or unique African epistemology on the premise or proposition “that each race is endowed with a distinctive nature and embodies in its civilization a particular spirit”. All human beings share certain basic values and perceptions irrespective of where they come from, and this fosters forms of interaction between people of different nationalities. Africans, like other people, share certain values, perceptions, and interactions with the rest of the world. These shared values, perceptions, and interactions prompted African people to attempt to “modernize” their societies, or to develop some forms of their tradition, in harmony with the ethos of the contemporary world. Given these ideas, it is worth investigating whether such an (African) epistemology is still unique. The advocates of African epistemology focus on the externalist notion of justification and neglect the idea that both the internalist and externalist notions of justification are needed in order to arrive at a coherent and well-founded account of epistemic justification. Thus, this paper critically examines the claims that there is a unique African epistemology (a mode of knowing that is peculiar to Africans, and that the African mode of knowing is a social, monistic, and situated notion of knowledge), as well as the grounds for justifying beliefs and epistemic claims.

Keywords: internalist, externalist, knowledge, justification

Procedia PDF Downloads 234
757 Flexural Strengthening of Steel Beams Using Fiber Reinforced Polymers

Authors: Sally Hosny, Mona G. Ibrahim, N. K. Hassan

Abstract:

Fiber reinforced polymer (FRP) is one of the most environmentally friendly methods for strengthening and retrofitting steel structures. The behaviour of steel I-beams flexurally strengthened with FRP was investigated. Finite element (FE) models were developed using ANSYS® as verification cases to simulate the experimental behaviour of steel I-beams flexurally strengthened with FRP strips. Two experimental studies were selected for verification: the first examined the effect of different thicknesses and moduli of elasticity, while the second studied the effect of applying different carbon fiber reinforced polymer (CFRP) bond lengths. The proposed FE models were in good agreement with the experimental results in terms of failure modes, load bearing capacities, and strain distribution on the CFRP strips. The verified FE models were then utilized to conduct a parametric study in which various widths (40, 50, 60, 70 and 80 mm), thicknesses (1.2, 2 and 4 mm) and lengths (1500, 1700 and 1800 mm) of CFRP were analyzed. The results clearly revealed that the load bearing capacity was significantly increased (+7%) when the width and thickness were increased, whereas it was only slightly affected by using longer CFRP strips. Moreover, applying glass fiber reinforced polymer (GFRP) strips of 1500 mm in length, 50 mm in width, and thicknesses of 1.2, 2 and 4 mm was investigated. The load bearing capacity of I-beams strengthened with GFRP is lower than with CFRP by 8% on average. Statistical analysis was conducted using Minitab®.

Keywords: FRP, strengthened steel I-beams, flexural, FEM, ANSYS

Procedia PDF Downloads 245
756 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models

Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue

Abstract:

Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised, as knowledge of the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Such reactions are therefore often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user visually. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated against the model, leading to the highlighting of the nodes causing unsatisfiability (i.e., error-causing nodes). We aim to integrate such methods into our web-based tool and to demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, in this research, we present a framework that enables a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. A tool such as this enables an end-user to determine which empirical analysis to perform in order to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.
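
To make the requirement-checking idea concrete, here is a toy sketch in which a one-site phosphorylation model (an ODE stand-in for a κ simulation, not the Minardo extension itself) is checked against a manually formulated requirement; all rates, thresholds, and names are illustrative assumptions.

```python
import numpy as np

def simulate(k_phos, k_dephos, t_end=60.0, dt=0.01):
    """Toy one-site phosphorylation model (stand-in for a kappa simulation):
    d[p]/dt = k_phos*(1-p) - k_dephos*p, with p the phosphorylated fraction."""
    steps = int(t_end / dt)
    p = np.empty(steps)
    p[0] = 0.0
    for i in range(1, steps):
        p[i] = p[i-1] + dt*(k_phos*(1 - p[i-1]) - k_dephos*p[i-1])
    return np.arange(steps)*dt, p

def check_requirement(t, p, threshold=0.5, deadline=30.0):
    """Requirement: phosphorylation exceeds `threshold` before `deadline`.
    Returns (satisfied, first time of satisfaction or None)."""
    hit = np.where((p >= threshold) & (t <= deadline))[0]
    return (True, t[hit[0]]) if hit.size else (False, None)

for k_phos in (0.5, 0.01):  # the second rate makes the requirement unsatisfiable
    t, p = simulate(k_phos, k_dephos=0.1)
    ok, when = check_requirement(t, p)
    print(f"k_phos={k_phos}: requirement satisfied={ok}, at t={when}")
```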

Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation

Procedia PDF Downloads 229
755 Students' Errors in Translating Algebra Word Problems to Mathematical Structure

Authors: Ledeza Jordan Babiano

Abstract:

Translating statements into mathematical notation is one of the processes in word problem-solving. However, based on the literature, students still have difficulties with this skill. The purpose of this study was to investigate the translation errors students make when they translate algebraic word problems into mathematical structures and to locate the errors through the lens of the Translation-Verification Model. This qualitative research study employed content analysis. During the data-gathering process, the students were asked to answer a six-item algebra word problem questionnaire, and their answers were analyzed by experts through blind coding using the Translation-Verification Model to determine their translation errors. After this, a focus group discussion was conducted, and the data gathered were analyzed through thematic analysis to determine the causes of the students' translation errors. It was found that the students' most prevalent translation error was the interpretation error, which was situated in the Attribute construct. The emerging themes during the FGD were: (1) the procedure of translation is strategically incorrect; (2) lack of comprehension; (3) difficulty with related algebra concepts; (4) lack of spatial skills; (5) unpreparedness for independent learning; and (6) developmentally inappropriate problem content. These themes boiled down to the major concept of independent learning preparedness in solving mathematical problems, which has subcomponents that include contextual and conceptual factors in translation. Consequently, the results provide implications for instructors and professors of Mathematics to innovate their teaching pedagogies and strategies to address translation gaps among students.

Keywords: mathematical structure, algebra word problems, translation, errors

Procedia PDF Downloads 26
754 In silico Analysis of a Causative Mutation in Cadherin-23 Gene Identified in an Omani Family with Hearing Loss

Authors: Mohammed N. Al Kindi, Mazin Al Khabouri, Khalsa Al Lamki, Tommasso Pappuci, Giovani Romeo, Nadia Al Wardy

Abstract:

Hereditary hearing loss is a heterogeneous group of complex disorders, with an overall incidence of one in every five hundred newborns, presenting in syndromic and non-syndromic forms. Cadherin-related 23 (CDH23) is one of the listed deafness-causative genes. CDH23 is expressed in the stereocilia of hair cells and in retinal photoreceptor cells. Defective CDH23 has been associated mostly with prelingual severe-to-profound sensorineural hearing loss (SNHL) in either a syndromic (USH1D) or non-syndromic (DFNB12) form. An Omani family clinically diagnosed with severe-to-profound sensorineural hearing loss was genetically analysed by whole exome sequencing. A novel homozygous missense variant, c.A7451C (p.D2484A), in exon 53 of CDH23 was detected. One hundred and thirty control samples were analysed, and all were negative for the detected variant. The variant was analysed in silico for pathogenicity verification using several mutation prediction software tools and proved to be a pathogenic mutation; it is reported for the first time in Oman and worldwide. It is concluded that in silico mutation prediction analysis might be used as a useful molecular diagnostic tool benefiting both genetic counseling and mutation verification. The aspartic acid 2484 to alanine missense substitution might be the main disease-causing mutation that damages CDH23 function and could be used as a genetic hearing loss marker for this particular Omani family.
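
The study relies on dedicated mutation-prediction software; purely as a hedged illustration of one simple in silico signal (biochemical dissimilarity of the substitution), the sketch below scores the D→A exchange with the BLOSUM62 matrix via Biopython. This is a crude proxy, not a substitute for predictors that also use conservation and structural features.

```python
# A crude severity check for the p.D2484A substitution using BLOSUM62.
# This only illustrates one in-silico signal (biochemical dissimilarity);
# dedicated predictors also weigh conservation and structural context.
from Bio.Align import substitution_matrices

blosum62 = substitution_matrices.load("BLOSUM62")
ref, alt = "D", "A"                   # aspartic acid -> alanine
score = blosum62[ref, alt]
self_score = blosum62[ref, ref]
print(f"BLOSUM62 {ref}->{alt}: {score} (vs {ref}->{ref}: {self_score})")
# Negative off-diagonal scores flag rarely tolerated substitutions,
# consistent with a potentially damaging change at a conserved site.
```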

Keywords: Cdh23, d2484a, in silico, Oman

Procedia PDF Downloads 188
753 Different Data-Driven Bivariate Statistical Approaches to Landslide Susceptibility Mapping (Uzundere, Erzurum, Turkey)

Authors: Azimollah Aleshzadeh, Enver Vural Yavuz

Abstract:

The main goal of this study is to produce landslide susceptibility maps using different data-driven bivariate statistical approaches, namely the entropy weight method (EWM), evidence belief function (EBF), and information content model (ICM), for Uzundere county, Erzurum province, in the north-eastern part of Turkey. Past landslide occurrences were identified and mapped from an interpretation of high-resolution satellite images and earlier reports, as well as by carrying out field surveys. In total, 42 landslide incidence polygons were mapped using ArcGIS 10.4.1 software and randomly split into a construction dataset of 70% (30 landslide incidences) for building the EWM, EBF, and ICM models, while the remaining 30% (12 landslide incidences) were used for verification purposes. Twelve layers of landslide-predisposing parameters were prepared, including total surface radiation, maximum relief, soil groups, standard curvature, distance to stream/river sites, distance to the road network, surface roughness, land use pattern, engineering geological rock group, topographical elevation, orientation of slope, and terrain slope gradient. The relationships between the landslide-predisposing parameters and the landslide inventory map were determined using the different statistical models (EWM, EBF, and ICM). The model results were validated with landslide incidences that were not used during model construction. In addition, receiver operating characteristic curves were applied, and the area under the curve (AUC) was determined for the different susceptibility maps using the success (construction data) and prediction (verification data) rate curves. The results revealed that the AUCs for the success rates are 0.7055, 0.7221, and 0.7368, while those for the prediction rates are 0.6811, 0.6997, and 0.7105 for the EWM, EBF, and ICM models, respectively. Consequently, the landslide susceptibility maps were classified into five susceptibility classes: very low, low, moderate, high, and very high. Additionally, the proportions of construction and verification landslide incidences falling in the high and very high susceptibility classes of each map were determined. The results showed that the EWM, EBF, and ICM models all produced satisfactory accuracy. The obtained landslide susceptibility maps may be useful for future natural hazard mitigation studies and for environmental protection planning.
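
As a hedged sketch of the bivariate logic shared by these models, the snippet below computes the information value of each class of a single predisposing parameter, one common formulation of the information content model; the pixel counts are illustrative, and a real workflow would repeat this per class across all twelve GIS layers.

```python
import numpy as np

# Illustrative pixel counts per class of one predisposing parameter
# (e.g. slope gradient classes); real counts come from the GIS layers
class_pixels = np.array([120_000, 80_000, 40_000, 10_000])   # N_i
landslide_pixels = np.array([200, 350, 420, 180])            # S_i

S, N = landslide_pixels.sum(), class_pixels.sum()
# Information value of class i: ln( (S_i/N_i) / (S/N) );
# positive values mean the class is more landslide-prone than average
iv = np.log((landslide_pixels / class_pixels) / (S / N))
for k, v in enumerate(iv, 1):
    print(f"class {k}: IV = {v:+.3f}")
# A susceptibility index for a map cell is the sum of the IVs of the
# classes it falls into across all parameter layers.
```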

Keywords: entropy weight method, evidence belief function, information content model, landslide susceptibility mapping

Procedia PDF Downloads 105
752 Forensic Methods Used for the Verification of the Authenticity of Prints

Authors: Olivia Rybak-Karkosz

Abstract:

This paper aims to present the results of scientific research on methods of forging art prints and their elements, such as signatures or provenance, and on the forensic science methods that might be used to verify their authenticity. In recent decades, the art market has observed significant interest in purchasing prints. They are considered an economical alternative to paintings and a considerable investment. However, the authenticity of an art print is difficult to establish, as similar visual effects might be achieved with drawings or xerox copies. The latter are easy to make using a home printer and are then offered at flea markets or internet auctions as genuine prints. This probable ease of forgery and, at the same time, the difficulty of distinguishing art print techniques were the main reasons why this research was undertaken. The lack of scientific methods dedicated to disclosing such forgery encouraged the author to verify the possibility of using forensic science methods known and used in other fields of expertise. The research methodology consisted of compiling representative forgery samples collected in selected museums based in Poland and a few in Germany and Austria. That allowed the author to present a typology of methods used to forge art prints. Given that among the most famous examples of graphic design are banknotes and securities, it seems only appropriate to propose using methods of detecting counterfeit currency in print verification. These methods include the examination of ink, paper, and watermarks. On prints, signatures and imprints of stamps, etc., are additionally forged, so the examination should be complemented with handwriting examination and forensic sphragistics. The paper contains a stipulation to conduct a complex analysis of authenticity with the participation of an art restorer, an art historian, and a forensic expert as head of the team.

Keywords: art forgery, examination of an artwork, handwriting analysis, prints

Procedia PDF Downloads 99
751 The Application of Gel Dosimeters and Comparison with Other Dosimeters in Radiotherapy: A Literature Review

Authors: Sujan Mahamud

Abstract:

Purpose: A major challenge in radiotherapy treatment is to deliver a precise dose of radiation to the tumor with a minimum dose to the healthy normal tissues. Recently, gel dosimetry has emerged as a powerful tool to measure three-dimensional (3D) dose distributions for complex delivery verification and quality assurance. These dosimeters act both as a phantom and as a detector, thus confirming the versatility of the dosimetry technique. The aim of the study is to review the application of gel dosimeters in radiotherapy and to compare them with one- and two-dimensional dosimeters. Methods and Materials: The study is carried out from the gel dosimeter literature. Secondary data and images have been collected from different sources, such as guidelines, books, and the internet. Result: Analyzing, verifying, and comparing data from the treatment planning system (TPS) shows that the gel dosimeter is an excellent and powerful tool for measuring three-dimensional (3D) dose distributions. The TPS-calculated data were in very good agreement with the dose distribution measured by the ferrous gel. The overall uncertainty in the ferrous-gel dose determination was considerably reduced using an optimized MRI acquisition protocol and a new MRI scanner. The method developed for comparing measured gel data with calculated treatment plans, the gel dosimetry method, was proven to be useful for radiation treatment planning verification. For the 1D and 2D film, the depth-dose and lateral RMSDs are 1.8% and 2%, and the max(Di-Dj) values are 2.5% and 8%. On the other hand, for the 2D+(3D) film-gel and plan-gel comparisons, RMSDstruct and RMSDstoch are 2.3% & 3.6% and 1% & 1%, and the systematic deviations are -0.6% and 2.5%. The study finds that the 2D+(3D) dosimetry performs better than the 1D and 2D dosimeters. Discussion: Gel dosimeters are a quality control and quality assurance tool that will find future clinical application.
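
Since the comparison above is reported through RMSD figures of merit, here is a minimal sketch of that computation between a measured (gel) and a TPS-calculated dose plane; the arrays are synthetic stand-ins, not data from the reviewed studies.

```python
import numpy as np

def rmsd(measured, calculated):
    """Root-mean-square deviation between a measured dose distribution
    (e.g. a gel readout) and the TPS calculation, as % of the max dose."""
    m, c = np.asarray(measured), np.asarray(calculated)
    return 100.0 * np.sqrt(np.mean((m - c) ** 2)) / c.max()

# Illustrative 2D dose planes (Gy); real data would be a gel readout vs TPS
rng = np.random.default_rng(7)
tps = np.fromfunction(
    lambda i, j: 2.0 * np.exp(-((i - 32)**2 + (j - 32)**2) / 400.0), (64, 64))
gel = tps * (1 + rng.normal(0, 0.01, tps.shape))  # 1% stochastic noise
print(f"RMSD = {rmsd(gel, tps):.2f}% of max dose")
```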

Keywords: gel dosimeters, phantom, rmsd, QC, detector

Procedia PDF Downloads 127
750 Responsibility to Protect in Practice: Libya and Syria

Authors: Guram Esakia, Giorgi Goguadze

Abstract:

The following paper is written to provide an overview of the concept of R2P, a new dimension in the field of International Relations. The paper contains a general description of the concept, with its advantages and disadvantages. We also compare R2P with “humanitarian intervention”, trying to draw a clear division between these two approaches to conflict resolution. R2P in real action is also discussed: successful in Libya and, as yet, failed in Syria. The essay does not claim to be part of a scientific chain and is based only on personal, subjective views as well as on information gathered from various scholars and UN resolutions.

Keywords: the concept of R2P, humanitarian intervention, Libya, Syria

Procedia PDF Downloads 256
749 Establishment of a Classifier Model for Early Prediction of Acute Delirium in Adult Intensive Care Unit Using Machine Learning

Authors: Pei Yi Lin

Abstract:

Objective: The objective of this study is to use machine learning methods to build an early prediction classifier model for acute delirium in order to improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients; once it occurs, it tends to prolong the length of hospital stay and to increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days, and the mortality rate was about 2.22% over the past three years. We therefore aim to build a delirium prediction classifier through big data analysis and machine learning methods to detect delirium early. Method: This is a retrospective study using an artificial intelligence big data database to extract the characteristic factors related to delirium in intensive care unit patients and to let the machine learn from them. The study included patients aged over 20 years who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding those with a GCS assessment <4 points, an ICU stay of less than 24 hours, or no CAM-ICU evaluation. The CAM-ICU delirium assessment results recorded every 8 hours within 30 days of hospitalization are regarded as events, and the cumulative data from ICU admission up to the prediction time point are extracted to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case records were collected, and 12 features were selected to train the model, including age, sex, hours of ICU stay, visual and auditory abnormalities, RASS assessment score, APACHE-II score, number of indwelling invasive catheters, restraint, and sedative and hypnotic drugs. After feature data cleaning and processing and KNN interpolation, a total of 54,595 case events were extracted for machine learning model analysis. The events from May 1 to November 30, 2022, were used as the model training data, of which 80% formed the training set and 20% the validation set for internal verification; the ICU events from December 1 to December 31, 2022, formed the external verification set. Finally, model inference and performance evaluation were performed, and the model was retrained after adjusting the model parameters. Results: XGBoost, Random Forest, Logistic Regression, and Decision Tree models were analyzed and compared. The average accuracy of internal verification was highest for Random Forest (AUC=0.86); the average accuracy of external verification was highest for Random Forest and XGBoost (both AUC=0.86); and the average accuracy of cross-validation was highest for Random Forest (ACC=0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but machine learning classification methods to assist in the real-time assessment of ICU patients are lacking, so clinical staff cannot be provided with more objective and continuous monitoring data to help them identify and predict the occurrence of delirium more accurately. It is hoped that the development of predictive models through machine learning can predict delirium early and immediately, support clinical decisions at the best time, and, in cooperation with PADIS delirium care measures, provide individualized non-drug interventional care to maintain patient safety and thereby improve the quality of care.
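
A minimal sketch of the training and internal-validation pattern described above (KNN imputation, an 80/20 split, a Random Forest classifier, and AUC scoring), using synthetic stand-ins for the 12 clinical features; none of this is the study's actual pipeline or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import KNNImputer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Synthetic stand-ins for the 12 clinical features (age, RASS, APACHE-II, ...)
X = rng.normal(size=(n, 12))
y = (0.8 * X[:, 2] + rng.normal(size=n) > 1).astype(int)  # delirium in next 8 h
X[rng.random(X.shape) < 0.05] = np.nan                    # simulated missingness

X = KNNImputer(n_neighbors=5).fit_transform(X)            # KNN interpolation step
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=0)                  # 80/20 split

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
auc = roc_auc_score(y_valid, clf.predict_proba(X_valid)[:, 1])
print(f"internal validation AUC: {auc:.2f}")
```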

Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model

Procedia PDF Downloads 38
748 Habermas: A Unity of the Law and Democracy

Authors: Qi Jing

Abstract:

This paper examines and defends Jürgen Habermas's claim that law is the other side of democracy. For Habermas, law and democracy are related through the mediation of communicative rationality and discourse ethics. These ground a procedural conception of democracy, which legitimizes and rationalizes legal codes in a robust public sphere, linking the exercise of democratic political power to the form of law. The strengths of Habermas's approach lie, it will be claimed, in its overcoming of relativism, its combination of democratically enacted law with post-conventional morality, and its correction of the one-sided emphasis on private and public autonomy in Kant and Rousseau, respectively.

Keywords: habermas, law, democracy, reason, public sphere

Procedia PDF Downloads 45
747 The Validation of RadCalc for Clinical Use: An Independent Monitor Unit Verification Software

Authors: Junior Akunzi

Abstract:

In the matter of patient treatment planning quality assurance in 3D conformal radiotherapy (3D-CRT) and volumetric modulated arc therapy (VMAT or RapidArc), the independent monitor unit verification calculation (MUVC) is an indispensable part of the process. For 3D-CRT treatment planning, the MUVC can be performed manually by applying the standard ESTRO formalism. However, due to the complex shapes and the number of beams in advanced treatment planning techniques such as RapidArc, manual independent MUVC is inadequate. Therefore, commercially available software such as RadCalc can be used to perform the MUVC for complex treatment plans. Indeed, RadCalc (version 6.3, LifeLine Inc.) uses a simplified Clarkson algorithm to compute the dose contribution of individual RapidArc fields to the isocenter. The purpose of this project is the validation of RadCalc for 3D-CRT and RapidArc treatment planning dosimetry quality assurance at the Antoine Lacassagne centre (Nice, France). Firstly, the interfaces between RadCalc and our treatment planning systems (TPSs), Isogray (version 4.2) and Eclipse (version 13.6), were checked for data transfer accuracy. Secondly, we created test plans in both Isogray and Eclipse featuring open fields, wedged fields, and irregular MLC fields. These test plans were transferred from the TPSs, following the DICOM RT radiotherapy protocol, to RadCalc and to the linac via Mosaiq (version 2.5). Measurements were performed in a water phantom using a PTW cylindrical semiflex ionisation chamber (0.3 cm³, 31010) and compared with the TPS and RadCalc calculations. Finally, 30 3D-CRT plans and 40 RapidArc plans created on patient CT scans were recalculated using the CT scan of a solid PMMA water-equivalent phantom for 3D-CRT and the Octavius II phantom (PTW) CT scan for RapidArc. Next, we measured the doses delivered to these phantoms for each plan with a 0.3 cm³ PTW 31010 cylindrical semiflex ionisation chamber (3D-CRT) and a 0.015 cm³ PTW PinPoint ionisation chamber (RapidArc). For our test plans, good agreement was found between calculation (RadCalc and TPSs) and measurement (mean: 1.3%; standard deviation: ±0.8%). Regarding the patient plans, the measured doses were compared to the RadCalc and TPS calculations, and the RadCalc calculations were additionally compared to the Isogray and Eclipse ones. Agreement better than (2.8%; ±1.2%) was found between RadCalc and the TPSs. As for the comparison between calculation and measurement, the agreement for all of our plans was better than (2.3%; ±1.1%). The independent MU verification calculation software RadCalc has thus been validated for clinical use for both 3D-CRT and RapidArc techniques. The perspectives of this project include the validation of RadCalc for the Tomotherapy machine installed at the centre Antoine Lacassagne.
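
For the simple 3D-CRT case, a manual point-dose MU check in the spirit of the standard ESTRO formalism can be sketched as below; the factor names follow common convention (reference output, collimator and phantom scatter, percentage depth dose), and all numerical values are illustrative assumptions, not clinical beam data or RadCalc's algorithm.

```python
def monitor_units(dose_gy, d_ref=0.01, s_c=1.0, s_p=1.0,
                  pdd_percent=100.0, wedge_factor=1.0):
    """Simple open-field point-dose MU check (ESTRO-style formalism sketch):
    MU = D / (D_ref * Sc * Sp * PDD/100 * WF), with D_ref the reference
    output in Gy/MU under calibration conditions. All factors illustrative."""
    return dose_gy / (d_ref * s_c * s_p * (pdd_percent / 100.0)
                      * wedge_factor)

# 2 Gy prescribed at 10 cm depth, 10x10 cm open field (illustrative factors)
mu = monitor_units(2.0, d_ref=0.01, s_c=1.0, s_p=1.0, pdd_percent=67.0)
print(f"independent MU check: {mu:.0f} MU")  # ~299 MU for these assumptions
```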

Keywords: 3D conformal radiotherapy, intensity modulated radiotherapy, monitor unit calculation, dosimetry quality assurance

Procedia PDF Downloads 188
746 Diagnostics of Existing Steel Structures of Winter Sport Halls

Authors: Marcela Karmazínová, Jindrich Melcher, Lubomír Vítek, Petr Cikrle

Abstract:

The paper deals with the diagnostics of the steel roof structure of a winter sports stadium built in 1970. The necessity of the diagnostics arises from the requirement for an evaluation design of this structure, prompted by the new loading situation created by the entry into force of the European Standards in the Czech Republic in 2010. Due to these changes in the normative rules, existing structures are in practice gradually subjected to evaluation design and, depending on its results, to strengthening or reconstruction. The steel roof is composed of plane truss main girders, purlins, and bracings, and the roof structure is supported by two arched main girders with a span of L = 84 m. The in situ diagnostics of the roof structure were oriented to the following tasks: (i) determination and evaluation of the actual material properties of the steel used and (ii) verification of the actual dimensions of the structural members. Non-destructive methods have been used for the in situ measurements: for the indicative determination of steel strength, a modified method based on the determination of Rockwell hardness; and for the verification of member dimensions (thickness of hollow sections), the ultrasound method. This paper presents the results obtained using these testing methods and their evaluation from the viewpoint of their usage in the subsequent static assessment and design evaluation of the existing structure. For comparison, examples of similar evaluations realized for the steel structures of the stadiums in the cities of Olomouc and Jihlava are briefly illustrated, too.

Keywords: actual dimensions, destructive methods, diagnostics, existing steel structure, indirect non-destructive methods, Rockwell hardness, sport hall, steel strength, ultrasound method

Procedia PDF Downloads 316
743 A Review of the Accuracy of Optical Surface Imaging Systems for Setup Verification During Breast Radiotherapy Treatment

Authors: Auwal Abubakar, Ahmed Ahidjo, Shazril Imran Shaukat, Noor Khairiah A. Karim, Gokula Kumar Appalanaido, Hafiz Mohd Zin

Abstract:

Background: The use of optical surface imaging systems (OSISs) is becoming increasingly popular in radiotherapy practice, especially during breast cancer treatment. This study reviews the accuracy of the available commercial OSISs for breast radiotherapy. Method: A literature search was conducted to identify the available commercial OSISs from different manufacturers that are integrated into radiotherapy practice for setup verification during breast radiotherapy. Studies that evaluated the accuracy of the OSISs during breast radiotherapy using cone beam computed tomography (CBCT) as a reference were retrieved and analyzed. The physics and working principles of the systems from each manufacturer are discussed, together with their respective strengths and limitations. Results: A total of five (5) different commercially available OSISs from four (4) manufacturers were identified, each with a different working principle. Six (6) studies were found that evaluated the accuracy of the systems during breast radiotherapy in conjunction with CBCT as a gold standard. The studies revealed that the accuracy of the systems, in terms of mean difference, ranges from 0.1 to 2.1 mm. The correlation between CBCT and OSIS ranges between 0.4 and 0.9. The limits of agreement obtained using Bland-Altman analysis in the studies were also within an acceptable range. Conclusion: OSISs have an acceptable level of accuracy and can be used safely during breast radiotherapy. The systems are non-invasive, free of ionizing radiation, and provide real-time imaging of the target surface at no extra concomitant imaging dose. However, they should only be used to complement, rather than replace, x-ray-based image guidance techniques such as CBCT.
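
Since several of the reviewed studies report Bland-Altman limits of agreement, here is a minimal sketch of that analysis for paired OSIS and CBCT setup shifts; the paired data are synthetic stand-ins, not values from the reviewed studies.

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman analysis of paired measurements: mean bias and the
    95% limits of agreement (bias +/- 1.96 * SD of the differences)."""
    diff = np.asarray(a) - np.asarray(b)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative paired setup shifts (mm): OSIS vs CBCT for the same fractions
rng = np.random.default_rng(3)
cbct = rng.normal(0.0, 2.0, 60)
osis = cbct + rng.normal(0.5, 0.8, 60)  # assumed 0.5 mm bias, 0.8 mm spread

bias, (lo, hi) = bland_altman(osis, cbct)
print(f"bias = {bias:.2f} mm, 95% LoA = [{lo:.2f}, {hi:.2f}] mm")
```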

Keywords: optical surface imaging system, cone beam computed tomography (CBCT), surface guided radiotherapy, breast radiotherapy

Procedia PDF Downloads 30
744 Real-World Comparison of Adherence to and Persistence with Dulaglutide and Liraglutide in UAE e-Claims Database

Authors: Ibrahim Turfanda, Soniya Rai, Karan Vadher

Abstract:

Objectives— The study aimed to compare real-world adherence to and persistence with dulaglutide and liraglutide in patients with type 2 diabetes (T2D) initiating treatment in the UAE. Methods— This was a retrospective, non-interventional study (observation period: 01 March 2017–31 August 2019) using the UAE Dubai e-Claims database. Included were adult patients initiating dulaglutide/liraglutide between 01 September 2017 and 31 August 2018 (index period) with: ≥1 claim for T2D in the 6 months before the index date (ID); ≥1 claim for dulaglutide/liraglutide during the index period; and continuous medical enrolment for ≥6 months before and ≥12 months after the ID. Key endpoints, assessed 3/6/12 months after the ID, were adherence to treatment (proportion of days covered [PDC; PDC ≥80% considered ‘adherent’], per-group mean±standard deviation [SD] PDC) and persistence (number of continuous therapy days from the ID until discontinuation [i.e., a gap of >45 days] or the end of the observation period). Patients initiating dulaglutide/liraglutide were propensity score matched (1:1) based on baseline characteristics. The between-group comparison of adherence was analysed using the McNemar test (α=0.025). Persistence was analysed using Kaplan–Meier estimates with log-rank tests (α=0.025) for between-group comparisons. This study presents 12-month outcomes. Results— Following propensity score matching, 263 patients were included in each group. Mean±SD PDC at 12 months was significantly higher in the dulaglutide group than in the liraglutide group (dulaglutide=0.48±0.30, liraglutide=0.39±0.28, p=0.0002). The proportion of adherent patients favoured dulaglutide (dulaglutide=20.2%, liraglutide=12.9%, p=0.0302), as did the probability of being adherent to treatment (odds ratio [97.5% CI]: 1.70 [0.99, 2.91]; p=0.03). The proportion of persistent patients also favoured dulaglutide (dulaglutide=15.2%, liraglutide=9.1%, p=0.0528), as did the probability of discontinuing treatment within 12 months after the ID (p=0.027). Conclusions— Based on the UAE Dubai e-Claims database, dulaglutide initiators exhibited significantly greater adherence in terms of mean PDC than liraglutide initiators. The proportion of adherent patients and the probability of being adherent favoured the dulaglutide group, as did treatment persistence.
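
The PDC and gap-based persistence endpoints defined above translate directly into code. The following minimal Python sketch computes both from a list of (fill date, days of supply) pairs; the claims layout and the 30-day supplies in the example are assumptions for illustration, not details from the study.

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, period_start, period_end):
    """PDC = distinct days with drug on hand / days in the observation period."""
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = fill_date + timedelta(days=offset)
            if period_start <= day <= period_end:
                covered.add(day)
    period_days = (period_end - period_start).days + 1
    return len(covered) / period_days

def persistence_days(fills, gap_limit=45):
    """Continuous therapy days from the first fill until a gap of > gap_limit days."""
    fills = sorted(fills)
    start, run_end = fills[0][0], fills[0][0] + timedelta(days=fills[0][1])
    for fill_date, days_supply in fills[1:]:
        if (fill_date - run_end).days > gap_limit:
            break  # discontinuation: gap exceeds the allowed limit
        run_end = max(run_end, fill_date + timedelta(days=days_supply))
    return (run_end - start).days

# Hypothetical 30-day fills with a >45-day gap in spring
fills = [(date(2018, 1, 1), 30), (date(2018, 2, 1), 30), (date(2018, 5, 1), 30)]
print(proportion_of_days_covered(fills, date(2018, 1, 1), date(2018, 12, 31)))
print(persistence_days(fills))  # therapy ends at the gap after the February fill
```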

Keywords: adherence, dulaglutide, effectiveness, liraglutide, persistence

Procedia PDF Downloads 86
743 Optimisation of Metrological Inspection of a Developmental Aeroengine Disc

Authors: Suneel Kumar, Nanda Kumar J., Sreelal Sreedhar, Suchibrata Sen, V. Muralidharan

Abstract:

Fan technology is critical and crucial for any aero engine technology, and the fan disc forms a critical part of the fan module. It is an airworthiness requirement to have a metrologically qualified disc. The current study uses tactile probing and scanning on an articulated measuring machine (AMM), a bridge-type coordinate measuring machine (CMM), and metrology software for intermediate and final dimensional and geometrical verification during the prototype development of the disc, which is manufactured through forging and machining. The circumferential dovetails, manufactured by milling, are evaluated based on the analysed metrological process. Metrological optimisation requires a change of philosophy: quality measurements must be made available as quickly as possible to improve process knowledge and accelerate the process, while remaining accurate, precise, and traceable. Offline CMM programming for inspection and the optimisation of the CMM inspection plan are crucial parts of the study and are discussed. A dimensional measurement plan per the ASME B89.7.2 standard is an important requirement for reaching an optimised CMM measurement plan and strategy. The effects of the probing strategy, stylus configuration, and approximation strategy on the circumferential dovetail measurements of the developmental prototype disc are discussed. The results are presented in the form of enhanced R&R (repeatability and reproducibility) values, with uncertainty levels within the desired limits. The findings from the measurement strategy adopted for disc dovetail evaluation and inspection-time optimisation are discussed with the help of various analyses and graphical outputs obtained from the verification process.
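
For orientation, the sketch below shows one deliberately simplified way to split measurement variation into repeatability (equipment) and reproducibility (appraiser) components. It is a variance-components illustration only; a formal study such as the one described would use the full ANOVA method of the AIAG MSA manual, and the readings here are invented.

```python
import statistics

# measurements[operator][part] -> list of repeated readings (mm)
def gauge_rr(measurements):
    # Repeatability (equipment variation): pooled within-cell variance
    within = [statistics.variance(reads)
              for parts in measurements.values()
              for reads in parts.values()]
    ev = statistics.mean(within) ** 0.5
    # Reproducibility (appraiser variation): spread of per-operator means
    op_means = [statistics.mean(r for reads in parts.values() for r in reads)
                for parts in measurements.values()]
    av = statistics.pstdev(op_means)
    return ev, av, (ev**2 + av**2) ** 0.5

data = {
    "op1": {"p1": [12.01, 12.02, 12.00], "p2": [12.11, 12.10, 12.12]},
    "op2": {"p1": [12.03, 12.02, 12.04], "p2": [12.12, 12.14, 12.13]},
}
ev, av, grr = gauge_rr(data)
print(f"EV={ev:.4f} mm, AV={av:.4f} mm, GRR={grr:.4f} mm")
```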

Keywords: coordinate measuring machine, CMM, aero engine, articulated measuring machine, fan disc

Procedia PDF Downloads 85
742 Design of Reinforced Concrete with Eurocode 2

Authors: Carla Maria Costa Ferreira, Maria Helena Freitas Melao Barros

Abstract:

The rules implemented in Europe regarding structural design are termed the Structural Eurocodes and deal with the several materials available for construction. For concrete with steel reinforcement, widely used in Europe, the relevant code is Eurocode 2 – Design of Concrete Structures, usually known as EC2. The need for tables and abacuses to help in the design of reinforced concrete arose because the evolution and study of new procedures and higher-strength concretes showed that the previous tables needed to be improved. Reinforced concrete structures have particular design aspects that stem from the nonlinear behaviour of concrete and steel and, in the case of concrete, from its very low tensile strength. The design of reinforced concrete structures consists of evaluating the ultimate strength and the behaviour under service conditions. In fact, the use of higher-strength concrete and steel classes has shown that the serviceability design, traditionally important for prestressed structures, may also be relevant in reinforced concrete structures. For these aspects, there are tables and design charts for the ultimate limit design of reinforced concrete sections under bending moments and axial forces, as well as auxiliary design diagrams for evaluating the stress of the steel and the concrete at a section and the ductility for serviceability limit state verification. For practical use, tables and design charts are presented here for the ultimate limit design of reinforced concrete sections, together with auxiliary interaction diagrams for the verification of serviceability conditions. Such design aids were the only tools available to engineers before the development of computers and remain, nowadays, an important tool for students in universities. Reinforced concrete design usually requires obtaining the area of longitudinal steel reinforcement to be placed in the structure. The quantity and position of the steel area may have different solutions, and these tables and abacuses permit many possibilities to be obtained in order to optimise the solution in economic or ductility terms.
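
As a pointer to the kind of calculation the tables and abacuses condense, the sketch below computes the required tension steel for a singly reinforced rectangular section in ULS bending, assuming the common simplified rectangular stress block. The 3.53 factor and the K ≤ 0.167 limit follow the usual textbook derivation (no moment redistribution); this is a minimal sketch, not the charts presented in the paper.

```python
from math import sqrt

def ec2_bending_As(M_Ed, b, d, f_ck, f_yk, gamma_s=1.15):
    """Tension steel for a singly reinforced rectangular section (ULS bending).
    M_Ed in N*mm, b and d in mm, strengths in MPa."""
    K = M_Ed / (b * d**2 * f_ck)          # normalised moment
    if K > 0.167:                          # common limit without redistribution
        raise ValueError("compression reinforcement needed")
    z = min(0.5 * d * (1 + sqrt(1 - 3.53 * K)), 0.95 * d)  # lever arm
    f_yd = f_yk / gamma_s                  # design steel strength
    return M_Ed / (f_yd * z)               # required steel area in mm^2

# e.g. M_Ed = 250 kN*m, b = 300 mm, d = 500 mm, C30/37 concrete, B500 steel
print(f"As approx. {ec2_bending_As(250e6, 300, 500, 30, 500):.0f} mm^2")
```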

Keywords: design examples, Eurocode 2, reinforced concrete, section design

Procedia PDF Downloads 39
741 Authentication and Legal Admissibility of 'Computer Evidence from Electronic Voting Machines' in Electoral Litigation: A Qualitative Legal Analysis of Judicial Opinions of Appellate Courts in the USA

Authors: Felix O. Omosele

Abstract:

Several studies have established that electronic voting machines are prone to multi-faceted challenges, one of which is their capacity to lose votes after the ballots have been cast. The international consensus therefore appears to favour the use of electronic voting machines accompanied by a voter-verified paper audit trail (VVPAT). At present, no known study has evaluated the impact (or otherwise) of this verification and auditing on the authentication, admissibility, and evidential weight of electronically obtained electoral data. This legal inquiry is important, as elections are sometimes won or lost in court on the basis of such data. The present research work fills this gap. Using the United States of America as a case study, this paper employs a qualitative legal analysis of judicial opinions from several of its appellate courts. The analysis also unearths the statutory rules and regulations relevant to the research problem. The objective of the research is to highlight the role played by VVPAT in electoral evidence, as seen through the eyes of the courts. The preliminary outcome of this qualitative analysis shows that the admissibility of, and weight attached to, ‘computer evidence from e-voting machines (CEEM)’ are often assessed under the general standards applied to other computer-stored evidence. These standards sometimes fail to address the peculiar challenges posed by CEEM, particularly with respect to tabulation and transmission. This paper therefore argues that CEEM should be accorded unique consideration by the courts and proposes the development of a legal standard that recognises verification and auditing as ‘weight enhancers’ for electronically obtained electoral data.

Keywords: admissibility of computer evidence, electronic voting, qualitative legal analysis, voting machines in the USA

Procedia PDF Downloads 168
740 Verification of Low-Dose Diagnostic X-Ray as a Tool for Relating Vital Internal Organ Structures to External Body Armour Coverage

Authors: Natalie A. Sterk, Bernard van Vuuren, Petrie Marais, Bongani Mthombeni

Abstract:

Injuries to the internal structures of the thorax and abdomen remain a leading cause of death among soldiers. Body armour is a standard-issue piece of military equipment designed to protect the vital organs against ballistic and stab threats. When configured for maximum protection, the excessive weight and size of the armour may limit soldier mobility and increase physical fatigue and discomfort. Providing soldiers with more armour than necessary may, therefore, hinder their ability to react rapidly in life-threatening situations. The capability to determine the optimal trade-off between the amount of essential anatomical coverage and the hindrance to soldier performance may significantly enhance the design of armour systems. The current study aimed to develop and pilot a methodology for relating internal anatomical structures to actual armour plate coverage in real time using low-dose diagnostic X-ray scanning. Several pilot scanning sessions were held at the Lodox Systems (Pty) Ltd head office in South Africa. Testing involved using the Lodox eXero-dr to scan dummy trunk rigs at various measurement angles and heights, as well as human participants wearing correctly fitted body armour positioned in supine, prone shooting, seated, and kneeling shooting postures. Sizing and metrics obtained from the Lodox eXero-dr were then confirmed against a verification board with known dimensions. Results indicated that the low-dose diagnostic X-ray has the capability to clearly identify the vital internal structures of the aortic arch, heart, and lungs in relation to the position of the external armour plates. Further testing is still required to fully and accurately identify the inferior liver boundary, inferior vena cava, and spleen. The scans produced in the supine, prone, and seated postures provided superior image quality compared with the kneeling posture. The distances from the X-ray source and detector to the object must be standardised to control for possible magnification changes and to allow comparison. To account for this, specific scanning heights and angles were identified to allow for parallel scanning of the relevant areas. The low-dose diagnostic X-ray provides a non-invasive, safe, and rapid technique for relating vital internal structures to external structures. This capability can be used for the re-evaluation of the anatomical coverage required for essential protection while optimising armour design and fit for soldier performance.
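
The magnification issue flagged above follows from simple projection geometry: with a point source, a structure between source and detector is magnified by the ratio of the source-detector to source-object distance. The sketch below illustrates the correction under that point-projection assumption; Lodox's slot-scanning geometry behaves differently along the scan direction, so this is an illustration of the principle, not the system's actual calibration, and the distances are invented.

```python
def true_size(projected_mm: float, source_detector_mm: float,
              source_object_mm: float) -> float:
    """Correct a projected measurement for geometric magnification.
    Point-source projection: magnification M = SDD / SOD, so the
    object's true size is the projected size divided by M."""
    magnification = source_detector_mm / source_object_mm
    return projected_mm / magnification

# e.g. a structure measuring 110 mm on the detector, with the source
# 1300 mm from the detector and 1100 mm from the anatomy of interest
print(f"{true_size(110, 1300, 1100):.1f} mm")  # approx. 93.1 mm
```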

Keywords: body armour, low-dose diagnostic X-ray, scanning, vital organ coverage

Procedia PDF Downloads 97