Search results for: cross-border insolvency provisions in the 2016 code
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3157

2647 Tuberculosis in Patients with HIV-Infection in Russia: Cohort Study over the Period of 2015-2016 Years

Authors: Marina Nosik, Irina Rymanova, Konstantin Ryzhov, Joan Yarovaya, Alexander Sobkin

Abstract:

Tuberculosis (TB) associated with HIV is one of the top causes of death worldwide. However, early detection and treatment of TB in HIV-infected individuals significantly reduce the risk of developing severe forms of TB and of mortality. The goal of the study was to analyze the peculiarities of TB associated with HIV infection. Over the period 2015-2016, a retrospective cohort study was conducted among 377 patients with TB/HIV co-infection who attended the Moscow Tuberculosis Clinic. The majority of the patients were male (64.5%). The median age was 37.9 years (range 24-62) for men and 35.4 years (range 22-72) for women. The most prevalent age group was 30-39 years for both men and women (73.3% and 54.7%, respectively). The proportion of patients aged 50-59 years and older was 3.9%. The socioeconomic status of the patients was rather low: only 2.3% had a university degree, and 76.1% were unemployed (of whom 21.7% were disabled). Most patients had disseminated pulmonary tuberculosis in the phase of infiltration/decay (41.5%). Infiltrative TB was detected in 18.9% of patients, and 20.1% had tuberculosis of the intrathoracic lymph nodes. The occurrence of MDR-TB was 16.8% and of XDR-TB 17.9%. The number of HIV-positive patients with newly diagnosed TB was n=261 (69.2%). The proportion of active TB (MbT+) among new TB/HIV cases was 44.7%. The severe clinical forms of TB and the high TB incidence rate among HIV-infected individuals, together with the large number of newly diagnosed tuberculosis cases, indicate the need for closer interaction with TB services for timely diagnosis of TB, which will optimize treatment outcomes.

Keywords: HIV, tuberculosis (TB), TB associated with HIV, multidrug-resistant TB (MDR-TB)

Procedia PDF Downloads 243
2646 Description and Evaluation of the Epidemiological Surveillance System for Meningitis in the Province of Taza Between 2016 and 2020

Authors: Bennasser Samira

Abstract:

Meningitis, especially meningococcal meningitis, is a serious public health problem. A system of vigilance and surveillance is in place to allow effective action to be taken on actual or potential health problems caused by all forms of meningitis. Objectives: 1. Describe the epidemiological surveillance system for meningitis in the province of Taza. 2. Evaluate the quality and responsiveness of the epidemiological surveillance system for meningitis in the province of Taza. 3. Propose measures to improve this system at the provincial level. Methods: This was a descriptive study with a purely quantitative approach, evaluating the quality and responsiveness of the system over the 5 years between January 2016 and December 2020. For this, we used the investigation files of meningitis cases and the provincial meningitis database. We calculated quality indicators of the surveillance system already defined by the National Program for the Prevention and Control of Meningitis. Results: Notification is passive, the completeness of the data is quite good (94%), and timeliness does not exceed 71%. The quality of the data is acceptable (91% agreement). The systematic and rapid performance of lumbar punctures increases the diagnostic capabilities of the system. Local response actions were carried out in 100% of cases. Conclusion: The improvement of this surveillance system depends on strengthening staff diagnostic skills, reviewing surveillance tools, and encouraging judicious use of the data.

Keywords: evaluation, meningitis, system, taza, morocco

Procedia PDF Downloads 160
2645 Churn Prediction for Savings Bank Customers: A Machine Learning Approach

Authors: Prashant Verma

Abstract:

Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility and digital ways of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, of the various machine learning models, Random Forest, which predicts churn with 78% accuracy, was found to be the most powerful model for this scenario. Customer vintage, customer age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks in order to reduce customer churn so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. Hence, by providing better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
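
As a minimal sketch of the kind of pipeline outlined above, the snippet below combines under-sampling with a Random Forest classifier; the column names, the input file, and the use of scikit-learn and imbalanced-learn are illustrative assumptions, not the authors' actual implementation.

```python
# Illustrative sketch only: under-sampling + Random Forest for churn prediction.
# Column names, the input file and the libraries (scikit-learn, imbalanced-learn)
# are assumptions, not the pipeline actually used in the paper.
import pandas as pd
from imblearn.under_sampling import RandomUnderSampler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("sb_customers.csv")          # hypothetical input file
features = ["vintage_months", "age", "avg_balance", "occupation_code",
            "population_code", "avg_withdrawal", "avg_txn_count"]
X, y = df[features], df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# Balance the training set: churners are usually a small minority.
X_bal, y_bal = RandomUnderSampler(random_state=42).fit_resample(X_train, y_train)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_bal, y_bal)

pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("ROC-AUC :", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```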

Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling

Procedia PDF Downloads 143
2644 Euthanasia as a Case of Judicial Entrepreneurship in India: Analyzing the Role of the Supreme Court in the Policy Process of Euthanasia

Authors: Aishwarya Pothula

Abstract:

Euthanasia in India is a politically dormant policy issue in the sense that discussions around it are sporadic in nature (usually with developments in specific cases) and it stays a dominant issue in the public domain only for a fleeting period. In other words, it is a non-political issue that has been unable to successfully get on the policy agenda. This paper studies how the Supreme Court of India (SC) plays a role in euthanasia's policy making. In 2011, the SC independently put a law in place that legalized passive euthanasia through its judgement in the Aruna Shanbaug v. Union of India case. According to this, it is no longer illegal to withhold/withdraw a patient's medical treatment in certain cases. This judgement, therefore, is the empirical focus of this paper. The paper essentially employs two techniques of discourse analysis to study the SC's system of argumentation. The two methods, text analysis using Gasper's analysis table and frame analysis, are complemented by two discourse techniques called metaphor analysis and lexical analysis. The framework within which the analysis is conducted lies in 1) the judicial process of India, i.e. the SC procedures and the Constitutional rules and provisions, and 2) John W. Kingdon's theory of policy windows and policy entrepreneurs. The results of this paper are three-fold: first, the SC dismisses the petitioner's request for passive euthanasia on inadequate and weak grounds, thereby setting no precedent for the historic law they put in place. In other words, they leave the decision open for the Parliament to act upon. Hence the judgement, contrary to what many have argued, is by no means an instance of judicial activism/overreach. Second, they define euthanasia in a way that resonates with existing broader societal themes. They combine this with a remarkable use of authoritative and protective tones/stances to settle at an intermediate position that balances the possible opposition to their role in the process and what they (perhaps) perceive to be an optimal solution. Third, they soften up the policy community (including the public) to the idea of passive euthanasia, leading it towards parliamentary legislation. They achieve this by shaping prevalent principles, provisions and worldviews through an astute use of the legal instruments at their disposal. This paper refers to this unconventional role of the SC as 'judicial entrepreneurship', which is also the first scholarly contribution towards research on euthanasia as a policy issue in India.

Keywords: argumentation analysis, Aruna Ramachandra Shanbaug, discourse analysis, euthanasia, judicial entrepreneurship, policy-making process, supreme court of India

Procedia PDF Downloads 266
2643 Managing Gender Based Violence in Nigeria: A Legal Conundrum

Authors: Foluke Dada

Abstract:

The prevalence of gender-based violence in Nigeria is of such concern and magnitude that the government has intervened by ratifying international instruments such as the Convention on the Elimination of All Forms of Discrimination against Women, the Declaration on the Elimination of Violence against Women, and the Protocol to the African Charter on Human and Peoples' Rights on the Rights of Women, and by promulgating domestic laws that seek to prevent the perpetration of gender-based violence and also protect victims from future occurrences. Nigeria principally has two legal codes creating criminal offenses and punishments for breach of those offenses: the Criminal Code Law, applying to most states in Southern Nigeria, and the Penal Code, applying to states in Northern Nigeria. Individual state laws, such as the Ekiti State and Lagos State gender-based violence laws, are also discussed. This paper addresses gender-based violence in Nigeria and exposes the inadequacies in the laws and their application. The paper postulates that there is a need for more workable public policy that strengthens the social structure fortified by the law in order to engender the necessary changes and provide the opportunity for government to embark on grassroots-based advocacy that engages victims and sensitizes them to their rights and to how they can enjoy some of the protections afforded by the laws.

Keywords: gender, violence, human rights, law and policy

Procedia PDF Downloads 611
2642 Site Specific Ground Response Estimations for the Vulnerability Assessment of the Buildings of the Third Biggest Mosque in the World, Algeria’s Mosque

Authors: S. Mohamadi, T. Boudina, A. Rouabeh, A. Seridi

Abstract:

Equivalent linear and non-linear ground response analyses are conducted at several representative sites at the mosque of Algeria to compare the free-field acceleration spectra with the local code of practice. The Spectral Analysis of Surface Waves (SASW) technique was adopted to measure the in-situ shear wave velocity profile at the representative sites. The seismic motion imposed on the rock is the NS component recorded at the Keddara station during the Boumerdes earthquake of 21 May 2003. The site-specific elastic design spectra for each site are determined in order to further obtain site-specific non-linear acceleration spectra. As a case study, the results of the site-specific evaluations are presented for two building sites (the site of the minaret and the site of the prayer hall) to demonstrate the influence of local geological conditions on ground response at Algerian sites. A comparison of the computed response with the standard code of practice currently used in Algeria for the seismic zone of Algiers indicated that the design spectrum is not able to capture site amplification due to local geological conditions.

Keywords: equivalent linear, non-linear, ground response analysis, design response spectrum

Procedia PDF Downloads 448
2641 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje

Abstract:

Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based similarity algorithms are quite inefficient and computationally intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a projection of graphs into a 2D space (Graph Codes) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. As a consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
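
The paper's exact Graph Code encoding is not given in the abstract; the toy sketch below only illustrates the underlying idea of projecting a feature graph into a 2D matrix so that similarity can be computed with array operations rather than graph traversal. The vocabulary ordering and the overlap-based similarity measure are assumptions for illustration.

```python
# Toy illustration of projecting a feature graph into a 2D "graph code" matrix
# and comparing two codes without graph traversal. This is NOT the authors'
# encoding; the vocabulary ordering and similarity measure are assumptions.
import numpy as np

def graph_code(nodes, edges, vocabulary):
    """Encode a feature graph as a |V|x|V| matrix over a fixed vocabulary:
    diagonal cells mark node presence, off-diagonal cells mark edges."""
    index = {term: i for i, term in enumerate(vocabulary)}
    m = np.zeros((len(vocabulary), len(vocabulary)), dtype=np.int8)
    for n in nodes:
        m[index[n], index[n]] = 1
    for a, b in edges:
        m[index[a], index[b]] = 1
    return m

def similarity(code_a, code_b):
    """Fraction of matching non-zero cells: a purely array-based comparison."""
    overlap = np.logical_and(code_a, code_b).sum()
    total = max(np.logical_or(code_a, code_b).sum(), 1)
    return overlap / total

vocab = ["person", "dog", "ball", "park", "holds", "near"]
g1 = graph_code(["person", "dog", "park"], [("person", "dog"), ("dog", "park")], vocab)
g2 = graph_code(["person", "ball", "park"], [("person", "ball"), ("person", "park")], vocab)
print("similarity:", similarity(g1, g2))
```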

Keywords: indexing, retrieval, multimedia, graph algorithm, graph code

Procedia PDF Downloads 161
2640 Behavior Factors Evaluation for Reinforced Concrete Structures

Authors: Muhammad Rizwan, Naveed Ahmad, Akhtar Naeem Khan

Abstract:

Seismic behavior factors are evaluated for the performance assessment of low-rise reinforced concrete (RC) frame structures, based on an experimental study involving unidirectional dynamic shake table testing of two 1/3-reduced-scale, two-storey frames: a code-conforming special moment resisting frame (SMRF) model and a non-compliant model of similar characteristics but built in low-strength concrete. The models were subjected to a scaled accelerogram record of the 1994 Northridge earthquake to deform the test models to the final collapse stage in order to obtain the structural response parameters. The fully compliant model exhibited a more stable beam-sway response, experiencing beam flexural yielding and ground-storey column base yielding upon being subjected to 100% of the record. The response modification factors (R factors) obtained for the code-compliant and deficient prototype structures were 7.5 and 4.5 respectively, which is about 10% and 40% less than the UBC-97 specified value for special moment resisting reinforced concrete frame structures.

Keywords: Northridge 1994 earthquake, reinforced concrete frame, response modification factor, shake table testing

Procedia PDF Downloads 173
2639 Collapse Capacity Assessment of Inelastic Structures under Seismic Sequences

Authors: Shahrzad Mohammadi, Ghasem Boshrouei Sharq

Abstract:

All seismic design codes are based on the determination of the design earthquake without taking the effects of aftershocks into account in design practice. In regions with a high level of seismicity, the occurrence of several aftershocks of various magnitudes and different time lags is very likely. This research aims to estimate the collapse capacity of a 10-story steel bundled-tube moment frame subjected to as-recorded seismic sequences. The studied structure is designed according to the seismic regulations of the fourth revision of the Iranian code of practice for the seismic-resistant design of buildings (Code No. 2800). A series of incremental dynamic analyses (IDA) is performed up to the collapse level of the intact structure. Then, in order to demonstrate the effects of aftershock events on the collapse vulnerability of the building, aftershock IDA analyses are carried out. To gain deeper insight, collapse fragility curves are developed and compared for both series. Also, a study on the influence of various ground motion characteristics on collapse capacity is carried out. The results highlight the importance of considering the decisive effects of aftershocks in seismic codes due to their contribution to the occurrence of collapse.

Keywords: IDA, aftershock, bundled tube frame, fragility assessment, GM characteristics, as-recorded seismic sequences

Procedia PDF Downloads 141
2638 Testing and Validation Stochastic Models in Epidemiology

Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa

Abstract:

This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.
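
The abstract's examples are written in R; the sketch below illustrates the same testing ideas in Python under assumed model details: the deterministic component is unit-tested exactly, the stochastic component is seeded and validated against its expectation with a large sample, and incorrect inputs are rejected defensively.

```python
# Analogous Python sketch of the testing ideas described above (the paper's
# examples use R). Model details are assumptions for illustration.
import numpy as np

def expected_new_infections(susceptible, infectious, beta):
    """Deterministic component: can be checked with exact unit tests."""
    if susceptible < 0 or infectious < 0 or beta < 0:
        raise ValueError("inputs must be non-negative")
    return beta * susceptible * infectious

def draw_new_infections(susceptible, infectious, beta, rng):
    """Stochastic component: Poisson draw around the deterministic mean."""
    mean = expected_new_infections(susceptible, infectious, beta)
    return min(rng.poisson(mean), susceptible)   # cannot infect more than S

# Functional test of the deterministic part (exact, up to float tolerance):
assert abs(expected_new_infections(100, 10, 0.01) - 10.0) < 1e-9

# Statistical test of the stochastic part (large sample, fixed seed):
rng = np.random.default_rng(1)
draws = [draw_new_infections(10_000, 10, 0.001, rng) for _ in range(50_000)]
assert abs(np.mean(draws) - 100) < 1.0           # sample mean close to expectation

# Defensive test of a special case (incorrect input must fail loudly):
try:
    expected_new_infections(-1, 10, 0.001)
except ValueError:
    print("negative input correctly rejected")
```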

Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions

Procedia PDF Downloads 6
2637 Impact on Underprivileged People Practising Expressive Textile Arts: An Exploratory Study Applied to Ex-Offenders in Hong Kong

Authors: Jin Lam, Joe Au

Abstract:

This study aims to investigate the impact of practicing expressive textile arts on underprivileged people, namely ex-offenders, after they attended three months of textile arts and fashion creativity workshops in a service-learning subject offered by the Hong Kong Polytechnic University in May 2016. In this service-learning subject, the subject lecturers, students and ex-offenders co-designed various expressive textile artworks together. During the creative process, the ex-offenders could enhance their self-confidence and rebuild a satisfactory identity through practicing expressive textile arts and fashion creativity. Ten textile arts prototypes in the format of fashion garments were presented in a mini fashion show and an exhibition, both at the Hong Kong Polytechnic University in July 2016. A quantitative research method was adopted, and a questionnaire survey was conducted in this study. The research findings suggest that positive impacts were found on the ex-offenders' perceptions of 'feelings and thoughts before attending the workshops', 'feelings and thoughts during the workshops', 'attitude toward the textile arts materials', and 'attitude toward the expressive textile artworks'.

Keywords: creativity, design, expressive textile arts, fashion, underprivileged people

Procedia PDF Downloads 388
2636 The Social Aspects of Code-Switching in Online Interaction: The Case of Saudi Bilinguals

Authors: Shirin Alabdulqader

Abstract:

This research aims to investigate the concept of code-switching (CS) between English and Arabic and the CS practices of Saudi online users through a translanguaging (TL) lens, for a more inclusive view of the nature of the data from the study. It employs Digitally Mediated Communication (DMC), specifically the WhatsApp and Twitter platforms, in order to understand how users employ online resources to communicate with others on a daily basis. This project looks beyond language and considers the multimodal affordances (visual and audio means) that interlocutors utilise in their online communicative practices to shape their online social existence. This exploratory study is based on a data-driven interpretivist epistemology, as it aims to understand how meaning (reality) is created by individuals within different contexts. The project used a mixed-methods approach, combining qualitative and quantitative approaches. In the former, data were collected from online chats and interview responses, while in the latter a questionnaire was employed to understand the frequency of, and relations between, the participants' linguistic and non-linguistic practices and their social behaviours. The participants were eight bilingual Saudi nationals (both men and women, aged between 20 and 50 years old) who interacted with others online. These participants provided their online interactions, participated in an interview and responded to a questionnaire. The study data were gathered from 194 WhatsApp chats and 122 tweets. These data were analysed and interpreted at three levels: conversational turn-taking and CS; the linguistic description of the data; and CS and persona. This project contributes to the emerging field of analysing online Arabic data systematically and to the fields of multimodality and bilingual sociolinguistics. The findings are reported for each of the three levels. For conversational turn-taking, the CS analysis revealed that CS was used to accomplish negotiation and develop meaning in the conversation. With regard to the linguistic practices in the CS data, the majority of the code-switched words were content morphemes. The third level of data interpretation concerns CS and its relationship with identity; two types of identity were indexed: absolute identity and contextual identity. This study contributes to the DMC literature and bridges some of the existing gaps. Most of the findings, if not all, support the notion of TL that multiliteracy is one's ability to decode multimodal communication and that this multimodality contributes to meaning. Whether this applies to the online affordances used by monolinguals or multilinguals, and whether it is perceived not only by specific generations but by any online multiliterates, the study provides the linguistic features of CS utilised by Saudi bilinguals and determines the relationship between these features and the contexts in which they appear.

Keywords: social media, code-switching, translanguaging, online interaction, saudi bilinguals

Procedia PDF Downloads 131
2635 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today's heterogeneous network environment, there is a growing demand for mutually distrusting clients to jointly operate secure networks in order to prevent malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk is always present, no matter what solutions are implemented or whatever security methodology or standards are adopted. Security is a first and crucial concern in the field of computer science, and a main aim of computer security is the gathering of information over a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, antimalware programs, security patches, log files, honey pots, and other measures used in banks for financial data protection. However, there is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect the organization from new malware attacks that craft their own messages and send them to the target. In this paper, the writer puts forward the idea of implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Keywords: network worms, malware infection propagating malicious code, virus, security, VPN

Procedia PDF Downloads 358
2634 Diagnostic Delays and Treatment Dilemmas: A Case of Drug-Resistant HIV and Tuberculosis

Authors: Christi Jackson, Chuka Onaga

Abstract:

Introduction: We report a case of delayed diagnosis of extra-pulmonary INH-mono-resistant Tuberculosis (TB) in a South African patient with drug-resistant HIV. Case Presentation: A 36-year old male was initiated on 1st line (NNRTI-based) anti-retroviral therapy (ART) in September 2009 and switched to 2nd line (PI-based) ART in 2011, according to local guidelines. He was following up at the outpatient wellness unit of a public hospital, where he was diagnosed with Protease Inhibitor resistant HIV in March 2016. He had an HIV viral load (HIVVL) of 737000 copies/mL, CD4-count of 10 cells/µL and presented with complaints of productive cough, weight loss, chronic diarrhoea and a septic buttock wound. Several investigations were done on sputum, stool and pus samples but all were negative for TB. The patient was treated with antibiotics and the cough and the buttock wound improved. He was subsequently started on a 3rd-line ART regimen of Darunavir, Ritonavir, Etravirine, Raltegravir, Tenofovir and Emtricitabine in May 2016. He continued losing weight, became too weak to stand unsupported and started complaining of abdominal pain. Further investigations were done in September 2016, including a urine specimen for Line Probe Assay (LPA), which showed M. tuberculosis sensitive to Rifampicin but resistant to INH. A lymph node biopsy also showed histological confirmation of TB. Management and outcome: He was started on Rifabutin, Pyrazinamide and Ethambutol in September 2016, and Etravirine was discontinued. After 6 months on ART and 2 months on TB treatment, his HIVVL had dropped to 286 copies/mL, CD4 improved to 179 cells/µL and he showed clinical improvement. Pharmacy supply of his individualised drugs was unreliable and presented some challenges to continuity of treatment. He successfully completed his treatment in June 2017 while still maintaining virological suppression. Discussion: Several laboratory-related factors delayed the diagnosis of TB, including the unavailability of urine-lipoarabinomannan (LAM) and urine-GeneXpert (GXP) tests at this facility. Once the diagnosis was made, it presented a treatment dilemma due to the expected drug-drug interactions between his 3rd-line ART regimen and his INH-resistant TB regimen, and specialist input was required. Conclusion: TB is more difficult to diagnose in patients with severe immunosuppression, therefore additional tests like urine-LAM and urine-GXP can be helpful in expediting the diagnosis in these cases. Patients with non-standard drug regimens should always be discussed with a specialist in order to avoid potentially harmful drug-drug interactions.

Keywords: drug-resistance, HIV, line probe assay, tuberculosis

Procedia PDF Downloads 169
2633 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches

Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez

Abstract:

Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. The simulation methods are computationally intensive and time consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) could be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, fitting of the resulting moments of the LSF to a probability density function (PDF) is needed. In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of approximating the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM and another classic reliability method (the first order reliability method, FORM). The results in the present study are in good agreement with those computed with the MCS. Therefore, the alternative of mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or if numerical schemes, the FEM or computational mechanics are employed.
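
As a hedged illustration of the kind of reliability calculation discussed above, the sketch below estimates the probability of failure for a generic limit state function g = R - S both by crude Monte Carlo simulation and by a simple two-point (Rosenblueth-style) point-estimate approximation of the moments of g combined with a normal assumption for the PDF; the distributions and parameters are assumptions, not the bridge models of the study.

```python
# Illustrative reliability estimate for a generic limit state g(R, S) = R - S:
# (a) crude Monte Carlo simulation, (b) two-point PEM moments + normal PDF fit.
# Distributions and parameters are assumptions, not the study's bridge models.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu_R, sigma_R = 300.0, 30.0     # resistance, assumed normal
mu_S, sigma_S = 200.0, 40.0     # load effect, assumed normal

def g(r, s):                    # limit state function: failure when g < 0
    return r - s

# (a) Monte Carlo simulation
n = 1_000_000
samples = g(rng.normal(mu_R, sigma_R, n), rng.normal(mu_S, sigma_S, n))
pf_mcs = np.mean(samples < 0)
beta_mcs = -norm.ppf(pf_mcs)

# (b) Two-point estimates: evaluate g at mu +/- sigma of each variable and
# recover the first two moments of g from the four equally weighted points.
points = [g(mu_R + a * sigma_R, mu_S + b * sigma_S)
          for a in (-1, 1) for b in (-1, 1)]
mean_g, std_g = np.mean(points), np.std(points)
beta_pem = mean_g / std_g       # reliability index, assuming g is normal

print(f"MCS: pf={pf_mcs:.4e}, beta={beta_mcs:.2f}")
print(f"PEM: beta={beta_pem:.2f}, pf={norm.cdf(-beta_pem):.4e}")
```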

Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, monte carlo simulation

Procedia PDF Downloads 346
2632 Memory Retrieval and Implicit Prosody during Reading: Anaphora Resolution by L1 and L2 Speakers of English

Authors: Duong Thuy Nguyen, Giulia Bencini

Abstract:

The present study examined structural and prosodic factors in the computation of antecedent-reflexive relationships and sentence comprehension in native English speakers (L1) and Vietnamese-English bilinguals (L2). Participants read sentences presented on a computer screen in one of three presentation formats aimed at manipulating prosodic parsing: word-by-word (RSVP), phrase-segment (self-paced), or whole-sentence (self-paced), then completed a grammaticality rating and a comprehension task (following Pratt & Fernandez, 2016). The design crossed three factors: syntactic structure (simple; complex), grammaticality (target-match; target-mismatch) and presentation format. An example item is provided in (1): (1) The actress that (Mary/John) interviewed at the awards ceremony (about two years ago/organized outside the theater) described (herself/himself) as an extreme workaholic. Results showed that overall, both L1 and L2 speakers made use of a good-enough processing strategy at the expense of more detailed syntactic analyses. L1 and L2 speakers' comprehension and grammaticality judgements were negatively affected by the most prosodically disrupting condition (word-by-word). However, the two groups demonstrated differences in their performance in the other two reading conditions. For L1 speakers, the whole-sentence and the phrase-segment formats were both facilitative in the grammaticality rating and comprehension tasks; for L2 speakers, compared with the whole-sentence condition, the phrase-segment paradigm did not significantly improve accuracy or comprehension. These findings are consistent with those of Pratt & Fernandez (2016), who found a similar pattern of results in the processing of subject-verb agreement relations using the same experimental paradigm and prosodic manipulation with L1 English and L2 English-Spanish speakers. The results provide further support for a good-enough cue model of sentence processing that integrates cue-based retrieval and implicit prosodic parsing (Pratt & Fernandez, 2016) and highlight similarities and differences between L1 and L2 sentence processing and comprehension.

Keywords: anaphora resolution, bilingualism, implicit prosody, sentence processing

Procedia PDF Downloads 152
2631 Greenland Monitoring Using Vegetation Index: A Case Study of Lal Suhanra National Park

Authors: Rabia Munsaf Khan, Eshrat Fatima

Abstract:

The analysis of the spatial extent and temporal change of vegetation cover using remotely sensed data is of critical importance to the agricultural sciences. Pakistan, being an agricultural country, depends on this resource, as it makes up 70% of the GDP. The case study is Lal Suhanra National Park, which is not only the biggest forest reserve of Pakistan but also of Asia. The study is performed using different temporal images of Landsat. The Landsat results are also cross-checked using Sentinel-2 imagery, as it has both higher spectral and spatial resolution. Vegetation can easily be detected using the NDVI, which is a common and widely used index. It is an important vegetation index, widely applied in research on global environmental and climatic change. The images are then classified to observe the change that occurred over 15 years. Vegetation cover maps of 2000 and 2016 are used to generate a map of vegetation change detection for the respective years and to find out the changing pattern of vegetation cover. The NDVI values also aided in the detection of the percentage decrease in vegetation cover. The study reveals that the vegetation cover of the area decreased significantly between 2000 and 2016.
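
A brief sketch of the NDVI calculation on which the study relies, NDVI = (NIR - Red) / (NIR + Red); the band file names, the rasterio reader and the 0.3 vegetation threshold are placeholders and assumptions, not the study's actual processing chain.

```python
# Sketch of the NDVI computation used for vegetation monitoring:
# NDVI = (NIR - Red) / (NIR + Red). Band files, the rasterio reader and the
# 0.3 vegetation threshold are illustrative assumptions, not the study's setup.
import numpy as np
import rasterio

with rasterio.open("LC08_B4_red.tif") as red_src, \
     rasterio.open("LC08_B5_nir.tif") as nir_src:
    red = red_src.read(1).astype("float64")
    nir = nir_src.read(1).astype("float64")

ndvi = np.where(nir + red == 0, 0.0, (nir - red) / (nir + red))

# Rough vegetation mask and percentage cover, for change comparison between dates.
veg_fraction = np.mean(ndvi > 0.3) * 100
print(f"vegetated area: {veg_fraction:.1f}% of pixels")
```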

Keywords: Landsat, normalized difference vegetation index (NDVI), sentinel 2, Greenland monitoring

Procedia PDF Downloads 309
2630 Establishing the Legality of Terraforming under the Outer Space Treaty

Authors: Bholenath

Abstract:

Ever since Elon Musk revealed his plan to terraform Mars on national television in 2015, the debate regarding the legality of such an activity under the current Outer Space Treaty regime has been gaining momentum. Terraforming means to alter or transform the atmosphere of another planet to have the characteristics of landscapes on Earth. Musk's plan is to alter the entire environment of Mars so as to make it habitable for humans. He has long been an advocate of colonizing Mars, and in order to make humans an interplanetary species, he wants to detonate thermonuclear devices over the poles of Mars. For a common man, it seems to be a fascinating endeavor, but for space lawyers, it poses new and fascinating legal questions. Some of the questions which arise are: whether the use of nuclear weapons on celestial bodies is permitted under the Outer Space Treaty; whether such an alteration of the celestial environment would fall within the scope of the term 'harmful contamination' under Article IX of the treaty; whether such an activity, which would put an entire planet under the control of a private company, can be permitted under the treaty; whether such terraforming of Mars would amount to its appropriation; and whether such an activity would be in the 'benefit and interests of all countries'. This paper will attempt to examine and elucidate these legal questions. Space is one such domain where the law should precede man. The paper follows the approach that the de lege lata is not capable of prohibiting the terraforming of Mars. The Outer Space Treaty provides the freedoms of space and prescribes certain restrictions on those freedoms as well. The author shall examine provisions such as Articles I, II, IV, and IX of the Outer Space Treaty in order to establish the legality of the terraforming activity. The author shall establish how such activity is a peaceful use of the celestial body, is in the benefit and interests of all countries, and qualifies neither as national appropriation of the celestial body nor as its harmful contamination. The author shall divide the paper into three chapters. The first chapter would be about the general introduction of the problem, the analysis of Elon Musk's plan to terraform Mars, and the need to study terraforming through the lens of the Outer Space Treaty. In the second chapter, the author shall attempt to establish the legality of the terraforming activity under the provisions of the Outer Space Treaty. In this vein, the author shall put forth the counter-interpretations and the arguments which may be formulated against the lawfulness of terraforming. The author shall show why the counter-interpretations establishing the unlawfulness of terraforming should not be accepted, and in doing so, the author shall provide the interpretations that should prevail, ultimately establishing the legality of the terraforming activity under the treaty. In the third chapter, the author shall draw relevant conclusions and give suggestions.

Keywords: appropriation, harmful contamination, peaceful, terraforming

Procedia PDF Downloads 153
2629 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of them equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
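
The abstract does not give the fusion algorithm itself; the toy sketch below only illustrates the correlation idea: a count-rate series is predicted for each camera-tracked robot from its distance to the detector (inverse-square falloff), and the robot whose prediction correlates best with the measured counts is flagged. The geometry, noise model and use of Pearson correlation are assumptions for illustration.

```python
# Toy sketch of the camera/detector fusion idea: the robot whose distance-based
# predicted count rate best correlates with the measured gamma counts is flagged
# as carrying the source. Geometry and Pearson correlation are illustrative
# assumptions, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(3)
detector_xy = np.array([0.0, 0.0])
T = 200                                           # number of time steps

# Hypothetical camera tracks (x, y per time step) for three robots.
tracks = {name: np.cumsum(rng.normal(0, 0.1, (T, 2)), axis=0) + start
          for name, start in [("robot_A", (2, 1)), ("robot_B", (4, -2)), ("robot_C", (-3, 3))]}

# Simulated detector record: robot_B carries the source (inverse-square law + background).
dist_B = np.linalg.norm(tracks["robot_B"] - detector_xy, axis=1)
counts = rng.poisson(5000.0 / dist_B**2 + 20.0)

def predicted_rate(track):
    d = np.linalg.norm(track - detector_xy, axis=1)
    return 1.0 / d**2

scores = {name: np.corrcoef(predicted_rate(tr), counts)[0, 1] for name, tr in tracks.items()}
print("correlation per robot:", {k: round(v, 3) for k, v in scores.items()})
print("flagged as contaminated:", max(scores, key=scores.get))
```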

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 123
2628 An Epistemological Approach of the Social Movements Studies in Cali (Colombia) between 2002 and 2016

Authors: Faride Crespo Razeg, Beatriz Eugenia Rivera Pedroza

Abstract:

As Colombian society has changed, the way its civil society participates has changed too. Thus, social movements, as a form of participation, should be researched in order to understand both the structure of society and the interactions among groups. In fact, in the last decades, the social movements in Colombia have been transformed along three dimensions: actors, spaces, and demands. For this reason, it is important to know from what perspectives this topic has been researched, allowing epistemological and ontological reflection on it. The goal of this research has been to characterize the social movements of Cali, Colombia, between 2002 and 2016. Cali is the largest Colombian city in the southwest; for this reason, it can be considered representative of the social dynamics of the region. Qualitative methods such as documentary analysis have been used in order to understand how research on social movements has been conducted. Using this methodological technique, the goals present in most of the studies were identified, representing the main concerns around this topic. In addition, the methodologies most used were identified in order to understand how the data were collected, along with their problems and advantages. Finally, the ontological and epistemological reflections are important to understand the theoretical and conceptual approaches of the studies and how they have been contextualized to Cali, taking into account its own history.

Keywords: social movements, civil society, forms of participation, collective actions

Procedia PDF Downloads 288
2627 Hunting Ban, Unfortunate Decisions for the Bear Population in Romania

Authors: Alexandru Gridan, Georgeta Ionescu, Ovidiu Ionescu, Ramon Jurj, George Sirbu, Mihai Fedorca

Abstract:

The Brown Bear population size in Romania is approximately 7300-7600 individuals, which is estimated to be 3000 individuals over the ecological carrying capacity. The Habitats Directive imposed certain protection rules on European Union (EU) Member States with Brown Bear populations. These, however, allow countries like Sweden, Croatia, Slovakia and Estonia to use hunting as a management tool, harvesting up to 10% of the surplus bear population annually. From the time Romania joined the EU until 2016, active conservation management contributed to maintaining the largest and most genetically diverse Brown Bear population in Europe. Importantly, there was good coexistence between people and bears and a low level of human-bear conflict. After social pressure and campaigning by some non-governmental organisations citing issues over monitoring, the environment minister decided in September 2016 to stop the use of hunting as a management tool for bears. Against this background, this paper provides a set of recommendations to resolve the current conflict in Romania. These include the need for collaborative decision-making to reduce conflicts between stakeholders and mechanisms to reduce current human-bear conflicts, which have increased by 50 percent in the past year.

Keywords: bear, bear population, bear management, wildlife conflict

Procedia PDF Downloads 182
2626 Learning Mandarin Chinese as a Foreign Language in a Bilingual Context: Adult Learners’ Perceptions of the Use of L1 Maltese and L2 English in Mandarin Chinese Lessons in Malta

Authors: Christiana Gauci-Sciberras

Abstract:

The first language (L1) could be used in foreign language teaching and learning as a pedagogical tool to scaffold new knowledge in the target language (TL) upon linguistic knowledge that the learner already has. In a bilingual context, code-switching between the two languages usually occurs in classrooms. One of the reasons for code-switching is because both languages are used for scaffolding new knowledge. This research paper aims to find out why both the L1 (Maltese) and the L2 (English) are used in the classroom of Mandarin Chinese as a foreign language (CFL) in the bilingual context of Malta. This research paper also aims to find out the learners’ perceptions of the use of a bilingual medium of instruction. Two research methods were used to collect qualitative data; semi-structured interviews with adult learners of Mandarin Chinese and lesson observations. These two research methods were used so that the data collected in the interviews would be triangulated with data collected in lesson observations. The L1 (Maltese) is the language of instruction mostly used. The teacher and the learners switch to the L2 (English) or to any other foreign language according to the need at a particular instance during the lesson.

Keywords: Chinese, bilingual, pedagogical purpose of L1 and L2, CFL acquisition

Procedia PDF Downloads 200
2625 Implementation of European Court of Human Right Judgments and State Sovereignty

Authors: Valentina Tereshkova

Abstract:

The paper shows how the relationship between international law and national sovereignty is viewed through the implementation of European Court of Human Rights (ECtHR) judgments. Methodology: Conclusions are based on a survey of representatives of the legislative authorities and judges of the Krasnoyarsk, Rostov, Sverdlovsk and Tver regions. The paper assesses the activities of the Russian Constitutional Court from 1998 to 2015 related to the establishment of the implementation mechanism, as well as the Russian Constitutional Court judgments of 14 July 2015, No. 21-P, and of 19 April 2016, No. 12-P, in which the Constitutional Court stated the impossibility of executing ECtHR judgments. I. Implementation of ECtHR judgments by courts and other authorities. Despite the publication of the report of the RF Ministry of Justice on implementation, we could not find any formal information on the Russian policy of ECtHR judgment implementation. Using the results of the survey, the paper shows the effect of ECtHR judgments on law and legal practice in Russia. II. Implementation of ECtHR judgments by the Russian Constitutional Court. The Russian Constitutional Court had implemented ECtHR judgments. However, on 14 July 2015 the Court determined its competence to consider the question of the implementation of ECtHR judgments. Then, on 19 April 2016, it stated that the execution of the judgment [Anchugov and Gladkov case] was impossible because the Russian Constitution has the highest legal force. Recently the CE Committee of Ministers asked Russia to provide 'without further delay' a compensation plan for the Yukos case. On 11 November 2016, the Constitutional Court accepted a request from the Ministry of Justice to consider the possibility of execution of the ECtHR judgment in the Yukos case. Such a request has been made possible due to the lack of an implementation mechanism. Conclusion: ECtHR judgments are an effective tool to solve the structural problems of a legal system. However, Russian experts consider the ECHR as a tool for the protection of individual rights. The paper shows the link between the survey results and the absence of an implementation mechanism. The new Article 104 par. 2 and Article 106 par. 2 of the Federal Law on the Constitutional Court are in conflict with the international obligations under the Convention on the Law of Treaties 1969 and Article 46 ECHR. Nevertheless, a dialogue may be possible between the Constitutional Court and the ECtHR. In its judgment of 19 April 2016, the Constitutional Court determined that general measures to ensure fairness, proportionality and differentiation of the restrictions on voting rights were possible in judicial practice. It also stated that the federal legislator had the power 'to optimize the system of Russian criminal penalties'. Despite the fact that the Constitutional Court presented the Görgülü case [Görgülü v Germany] as an example of non-execution of an ECtHR judgment, the paper proposes to draw on the experience of the German Constitutional Court, which in the Görgülü case, on the one hand, stressed national sovereignty and, on the other hand, took advantage of this sovereignty to resolve the issue in accordance with the ECHR.

Keywords: implementation of ECtHR judgments, sovereignty, supranational jurisdictions, principle of subsidiarity

Procedia PDF Downloads 193
2624 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) models. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all the external forces. The previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes the deficiencies of the previous LBM-CA models and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate the particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4 and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
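
The authors' D3Q27 GPU implementation is not reproduced here; the toy 2D (D2Q9) sketch below only illustrates the cellular-automata move step such models rely on: integer particle counts at a node are redistributed to neighbouring nodes with velocity-dependent probabilities via a multinomial draw. The lattice weights and the probability formula are stock LBM assumptions, not the paper's exact model.

```python
# Toy D2Q9 sketch of a probabilistic (cellular-automata) particle move step:
# particles at one node are sent to neighbour nodes with probabilities biased by
# the local fluid velocity. Illustration of the idea only, not the paper's
# D3Q27 GPU model; weights and the probability formula are assumptions.
import numpy as np

# D2Q9 lattice directions and weights (standard LBM values).
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def move_particles(n_particles, u, rng):
    """Split an integer particle count among the 9 directions, biased by the
    local velocity u (first-order equilibrium-like weighting)."""
    p = w * (1.0 + 3.0 * e @ u)              # higher probability along the flow
    p = np.clip(p, 0.0, None)
    p /= p.sum()
    return rng.multinomial(n_particles, p)   # particles sent to each neighbour

rng = np.random.default_rng(7)
sent = move_particles(1000, u=np.array([0.1, 0.0]), rng=rng)
for direction, count in zip(e, sent):
    print(f"to neighbour {tuple(direction)}: {count} particles")
```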

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 207
2623 Seismic Behavior of Existing Reinforced Concrete Buildings in California under Mainshock-Aftershock Scenarios

Authors: Ahmed Mantawy, James C. Anderson

Abstract:

Numerous cases of earthquakes (main-shocks) that were followed by aftershocks have been recorded in California. In 1992 a pair of strong earthquakes occurred within three hours of each other in Southern California. The first shock occurred near the community of Landers and was assigned a magnitude of 7.3 then the second shock occurred near the city of Big Bear about 20 miles west of the initial shock and was assigned a magnitude of 6.2. In the same year, a series of three earthquakes occurred over two days in the Cape-Mendocino area of Northern California. The main-shock was assigned a magnitude of 7.0 while the second and the third shocks were both assigned a value of 6.6. This paper investigates the effect of a main-shock accompanied with aftershocks of significant intensity on reinforced concrete (RC) frame buildings to indicate nonlinear behavior using PERFORM-3D software. A 6-story building in San Bruno and a 20-story building in North Hollywood were selected for the study as both of them have RC moment resisting frame systems. The buildings are also instrumented at multiple floor levels as a part of the California Strong Motion Instrumentation Program (CSMIP). Both buildings have recorded responses during past events such as Loma-Prieta and Northridge earthquakes which were used in verifying the response parameters of the numerical models in PERFORM-3D. The verification of the numerical models shows good agreement between the calculated and the recorded response values. Then, different scenarios of a main-shock followed by a series of aftershocks from real cases in California were applied to the building models in order to investigate the structural behavior of the moment-resisting frame system. The behavior was evaluated in terms of the lateral floor displacements, the ductility demands, and the inelastic behavior at critical locations. The analysis results showed that permanent displacements may have happened due to the plastic deformation during the main-shock that can lead to higher displacements during after-shocks. Also, the inelastic response at plastic hinges during the main-shock can change the hysteretic behavior during the aftershocks. Higher ductility demands can also occur when buildings are subjected to trains of ground motions compared to the case of individual ground motions. A general conclusion is that the occurrence of aftershocks following an earthquake can lead to increased damage within the elements of an RC frame buildings. Current code provisions for seismic design do not consider the probability of significant aftershocks when designing a new building in zones of high seismic activity.

Keywords: reinforced concrete, existing buildings, aftershocks, damage accumulation

Procedia PDF Downloads 280
2622 Estimation of Adult Patient Doses for Chest X-Ray Diagnostic Examinations in a Tertiary Institution Health Centre

Authors: G. E. Okungbowa, H. O. Adams, S. E. Eze

Abstract:

This study concerns the estimation of adult patient doses for chest X-ray diagnostic examinations of newly admitted undergraduate students attending a tertiary institution health centre as part of their routine clearance and check-up upon admission into the institution. A total of 531 newly admitted undergraduate students were recruited for this survey in the first quarter of 2016 (January to March 2016). The CALDOSE_X 5.0 software was used to compute the Entrance Surface Dose (ESD) and Effective Dose (ED), while the Statistical Package for the Social Sciences (SPSS) version 21.0 was used to carry out the statistical analyses. The basic patient data and exposure parameters required by the software are age, sex, examination type, projection posture, tube potential and current-time product. The mean Entrance Surface Dose and Effective Dose of the undergraduate students were calculated using the software, and the values were compared with the existing literature and internationally established diagnostic reference levels. The mean ESD calculated is 0.29 mGy, and the mean effective dose is 0.04 mSv. The values of ESD and ED obtained are below the internationally established diagnostic reference levels, which could be attributed to the good radiographic techniques employed during the chest X-ray procedure for these students.

Keywords: x-ray, dose, examination, chest

Procedia PDF Downloads 183
2621 Inviscid Steady Flow Simulation Around a Wing Configuration Using MB_CNS

Authors: Muhammad Umar Kiani, Muhammad Shahbaz, Hassan Akbar

Abstract:

Simulation of a high speed inviscid steady ideal air flow around a 2D/axial-symmetry body was carried out by the use of mb_cns code. mb_cns is a program for the time-integration of the Navier-Stokes equations for two-dimensional compressible flows on a multiple-block structured mesh. The flow geometry may be either planar or axisymmetric and multiply-connected domains can be modeled by patching together several blocks. The main simulation code is accompanied by a set of pre and post-processing programs. The pre-processing programs scriptit and mb_prep start with a short script describing the geometry, initial flow state and boundary conditions and produce a discretized version of the initial flow state. The main flow simulation program (or solver as it is sometimes called) is mb_cns. It takes the files prepared by scriptit and mb_prep, integrates the discrete form of the gas flow equations in time and writes the evolved flow data to a set of output files. This output data may consist of the flow state (over the whole domain) at a number of instants in time. After integration in time, the post-processing programs mb_post and mb_cont can be used to reformat the flow state data and produce GIF or postscript plots of flow quantities such as pressure, temperature and Mach number. The current problem is an example of supersonic inviscid flow. The flow domain for the current problem (strake configuration wing) is discretized by a structured grid and a finite-volume approach is used to discretize the conservation equations. The flow field is recorded as cell-average values at cell centers and explicit time stepping is used to update conserved quantities. MUSCL-type interpolation and one of three flux calculation methods (Riemann solver, AUSMDV flux splitting and the Equilibrium Flux Method, EFM) are used to calculate inviscid fluxes across cell faces.

Keywords: steady flow simulation, processing programs, simulation code, inviscid flux

Procedia PDF Downloads 429
2620 Young People, the Internet and Inequality: What are the Causes and Consequences of Exclusion?

Authors: Albin Wallace

Abstract:

Part of the provision within educational institutions is the design, commissioning and implementation of ICT facilities to improve teaching and learning. Inevitably, these facilities focus largely on Internet Protocol (IP) based provisions including access to the World Wide Web, email, interactive software and hardware tools. Educators should be committed to the use of ICT to improve learning and teaching as well as to issues relating to the Internet and educational disadvantage, especially with respect to access and exclusion concerns. In this paper I examine some recent research into the issue of inequality and use of the Internet during which I discuss the causes and consequences of exclusion in the context of social inequality, digital literacy and digital inequality, also touching on issues of global inequality.

Keywords: inequality, internet, education, design

Procedia PDF Downloads 488
2619 Ground Improvement Using Deep Vibro Techniques at Madhepura E-Loco Project

Authors: A. Sekhar, N. Ramakrishna Raju

Abstract:

This paper presents the results of ground improvement using deep vibro techniques with a combination of sand and stone columns, performed at a highly liquefaction-susceptible site (70 to 80% sand strata and the balance silt) with low bearing capacities due to high settlements, located at Madhepura, Bihar state, in the northern part of India (earthquake zone V as per the IS code). Initially, bored cast in-situ/precast piles and stone/sand columns were envisaged. However, after detailed analysis aimed at addressing both liquefaction and the improvement of bearing capacities simultaneously, deep vibro techniques with a combination of sand and stone columns were found to be an excellent solution for the given site conditions, possibly for the first time in India. After a detailed soil investigation, a pre-treatment eCPT test was conducted to evaluate the potential depth of liquefaction and to plan the densification of the silty sandy soils so as to improve the factor of safety against liquefaction. Trial tests were then carried out at the site using the deep vibro compaction technique with combinations of sand and stone columns at different column spacings in a triangular pattern and with different timings during each lift of the vibro probe up to ground level. Different spacings and timings were used in order to obtain the most effective spacing and timing for the vibro compaction technique and to achieve the maximum, uniform densification of the saturated loose silty sandy soils over the complete treated area. Post-treatment eCPT tests and plate load tests were then conducted at all trial locations with the different spacings and timings of the sand and stone columns, in order to evaluate which arrangement best achieves the required factor of safety against liquefaction and the desired bearing capacities with reduced settlements for the construction of industrial structures. On reviewing these results, it was noticed that the ground layers were densified more than expected, with an improved factor of safety against liquefaction, and that good bearing capacities were achieved for the given settlements as per IS code provisions. The cost-effectiveness of using the deep vibro technique with sand columns alone (avoiding stone) was also worked out for lightly loaded single-storied structures, and the results were found satisfactory for resting lightly loaded foundations. The most important aspect of this technique is that it mitigates liquefaction and improves bearing capacities, with settlements reduced to acceptable limits as per IS: 1904-1986, simultaneously up to a depth of 19 m. To the best of our knowledge, this was executed for the first time in India.

Keywords: ground improvement, deep vibro techniques, liquefaction, bearing capacity, settlement

Procedia PDF Downloads 197
2618 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management. Furthermore, historical disaster information can be obtained using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle from its occurrence through progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc., is mainly managed in the form of structured and unstructured reports, which exist as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data for disaster information. In this paper, the Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images or reports, which are printed or produced by scanners, etc., into electronic documents. Following that, the converted disaster data are organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information based on Optical Character Recognition for unstructured data is an important element of smart disaster management. In this paper, a character recognition rate of over 90% was achieved for Korean characters by using an upgraded OCR. In character recognition, the recognition rate depends on the font, size, and special symbols of the characters; we improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system. The disaster code system ensures that the structured information is stored and retrieved across the entire disaster cycle, covering historical disaster progress, damages, response, and recovery. The expected effect of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.
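
As a sketch of the OCR-to-structured-data step described above (not the upgraded OCR engine developed in the paper), the snippet below uses the open-source Tesseract engine via pytesseract to extract Korean text from a scanned report and attach a disaster code; the code table, regular expression and file name are placeholders.

```python
# Sketch of converting a scanned disaster report into structured data with OCR.
# Uses the open-source Tesseract engine via pytesseract (with Korean language
# data installed); this is not the upgraded OCR engine developed in the paper,
# and the disaster-code lookup table is a placeholder.
import re
from PIL import Image
import pytesseract

DISASTER_CODES = {"호우": "D-01", "태풍": "D-02", "지진": "D-03"}  # hypothetical code table

def ocr_report(image_path):
    """Extract text from a scanned report image (Korean + English)."""
    return pytesseract.image_to_string(Image.open(image_path), lang="kor+eng")

def to_record(text):
    """Minimal example of structuring the OCR output: pick a disaster code and
    any dates found, ready to be stored in the disaster database."""
    code = next((c for k, c in DISASTER_CODES.items() if k in text), "D-99")
    dates = re.findall(r"\d{4}[.-]\d{1,2}[.-]\d{1,2}", text)
    return {"disaster_code": code, "dates": dates, "raw_text": text}

record = to_record(ocr_report("scanned_report.png"))   # placeholder file name
print(record["disaster_code"], record["dates"])
```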

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 129