Search results for: big data interpretation
25484 Campaign Contributions as Freedom of Expression: A Comparative Study Between the United States and Germany
Authors: Kristof Lukas Heidemann
Abstract:
In times of democratic backsliding in Western nations, restoring public trust in the electoral process ranks among the most urgent tasks on the public agenda. Addressing the role of money in politics is one major part of this effort; however, such an endeavor might affect the constitutional freedom of expression. Attempts to regulate political spending in the U.S. have in recent decades increasingly been overruled by the U.S. Supreme Court through an expansion of the protective umbrella of the First Amendment over campaign contributions by private organizations, especially in the decisions Buckley v. Valeo and Citizens United v. FEC. In Germany, on the other hand, this line of argumentation has so far not been submitted to the national Supreme Court. Given that voices calling for stricter and more transparent political financing laws in Germany are growing, it seems only a matter of time until the issue will have to be addressed by the country’s judiciary as well. Therefore, this paper conducts a comparative analysis of the constitutional right to free expression in these two leading democracies in order to assess whether the problem of a lack of regulatory options to achieve stricter campaign spending laws due to constitutional restrictions will also arise in Germany. In order to present a comprehensive picture of the subject, the analysis does not only touch upon doctrinal aspects of both systems but also scrutinizes the practical implications from a socio-legal perspective. Although the list of forms of expression in the wording of Art. 5 of the German constitution is generally considered to be non-exhaustive, the investigation concludes that the subsumption of election campaign donations under it is not justifiable using recognized methods of interpretation, in particular concerning a systematic interpretation in light of the principle of equality in Art. 3 of the German constitution.
Keywords: comparative constitutional law, constitutional justice, constitutional law, election law, freedom of speech, fundamental rights, law reform
Procedia PDF Downloads 7
25483 Frontier Dynamic Tracking in the Field of Urban Plant and Habitat Research: Data Visualization and Analysis Based on Journal Literature
Authors: Shao Qi
Abstract:
The article uses the CiteSpace knowledge graph analysis tool to sort and visualize the journal literature on urban plants and habitats in the Web of Science and China National Knowledge Infrastructure databases. Based on a comprehensive interpretation of the visualization results of various data sources and the description of the intrinsic relationship between high-frequency keywords using knowledge mapping, the research hotspots, processes and evolution trends in this field are analyzed. Relevant case studies are also conducted for the hotspot contents to explore the means of landscape intervention and synthesize the understanding of research theories. The results show that (1) from 1999 to 2022, the research direction of urban plants and habitats gradually changed from focusing on plant and animal extinction and biological invasion to the field of human urban habitat creation, ecological restoration, and ecosystem services. (2) The results of keyword emergence and keyword growth trend analysis show that habitat creation research has shown a rapid and stable growth trend since 2017, and ecological restoration has gained long-term sustained attention since 2004. The hotspots of future research on urban plants and habitats in China may focus on habitat creation and ecological restoration.
Keywords: research trends, visual analysis, habitat creation, ecological restoration
Procedia PDF Downloads 61
25482 Deconstructing and Reconstructing the Definition of Inhuman Treatment in International Law
Authors: Sonia Boulos
Abstract:
The prohibition on ‘inhuman treatment’ constitutes one of the central tenets of modern international human rights law. It is incorporated in principal international human rights instruments including Article 5 of the Universal Declaration of Human Rights, and Article 7 of the International Covenant on Civil and Political Rights. However, in the absence of any legislative definition of the term ‘inhuman’, its interpretation becomes challenging. The aim of this article is to critically analyze the interpretation of the term ‘inhuman’ in international human rights law and to suggest a new approach to construct its meaning. The article is composed of two central parts. The first part is a critical appraisal of the interpretation of the term ‘inhuman’ by supra-national human rights law institutions. It highlights the failure of supra-national institutions to provide an independent definition for the term ‘inhuman’. In fact, those institutions consistently fail to distinguish the term ‘inhuman’ from its other kin terms, i.e. ‘cruel’ and ‘degrading.’ Very often, they refer to these three prohibitions as ‘CIDT’, as if they were one collective. They were primarily preoccupied with distinguishing ‘CIDT’ from ‘torture.’ By blurring the conceptual differences between these three terms, supra-national institutions supplemented them with a long list of specific and purely descriptive subsidiary rules. In most cases, those subsidiary rules were announced in the absence of sufficient legal reasoning explaining how they were derived from abstract and evaluative standards embodied in the prohibitions collectively referred to as ‘CIDT.’ By opting for this option, supra-national institutions have created the risk for the development of an incoherent body of jurisprudence on those terms at the international level. They also have failed to provide guidance for domestic courts on how to enforce these prohibitions. 
While blurring the differences between the terms ‘cruel,’ ‘inhuman,’ and ‘degrading’ has consequences for all three, the term ‘inhuman’ remains the most impoverished one. It is easy to link the term ‘cruel’ to the clause on ‘cruel and unusual punishment’ originating from the English Bill of Rights of 1689. It is also easy to see that the term ‘degrading’ reflects a dignitarian ideal. However, when we turn to the term ‘inhuman’, we are left without any interpretative clue. The second part of the article suggests that the ordinary meaning of the word ‘inhuman’ should be our first clue. However, regaining the conceptual independence of the term ‘inhuman’ requires more than a mere reflection on the word-meaning of the term. Thus, the second part introduces philosophical concepts related to the understanding of what it means to be human. It focuses on ‘the capabilities approach’ and the notion of ‘human functioning’, introduced by Amartya Sen and further explored by Martha Nussbaum. Nussbaum’s work on the basic human capabilities is particularly helpful or even vital for understanding the moral and legal substance of the prohibition on ‘inhuman’ treatment.
Keywords: inhuman treatment, capabilities approach, human functioning, supra-national institutions
Procedia PDF Downloads 278
25481 Imaging 255nm Tungsten Thin Film Adhesion with Picosecond Ultrasonics
Authors: A. Abbas, X. Tridon, J. Michelon
Abstract:
In the electronic and photovoltaic industries, components are made from wafers, which are stacks of thin film layers from a few nanometers to several micrometers in thickness. Early evaluation of the bonding quality between the different layers of a wafer is one of the challenges of these industries to avoid dysfunction of their final products. Traditional pump-probe experiments, which have been developed since the 1970s, give a partial solution to this problem, but with a non-negligible drawback. On one hand, these setups can generate and detect ultra-high ultrasound frequencies, which can be used to evaluate the adhesion quality of wafer layers. On the other hand, because of the quite long acquisition time they need to perform one measurement, these setups remain limited to point measurements when evaluating global sample quality. This can lead to misinterpretation of the sample quality parameters, especially in the case of inhomogeneous samples. Asynchronous Optical Sampling (ASOPS) systems can perform sample characterization with picosecond acoustics up to 10⁶ times faster than traditional pump-probe setups. This allows picosecond ultrasonics to unlock acoustic imaging at the nanometric scale to detect inhomogeneities in sample mechanical properties. This fact is illustrated by presenting an image of the measured acoustic reflection coefficients obtained by mapping, with an ASOPS setup, a 255 nm thin-film tungsten layer deposited on a silicon substrate. The interpretation of the reflection coefficient in terms of bonding quality is also exposed, and the origin of zones exhibiting good and bad bonding quality is discussed.
Keywords: adhesion, picosecond ultrasonics, pump-probe, thin film
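The quantity mapped in such an experiment is the acoustic reflection coefficient at the film/substrate interface. A minimal sketch of how it is computed from acoustic impedances follows; the tungsten and silicon material values are approximate textbook figures, not measurements from this paper.

```python
# Acoustic reflection coefficient at a film/substrate interface:
# r = (Z_sub - Z_film) / (Z_sub + Z_film), where Z = rho * v.

def acoustic_impedance(density_kg_m3, velocity_m_s):
    """Longitudinal acoustic impedance Z = rho * v (Pa*s/m)."""
    return density_kg_m3 * velocity_m_s

def reflection_coefficient(z_film, z_substrate):
    """Amplitude reflection coefficient for a pulse hitting the interface from the film side."""
    return (z_substrate - z_film) / (z_substrate + z_film)

# Approximate literature values for tungsten and silicon (illustrative only).
z_w = acoustic_impedance(19300, 5180)    # tungsten: density, longitudinal velocity
z_si = acoustic_impedance(2330, 8430)    # silicon substrate
r = reflection_coefficient(z_w, z_si)
print(f"r = {r:.3f}")  # strongly negative: most of the pulse is reflected back into the film
```

Local deviations of the measured coefficient from the value predicted by the bulk impedances are what flag zones of poor bonding.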
Procedia PDF Downloads 159
25480 Performing a Chamber Theatre Adaptation of Nick Joaquin's 'The Summer Solstice'
Authors: Allen B. Baylosis
Abstract:
Chamber Theatre has been one of the least articulated staging devices in the field of theatre and performance studies. This creative exploratory-descriptive study responds to this gap by employing the staging technique in a Chamber Theatre production based on Nick Joaquin’s The Summer Solstice. Specifically, this study opts to understand three processes involved in the Chamber Theatre creative thesis production of The Summer Solstice as performance: the performance of the theatre-maker, the performance of the spect-actors, and the performance of the spectators. For this purpose, the theatre-maker describes the creative process of transforming The Summer Solstice text into a Chamber Theatre production, from text to staging. The theatre-maker also analyzes the performers’ experiences and the spectators’ responses as they participate in a Chamber Theatre performance. In doing so, the theatre-maker collects qualitative data from seventeen (17) performers and qualitative feedback from twenty (20) spectators. For the mode of data analysis, this study employed Ranciere’s concept of the Emancipated Spectator (2008) and Schechner’s Performance Theory (1988). The study’s findings examine how the theatre-maker, the performers, and the spectators become distant viewers of their respective restored behavior performances. Through these viewed performances, this study implies that it is possible to ascertain a reasonable definition of purpose for Chamber Theatre. Hence, despite the existence of other modern staging devices in the field of theatre and performance studies, this study concludes that Chamber Theatre remains a relevant staging technique.
Keywords: adaptation of text, chamber theatre, experimental theater, oral interpretation
Procedia PDF Downloads 157
25479 Functions and Pragmatic Aspects of English Nonsense
Authors: Natalia V. Ursul
Abstract:
In linguistic studies, the question of nonsense is attracting increasing interest. Nonsense is usually defined as spoken or written words that have no meaning. However, this definition is likely outdated, as any speech act is generated for the speaker’s pragmatic reasons and thus cannot be purely illogical or meaningless. In the current paper, a new working definition of nonsense as a linguistic medium is formulated; moreover, the pragmatic peculiarities of newly coined linguistic patterns and possible ways of their interpretation are discussed.
Keywords: nonsense, nonsense verse, pragmatics, speech act
Procedia PDF Downloads 519
25478 Spatial Distribution of Land Use in the North Canal of Beijing Subsidiary Center and Its Impact on the Water Quality
Authors: Alisa Salimova, Jiane Zuo, Christopher Homer
Abstract:
The objective of this study is to analyse land use in the North Canal riparian zone with the help of remote sensing analysis in ArcGIS, using 30 cloudless Landsat 8 open-source satellite images from May to August of 2013 and 2017. Land cover, urban construction, heat island effect, vegetation cover, and water system change were chosen as the main parameters and further analysed to evaluate their impact on North Canal water quality. The methodology involved the following steps. Firstly, 30 cloudless satellite images were collected from the Landsat TM open-source image database. The visual interpretation method was used to determine the different land types in the catchment area; after primary and secondary classification, 28 land cover types in total were classified. The visual interpretation method was also used, with the help of ArcGIS, for grassland monitoring, and US Landsat TM remote sensing images with a resolution of 30 meters were processed to analyse the vegetation cover. The water system was analysed using the visual interpretation method on the GIS software platform to decode the target area, water use, and coverage. Monthly measurements of water temperature, pH, BOD, COD, ammonia nitrogen, total nitrogen, and total phosphorus in 2013 and 2017 were taken from three locations on the North Canal in Tongzhou district. These parameters were used for water quality index calculation and compared to land-use changes. The results of this research were promising. The vegetation coverage of the North Canal riparian zone was higher in 2017 than in 2013. The surface brightness temperature value was positively correlated with vegetation coverage density and the distance from the surface of water bodies, indicating that vegetation coverage and the water system have a great effect on temperature regulation and the urban heat island effect. Surface temperature was higher in 2017 than in 2013, indicating a global warming effect.
The water volume in the river area has been partially reduced, indicating a potential water scarcity risk in the North Canal watershed. Between 2013 and 2017, urban residential, industrial, and mining storage land areas increased significantly compared to other land use types; however, water quality improved significantly in 2017 compared to 2013. This observation indicates that the Tongzhou Water Restoration Plan showed positive results and that water management in Tongzhou district had improved.
Keywords: North Canal, land use, riparian vegetation, river ecology, remote sensing
Procedia PDF Downloads 113
25477 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis
Authors: Wenbo Du, Xiaomei Ma
Abstract:
With the empowerment of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very few studies concerned with learners’ group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA always results in a “flat pattern”, that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome does not bring much more benefit than the original total score. To address these issues, the present study attempts to apply cluster analysis for group classification and quantile regression analysis to pinpoint learners’ performance at different proficiency levels (beginner, intermediate, and advanced), so as to enhance the interpretation of the CDA results extracted from a group of EFL learners’ performance on a diagnostic reading test designed by the PELDiaG research team at a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA, and that quantile regression analysis pictures more insightful characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors seeking to refine the EFL reading curriculum and to tailor instructional plans based on the group classification results and the quantile regression analysis.
Meanwhile, these innovative statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression
Procedia PDF Downloads 146
25476 Delineation of Subsurface Tectonic Structures Using Gravity, Magnetic and Geological Data, in the Sarir-Hameimat Arm of the Sirt Basin, NE Libya
Authors: Mohamed Abdalla Saleem, Hana Ellafi
Abstract:
The study area is located in the eastern part of the Sirt Basin, in the Sarir-Hameimat arm of the basin, south of Amal High. The area covers the northern part of the Hamemat Trough and the Rakb High. All of these tectonic elements are part of the major and common tectonics that were created when the old Sirt Arch collapsed, and most of them are trending NW-SE. This study has been conducted to investigate the subsurface structures and the sedimentology characterization of the area and attempt to define its development tectonically and stratigraphically. About 7600 land gravity measurements, 22500 gridded magnetic data, and petrographic core data from some wells were used to investigate the subsurface structural features both vertically and laterally. A third-order separation of the regional trends from the original Bouguer gravity data has been chosen. The residual gravity map reveals a significant number of high anomalies distributed in the area, separated by a group of thick sediment centers. The reduction to the pole magnetic map also shows nearly the same major trends and anomalies in the area. Applying the further interpretation filters reveals that these high anomalies are sourced from different depth levels; some are deep-rooted, and others are intruded igneous bodies within the sediment layers. The petrographic sedimentology study for some wells in the area confirmed the presence of these igneous bodies and defined their composition as most likely to be gabbro hosted by marine shale layers. Depth investigation of these anomalies by the average depth spectrum shows that the average basement depth is about 7.7 km, while the top of the intrusions is about 2.65 km, and some near-surface magnetic sources are about 1.86 km. The depth values of the magnetic anomalies and their location were estimated specifically using the 3D Euler deconvolution technique. The obtained results suggest that the maximum depth of the sources is about 4938m. 
The total horizontal gradient of the magnetic data shows that the trends mostly extend NW-SE, others NE-SW, and a third group N-S. This variety in trend direction shows that the area experienced different tectonic regimes throughout its geological history.
Keywords: Sirt Basin, tectonics, gravity, magnetic
Procedia PDF Downloads 66
25475 Synthesis and Characterization of Thiourea-Formaldehyde Coated Fe3O4 (TUF@Fe3O4) and Its Application for Adsorption of Methylene Blue
Authors: Saad M. Alshehri, Tansir Ahamad
Abstract:
Thiourea-formaldehyde pre-polymer (TUF) was prepared by the reaction of thiourea and formaldehyde in basic medium and used as a coating material for magnetite Fe3O4. The synthesized polymer-coated microspheres (TUF@Fe3O4) were characterized using FTIR, TGA, SEM, and TEM. The BET surface area was up to 1680 m² g⁻¹. The adsorption capacity of the product was evaluated through its adsorption of methylene blue (MB) in water at different pH values and temperatures. We found that the adsorption process was well described by both the Langmuir and Freundlich isotherm models. The kinetics of MB adsorption onto TUF@Fe3O4 are described in order to provide a clearer interpretation of the adsorption rate and uptake mechanism. The overall kinetic data were acceptably explained by a pseudo-second-order rate model. The evaluated ∆G° and ∆H° indicate the spontaneous and exothermic nature of the reaction. The adsorption takes place with a decrease in entropy (∆S° is negative). The monolayer capacity for MB was up to 450 mg g⁻¹, one of the highest among similar polymeric products, owing to the large BET surface area.
Keywords: TGA, FTIR, magnetite, thiourea formaldehyde resin, methylene blue, adsorption
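The isotherm and kinetic models named in the abstract are routinely fitted by nonlinear least squares. The sketch below fits the Langmuir isotherm, qe = qmax·KL·Ce / (1 + KL·Ce), to synthetic equilibrium data generated around the reported 450 mg/g monolayer capacity; the data points and the KL value are illustrative assumptions, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * kl * ce / (1 + kl * ce)

def pseudo_second_order(t, qe, k2):
    """Pseudo-second-order kinetics: qt = k2 * qe^2 * t / (1 + k2 * qe * t)."""
    return k2 * qe**2 * t / (1 + k2 * qe * t)

# Synthetic equilibrium data around qmax = 450 mg/g, KL = 0.05 L/mg (assumed).
ce = np.array([5, 10, 25, 50, 100, 200], dtype=float)       # equilibrium conc., mg/L
qe_obs = langmuir(ce, 450, 0.05) + np.array([3, -4, 5, -2, 4, -3])  # + small "noise"

(qmax_fit, kl_fit), _ = curve_fit(langmuir, ce, qe_obs, p0=(400, 0.01))
print(f"qmax = {qmax_fit:.0f} mg/g, KL = {kl_fit:.3f} L/mg")
```

The same `curve_fit` call applied to `pseudo_second_order` with time-series uptake data would recover qe and the rate constant k2.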
Procedia PDF Downloads 350
25474 Implementation of an IoT Sensor Data Collection and Analysis Library
Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee
Abstract:
Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to the users themselves. However, when the data are accumulated and analyzed, more varied information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and sensor data can be collected directly by using database application tools such as MySQL. These directly collected data can be used for various research purposes and can be useful as data for data mining. However, there are many difficulties in using such a board to collect data, especially when the user is not a computer programmer or is using it for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.
Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data
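A minimal sketch of the analysis half of such a library follows, clustering readings from a simulated temperature/humidity sensor with two of the algorithms listed in the keywords. The data, parameter values, and names are illustrative, not the authors' API.

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

rng = np.random.default_rng(1)
# Simulated sensor log: two operating regimes plus two glitch readings.
readings = np.vstack([
    rng.normal([22.0, 40.0], 0.5, (50, 2)),   # regime A: ~22 C, ~40% humidity
    rng.normal([28.0, 65.0], 0.5, (50, 2)),   # regime B: ~28 C, ~65% humidity
    [[60.0, 5.0], [-10.0, 99.0]],             # sensor glitches
])

kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(readings)
dbscan_labels = DBSCAN(eps=2.0, min_samples=5).fit_predict(readings)

# DBSCAN flags the glitches as noise (label -1), which k-means cannot do:
print("noise points:", int((dbscan_labels == -1).sum()))
```

This contrast (density-based noise detection vs. forced partitioning) is the usual reason a sensor library exposes both algorithm families.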
Procedia PDF Downloads 378
25473 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and that come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of actors, roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and a classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.
Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review
Procedia PDF Downloads 162
25472 On the Relation between λ-Symmetries and μ-Symmetries of Partial Differential Equations
Authors: Teoman Ozer, Ozlem Orhan
Abstract:
This study deals with the symmetry group properties and conservation laws of partial differential equations. We give a geometrical interpretation of the notion of μ-prolongations of vector fields and of the related concept of μ-symmetry for partial differential equations. We show that these are effective in providing symmetry reductions of partial differential equations and systems, and in obtaining invariant solutions.
Keywords: λ-symmetry, μ-symmetry, classification, invariant solution
Procedia PDF Downloads 319
25471 Volcanoscape Space Configuration Zoning Based on Disaster Mitigation by Utilizing GIS Platform in Mt. Krakatau Indonesia
Authors: Vega Erdiana Dwi Fransiska, Abyan Rai Fauzan Machmudin
Abstract:
Space configuration zoning is the very first step of complete space configuration and region planning. Zoning aims to codify discrete knowledge based on local wisdom: ancient predecessors scientifically studied the signs of natural disasters, and an ethnographic approach can operationalize this knowledge. There are three main functions of space zoning: a control function, a guidance function, and an additional function. The control function refers to an instrument for development control and is one of the essentials in controlling land use. The guidance function serves as guidance for proposing operational planning and technical development or land usage. The additional function is useful as a supplement to region or province planning details. This phase likewise serves to define boundaries in open space based on geographical appearance. The informants, categorized as elders, live in an earthquake-prone area, namely the surroundings of Mount Krakatau. The collected data were analyzed with a thematic model and later verified. In space zoning, a long-range distance sensor is applied to visualize the area to be zoned before the survey step that validates the data. The data obtained from the long-range distance sensor and the site survey were overlaid using a GIS platform. Compared with the local wisdom known by the elderly in the area, some of this knowledge is relevant to the research while the rest is not. The site survey, the interpretation of the long-range distance sensor, and the determination of space zoning considering various aspects resulted in a pattern map of space zoning. This map can be integrated with mitigation of disasters caused by volcanic eruption.
Keywords: elderly, GIS platform, local wisdom, space zoning
Procedia PDF Downloads 255
25470 Automatic Reporting System for Transcriptome Indel Identification and Annotation Based on Snapshot of Next-Generation Sequencing Reads Alignment
Authors: Shuo Mu, Guangzhi Jiang, Jinsa Chen
Abstract:
The analysis of Indels in RNA sequencing of clinical samples is easily affected by sequencing experiment errors and software selection. In order to improve the efficiency and accuracy of analysis, we developed an automatic reporting system for Indel recognition and annotation based on image snapshots of transcriptome read alignments. This system includes sequence local-assembly and realignment, target point snapshot, and image-based recognition processes. We integrated a high-confidence Indel dataset from several known databases as a training set to improve the accuracy of image processing, and added a bioinformatic processing module to annotate and filter Indel artifacts. Subsequently, the system automatically generates a report including data quality levels and image results. Sanger sequencing verification of the reference Indel mutations of cell line NA12878 showed that the process can achieve 83% sensitivity and 96% specificity. Analysis of the collected clinical samples showed that the interpretation accuracy of the process was equivalent to that of manual inspection, while the processing efficiency showed a significant improvement. This work shows the feasibility of accurate Indel analysis of clinical next-generation sequencing (NGS) transcriptomes. This result may be useful for RNA studies of clinical samples with microsatellite instability in future immunotherapy.
Keywords: automatic reporting, indel, next-generation sequencing, NGS, transcriptome
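The validation figures above come from a standard confusion-matrix calculation. The counts in the sketch below are hypothetical, chosen only to reproduce the reported 83% / 96% figures; the abstract does not state the actual validation set sizes.

```python
# Sensitivity and specificity from a confusion matrix.

def sensitivity(tp, fn):
    """True positive rate: fraction of real Indels the pipeline recovers."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: fraction of non-Indel sites correctly rejected."""
    return tn / (tn + fp)

tp, fn = 83, 17   # hypothetical: Sanger-confirmed Indels found / missed
tn, fp = 96, 4    # hypothetical: non-Indel sites rejected / falsely called
print(f"sensitivity = {sensitivity(tp, fn):.2f}, specificity = {specificity(tn, fp):.2f}")
```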
Procedia PDF Downloads 191
25469 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study
Authors: Atif Zafar, Fan Haijun
Abstract:
A sequence of different reservoir engineering methods and tools for reservoir characterization and field development is presented in this paper, using real data from the Jin Gas Field of the L-Basin of Pakistan. The basic concept behind this work is to highlight the importance of well test analysis in a broader sense (i.e., reservoir characterization and field development), rather than merely determining permeability and skin parameters. Normally, in reservoir characterization we rely on well test analysis to some extent, but for field development planning, well test analysis has become a forgotten tool, specifically for siting new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development, when the location of a new development well was marked only on the basis of geological and geophysical (G&G) data. The seismic interpretation could not detect one of the boundaries (a fault, sub-seismic fault, or heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the model from the well test analysis played a crucial role in proposing the location of the second well of the newly discovered field. The results from the different well test analysis methods for the Jin Gas Field are also integrated with and supported by other reservoir engineering tools, i.e., the Material Balance Method and the Volumetric Method. In this way, a comprehensive workflow and algorithm are obtained for integrating well test analyses with geological and geophysical analyses for reservoir characterization and field development. On the strong basis of this workflow and algorithm, it was evaluated that the proposed location of the new development well was not justified and that it must lie somewhere other than the southern direction.
Keywords: field development plan, reservoir characterization, reservoir engineering, well test analysis
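The Material Balance Method mentioned above, in its simplest gas-reservoir form, states that for a volumetric gas reservoir p/z declines linearly with cumulative production Gp, and extrapolating the line to p/z = 0 yields the original gas in place. The numbers below are synthetic, not Jin Gas Field data.

```python
import numpy as np

# Synthetic surveillance data: cumulative production Gp (Bscf) and measured p/z (psia).
gp = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
p_z = np.array([4000.0, 3600.0, 3210.0, 2790.0, 2400.0])  # roughly linear decline

# Fit the straight line p/z = intercept + slope * Gp and extrapolate to p/z = 0.
slope, intercept = np.polyfit(gp, p_z, 1)
ogip = -intercept / slope  # Gp at which p/z reaches zero = original gas in place
print(f"OGIP ~ {ogip:.0f} Bscf")
```

A well-test-derived boundary that shrinks the connected volume would show up here as an OGIP smaller than the volumetric estimate, which is exactly the cross-check the paper's workflow relies on.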
Procedia PDF Downloads 364
25468 Assessing P0.1 and Occlusion Pressures in Brain-Injured Patients on Pressure Support Ventilation: A Study Protocol
Authors: S. B. R. Slagmulder
Abstract:
Monitoring inspiratory effort and dynamic lung stress in patients on pressure support ventilation in the ICU is important for protecting against self inflicted lung injury (P-SILI) and diaphragm dysfunction. Strategies to address the detrimental effects of respiratory drive and effort can lead to improved patient outcomes. Two non-invasive estimation methods, occlusion pressure (Pocc) and P0.1, have been proposed for achieving lung and diaphragm protective ventilation. However, their relationship and interpretation in neuro ICU patients is not well understood. P0.1 is the airway pressure measured during a 100-millisecond occlusion of the inspiratory port. It reflects the neural drive from the respiratory centers to the diaphragm and respiratory muscles, indicating the patient's respiratory drive during the initiation of each breath. Occlusion pressure, measured during a brief inspiratory pause against a closed airway, provides information about the inspiratory muscles' strength and the system's total resistance and compliance. Research Objective: Understanding the relationship between Pocc and P0.1 in brain-injured patients can provide insights into the interpretation of these values in pressure support ventilation. This knowledge can contribute to determining extubation readiness and optimizing ventilation strategies to improve patient outcomes. The central goal is to asses a study protocol for determining the relationship between Pocc and P0.1 in brain-injured patients on pressure support ventilation and their ability to predict successful extubation. Additionally, comparing these values between brain-damaged and non-brain-damaged patients may provide valuable insights. Key Areas of Inquiry: 1. How do Pocc and P0.1 values correlate within brain injury patients undergoing pressure support ventilation? 2. To what extent can Pocc and P0.1 values serve as predictive indicators for successful extubation in patients with brain injuries? 3. 
What differentiates the Pocc and P0.1 values between patients with brain injuries and those without? Methodology: P0.1 and occlusion pressures are standard measurements for pressure support ventilation patients, taken by attending doctors as per protocol. We utilize electronic patient records for existing data. An unpaired t-test will be conducted to compare P0.1 and Pocc values between both study groups. Associations between P0.1 and Pocc and other study variables, such as extubation, will be explored with simple regression and correlation analysis. Depending on how the data evolve, subgroup analysis will be performed for patients with and without extubation failure. Results: While it is anticipated that neuro patients may exhibit high respiratory drive, the linkage between such elevation, quantified by P0.1, and successful extubation remains unknown. The analysis will focus on determining the ability of these values to predict successful extubation and their potential impact on ventilation strategies. Conclusion: Further research is pending to fully understand the potential of these indices and their impact on mechanical ventilation in different patient populations and clinical scenarios. Understanding these relationships can aid in determining extubation readiness and tailoring ventilation strategies to improve patient outcomes in this specific patient population. Additionally, it is vital to account for the influence of sedatives, neurological scores, and BMI on respiratory drive and occlusion pressures to ensure a comprehensive analysis. Keywords: brain damage, diaphragm dysfunction, occlusion pressure, P0.1, respiratory drive
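The planned analysis (an unpaired t-test between groups, plus correlation between P0.1 and Pocc) can be sketched in a few lines. This is an illustrative sketch only: the numeric values below are invented placeholders, not study data.

```python
# Sketch of the abstract's analysis plan: unpaired (Welch) t-test between
# groups and Pearson correlation between paired P0.1 and Pocc values.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's unpaired t-statistic for two independent samples."""
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical P0.1 values (cmH2O) for the two study groups
p01_neuro   = [2.8, 3.5, 4.1, 3.9, 4.6]
p01_control = [1.2, 1.8, 2.1, 1.5, 2.4]
t_stat = welch_t(p01_neuro, p01_control)

# Hypothetical paired P0.1 and Pocc measurements from the same patients
p01  = [1.5, 2.0, 2.8, 3.5, 4.1]
pocc = [5.0, 7.2, 9.1, 12.5, 14.0]
r = pearson_r(p01, pocc)
```

In practice such tests would be run with a statistics package; the point is only the shape of the comparison the protocol describes.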
Procedia PDF Downloads 68
25467 Government Big Data Ecosystem: A Systematic Literature Review
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Data that is high in volume, velocity, and veracity and comes from a variety of sources is usually generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data in public administration and gaps in the government data ecosystems literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas like humanitarian data, open government data, scientific research data, industry data, etc. Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, e-government, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review
Procedia PDF Downloads 229
25466 A Machine Learning Decision Support Framework for Industrial Engineering Purposes
Authors: Anli Du Preez, James Bekker
Abstract:
Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap where data is not explored due to the lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, which are a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application. Keywords: data analytics, industrial engineering, machine learning, value creation
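One way to picture such a decision-support aid is as a rule base mapping an application's purpose to candidate algorithm families. This is not the authors' framework, only a minimal sketch of the idea; the mapping below is an invented illustration.

```python
# Hypothetical purpose-to-algorithm rule base, sketching how a decision
# support framework could guide an inexperienced analyst.
GUIDE = {
    ("predict", "category"): ["logistic regression", "random forest", "SVM"],
    ("predict", "number"):   ["linear regression", "gradient boosting"],
    ("group",   None):       ["k-means", "hierarchical clustering"],
    ("reduce",  None):       ["PCA", "autoencoder"],
}

def suggest(purpose, target=None):
    """Return candidate algorithm families for the stated purpose."""
    return GUIDE.get((purpose, target), ["consult a specialist"])
```

For example, an analyst who wants to "group" customers with no labeled target would be pointed to clustering methods.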
Procedia PDF Downloads 168
25465 Geoelectrical Investigation Around Bomo Area, Kaduna State, Nigeria
Authors: B. S. Jatau, Baba Adama, S. I. Fadele
Abstract:
Electrical resistivity investigation was carried out around the Bomo area, Zaria, Kaduna State, in order to study the subsurface geologic layers with a view to determining the depth to the bedrock and the thickness of the geologic layers. Vertical Electrical Sounding (VES) using the Schlumberger array was carried out at fifteen (15) VES stations. An ABEM terrameter (SAS 300) was used for the data acquisition. The field data obtained were analyzed using computer software (IPI2win), which gives an automatic interpretation of the apparent resistivity. The VES results revealed the heterogeneous nature of the subsurface geological sequence. The geologic sequence beneath the study area is composed of hard-pan topsoil (clayey and sandy-lateritic), weathered layer, partly weathered or fractured basement, and fresh basement. The resistivity value for the topsoil layer varies from 40 Ωm to 450 Ωm with thickness ranging from 1.25 to 7.5 m. The weathered basement has resistivity values ranging from 50 Ωm to 593 Ωm and thickness between 1.37 and 20.1 m. The fractured basement has resistivity values ranging from 218 Ωm to 520 Ωm and thickness between 12.9 and 26.3 m. The fresh basement (bedrock) has resistivity values ranging from 1215 Ωm to 2150 Ωm with infinite depth. However, the depth from the earth's surface to the bedrock surface varies between 2.63 and 34.99 m. The study further stressed the importance of the findings for civil engineering structures and groundwater prospecting. Keywords: electrical resistivity, CERT (CT), vertical electrical sounding (VES), top soil (TP), weathered basement (WB), partly weathered basement (PWB), fresh basement (FB)
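The depth to bedrock reported above follows directly from the interpreted layer model: it is the sum of the thicknesses of the layers overlying the fresh basement. A small sketch, using hypothetical values consistent with the ranges reported in the abstract:

```python
# Hypothetical single-station layer model: (name, resistivity in ohm-m,
# thickness in m); the fresh basement is modeled as infinitely thick.
layers = [
    ("topsoil",            320, 2.0),
    ("weathered basement", 210, 11.4),
    ("fractured basement", 430, 18.6),
    ("fresh basement",    1600, None),
]

# Depth to bedrock = total thickness of the layers above the fresh basement.
depth_to_bedrock = sum(t for _, _, t in layers if t is not None)
```

The resulting 32.0 m falls within the 2.63 to 34.99 m range reported across the fifteen VES stations.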
Procedia PDF Downloads 328
25464 A Review of Feature Selection Methods Implemented in Neural Stem Cells
Authors: Natasha Petrovska, Mirjana Pavlovic, Maria M. Larrondo-Petrie
Abstract:
Neural stem cells (NSCs) are multi-potent, self-renewing cells that generate new neurons. Three subtypes of NSCs can be distinguished according to the stage of the NSC lineage: quiescent neural stem cells (qNSCs), activated neural stem cells (aNSCs), and neural progenitor cells (NPCs), but their gene expression signatures are not yet fully understood. Single-cell examinations have started to elucidate the complex structure of NSC populations. Nevertheless, there is a lack of thorough molecular interpretation of the NSC lineage heterogeneity and an increasing need for tools to analyze and improve the efficiency and correctness of single-cell sequencing data. Feature selection and ordering can identify and classify the gene expression signatures of these subtypes and can discover novel subpopulations during the NSC activation and differentiation processes. The aim here is to review the implementation of feature selection techniques on NSC subtypes and the classification techniques that have been used for the identification of gene expression signatures. Keywords: feature selection, feature similarity, neural stem cells, genes, feature selection methods
Procedia PDF Downloads 152
25463 Feature Analysis of Predictive Maintenance Models
Authors: Zhaoan Wang
Abstract:
Research in predictive maintenance modeling has advanced in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important ones. Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation
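The linear-explainer idea mentioned above has a compact closed form: for a linear model with independent features, the SHAP value of feature i at input x is w_i * (x_i - E[x_i]). A minimal sketch with invented coefficients and sensor readings (not the paper's models or data):

```python
# SHAP values for a linear model: phi_i = w_i * (x_i - E[x_i]).
# Their sum equals f(x) - f(E[x]) (the "local accuracy" property).
def linear_shap(weights, x, background_means):
    return [w * (xi - mu) for w, xi, mu in zip(weights, x, background_means)]

weights = [0.5, -1.2, 0.0]   # hypothetical model coefficients
means   = [10.0, 3.0, 7.0]   # feature means over a background data set
x       = [12.0, 3.0, 9.0]   # one machine's sensor readings
phi = linear_shap(weights, x, means)
# A zero-weight feature contributes nothing: exactly the kind of ranking
# signal used to prioritize sensor deployment and data collection.
```

Averaging |phi_i| over many inputs gives the global feature-importance ranking the abstract refers to.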
Procedia PDF Downloads 133
25462 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm
Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima
Abstract:
In our present world, we are generating a lot of data, and we need a specific place to store all these data. Generally, we store data on pen drives, hard drives, etc. Sometimes we may lose the data due to the corruption of devices. To overcome all these issues, we implemented a cloud space for storing the data, and it provides more security for the data. We can access the data just by using the internet from anywhere in the world. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, he does not have any rights to change the data. Users' uploaded files are stored in the cloud with the system time as the file name, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB. Keywords: cloud space, AES, FTP, NetBeans IDE
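The storage scheme described (system-time file name, randomly named directory, 2 MB limit) can be sketched as below. This is an assumption of how such a scheme might look, written in Python for illustration; the authors implemented theirs in Java.

```python
# Sketch of the described upload path: reject files of 2 MB or more, place
# the file in a randomly named directory, and name it with the system time.
import os
import random
import string
import time

MAX_SIZE = 2 * 1024 * 1024  # 2 MB upload limit

def store(data: bytes, root="cloud_space"):
    if len(data) >= MAX_SIZE:
        raise ValueError("file exceeds 2 MB limit")
    directory = "".join(random.choices(string.ascii_lowercase, k=8))
    path = os.path.join(root, directory)
    os.makedirs(path, exist_ok=True)
    name = str(int(time.time() * 1000))  # system time as file name
    full = os.path.join(path, name)
    with open(full, "wb") as f:
        f.write(data)
    return full
```

Obscuring the path this way hides file identity from casual browsing; the AES encryption named in the keywords would be applied to the bytes before writing.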
Procedia PDF Downloads 206
25461 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to 1) proactively address vulnerabilities and bugs: formal methods and abstract interpretation techniques identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability: combining static analysis using abstract interpretation, with full context sensitivity and hardware memory awareness, allows for a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment. Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
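The abstract-interpretation idea can be shown in miniature. This toy (not the TrustInSoft analyzer, which works on real C/C++ semantics) propagates an interval covering all possible inputs, so an out-of-bounds access can be proven absent without executing the program on any concrete input:

```python
# Toy interval-domain abstract interpretation: track (lo, hi) bounds for a
# value instead of a single concrete number.
def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def check_index(idx_interval, array_len):
    """True => every concrete value in the interval is a safe index."""
    lo, hi = idx_interval
    return 0 <= lo and hi < array_len

# user_input is only known to lie in [0, 5]; the program computes input + 2.
idx = interval_add((0, 5), (2, 2))   # abstract result: (2, 7)
safe_in_10 = check_index(idx, 10)    # proven safe for a length-10 buffer
safe_in_6  = check_index(idx, 6)     # not provable -> report a potential defect
```

Unlike fuzzing, which samples inputs, this covers the whole input range at once; the cost is over-approximation, which sound analyzers work to keep tight so that false positives stay near zero.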
Procedia PDF Downloads 19
25460 Fabrication of a Potential Point-of-Care Device for Hemoglobin A1c: A Lateral Flow Immunosensor
Authors: Shu Hwang Ang, Choo Yee Yu, Geik Yong Ang, Yean Yean Chan, Yatimah Binti Alias, and Sook Mei Khor
Abstract:
With the high prevalence of Type 2 diabetes mellitus across the world, the morbidities and mortalities associated with Type 2 diabetes have a significant impact on a nation's productive workforce. With routine scheduled clinical visits to manage Type 2 diabetes, diabetic patients with hectic lifestyles can have low clinical compliance, which often decreases the effectiveness of diabetic management personalized for each patient. Here, we report a point-of-care (POC) device that detects glycated hemoglobin (HbA1c, a biomarker for long-term Type 2 diabetes management). The established POC devices certified for use in clinical settings are not only expensive ($8 to $10 per test); they also require skillful practitioners to perform sampling and interpretation. As a paper-based biosensor, the developed HbA1c biosensor utilizes the lateral flow principle to offer an alternative: a cost-effective (approximately $2 per test) and end-user-friendly device for household testing. Requiring as little as 2 µL of finger-pricked blood, the test can be performed in the household with just simple dilution and washings. With visual interpretation of the number of test lines shown on the developed biosensor, it can be read as easily as a urine pregnancy test, aided by the scale of intensity provided. In summary, the developed HbA1c immunosensor has been tested to have high selectivity towards HbA1c and is stable, with reasonably good performance in clinical testing. Therefore, our developed HbA1c immunosensor has high potential to be an effective diabetic management tool to increase patient compliance and thus contain the progression of diabetes. Keywords: blood, glycated hemoglobin (HbA1c), lateral flow, type 2 diabetes mellitus
Procedia PDF Downloads 528
25459 Students' Errors in Translating Algebra Word Problems to Mathematical Structure
Authors: Ledeza Jordan Babiano
Abstract:
Translating statements into mathematical notation is one of the processes in word problem-solving. However, based on the literature, students still have difficulties with this skill. The purpose of this study was to investigate the translation errors of students when translating algebraic word problems into mathematical structures and to locate the errors through the lens of the Translation-Verification Model. This qualitative research study employed content analysis. During the data-gathering process, the students were asked to answer a six-item algebra word problem questionnaire, and their answers were analyzed by experts through blind coding using the Translation-Verification Model to determine their translation errors. After this, a focus group discussion was conducted, and the data gathered were analyzed through thematic analysis to determine the causes of the students' translation errors. It was found that the students' most prevalent error in translation was the interpretation error, which was situated in the Attribute construct. The emerging themes during the FGD were: (1) the procedure of translation is strategically incorrect; (2) lack of comprehension; (3) difficulty with related algebra concepts; (4) lack of spatial skills; (5) unpreparedness for independent learning; and (6) the content of the problem is developmentally inappropriate. These themes boiled down to the major concept of independent learning preparedness in solving mathematical problems. This concept has subcomponents, which include contextual and conceptual factors in translation. Consequently, the results provide implications for instructors and professors of mathematics to innovate their teaching pedagogies and strategies to address translation gaps among students. Keywords: mathematical structure, algebra word problems, translation, errors
Procedia PDF Downloads 49
25458 Business Intelligence for Profiling of Telecommunication Customer
Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro
Abstract:
Business intelligence is a methodology that exploits data to produce information and knowledge systematically; it can thereby support the decision-making process. Some methods in business intelligence are data warehousing and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply Kimball's dimensional modelling. Data mining is used to extract patterns from the data and gain insight from it. Data mining has many techniques, one of which is segmentation. For profiling of telecommunication customers, we use customer segmentation according to customers' usage of services, customer invoices, and customer payments. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation. As input variables for that algorithm, we use the RFM (Recency, Frequency, and Monetary) model. For all data mining processes, we use the IBM SPSS Modeler tool. Keywords: business intelligence, customer segmentation, data warehouse, data mining
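The segmentation step can be sketched compactly: K-Means over RFM vectors, one per customer. This is a minimal pure-Python illustration of the technique, not the paper's pipeline (which used IBM SPSS Modeler); the customer records below are invented.

```python
# Minimal K-Means over RFM (Recency, Frequency, Monetary) customer vectors.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster (keep it if empty).
        centers = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

# Invented records: (recency in days, frequency of service usage, invoice total)
rfm = [(5, 40, 900), (7, 35, 850), (90, 2, 30),
       (120, 1, 20), (6, 38, 870), (100, 3, 25)]
centers, clusters = kmeans(rfm, k=2)
```

In a real pipeline the RFM features would be scaled first (here the monetary axis dominates the distance), and the resulting clusters read off as profiles such as "recent, frequent, high-value" versus "lapsed, low-value".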
Procedia PDF Downloads 484
25457 Structured-Ness and Contextual Retrieval Underlie Language Comprehension
Authors: Yao-Ying Lai, Maria Pinango, Ashwini Deo
Abstract:
While grammatical devices are essential to language processing, how comprehension utilizes cognitive mechanisms is less emphasized. This study addresses this issue by probing the complement coercion phenomenon: an entity-denoting complement following verbs like begin and finish receives an eventive interpretation. For example, (1) “The queen began the book” receives an agentive reading like (2) “The queen began [reading/writing/etc.…] the book.” Such sentences engender additional processing cost in real-time comprehension. The traditional account attributes this cost to an operation that coerces the entity-denoting complement to an event, assuming that these verbs require eventive complements. However, in closer examination, examples like “Chapter 1 began the book” undermine this assumption. An alternative, Structured Individual (SI) hypothesis, proposes that the complement following aspectual verbs (AspV; e.g. begin, finish) is conceptualized as a structured individual, construed as an axis along various dimensions (e.g. spatial, eventive, temporal, informational). The composition of an animate subject and an AspV such as (1) engenders an ambiguity between an agentive reading along the eventive dimension like (2), and a constitutive reading along the informational/spatial dimension like (3) “[The story of the queen] began the book,” in which the subject is interpreted as a subpart of the complement denotation. Comprehenders need to resolve the ambiguity by searching contextual information, resulting in additional cost. To evaluate the SI hypothesis, a questionnaire was employed. Method: Target AspV sentences such as “Shakespeare began the volume.” were preceded by one of the following types of context sentence: (A) Agentive-biasing, in which an event was mentioned (…writers often read…), (C) Constitutive-biasing, in which a constitutive meaning was hinted (Larry owns collections of Renaissance literature.), (N) Neutral context, which allowed both interpretations. 
Thirty-nine native speakers of English were asked to (i) rate each context-target sentence pair on a 1~5 scale (5 = fully understandable), and (ii) choose possible interpretations for the target sentence given the context. The SI hypothesis predicts that comprehension is harder for the Neutral condition, as compared to the biasing conditions, because no contextual information is provided to resolve the ambiguity. Also, comprehenders should obtain the specific interpretation corresponding to the context type. Results: (A) Agentive-biasing and (C) Constitutive-biasing were rated higher than (N) Neutral conditions (p < .001), while all conditions were within the acceptable range (> 3.5 on the 1~5 scale). This suggests that, when relevant contextual information is lacking, semantic ambiguity decreases comprehensibility. The interpretation task shows that the participants selected the biased agentive/constitutive reading for conditions (A) and (C), respectively. For the Neutral condition, the agentive and constitutive readings were chosen equally often. Conclusion: These findings support the SI hypothesis: the meaning of AspV sentences is conceptualized as a parthood relation involving structured individuals. We argue that semantic representation makes reference to spatial structured-ness (an abstracted axis). To obtain an appropriate interpretation, comprehenders utilize contextual information to enrich the conceptual representation of the sentence in question. This study connects semantic structure to humans' conceptual structure and provides a processing model that incorporates contextual retrieval. Keywords: ambiguity resolution, contextual retrieval, spatial structured-ness, structured individual
Procedia PDF Downloads 333
25456 Mobile Augmented Reality for Collaboration in Operation
Authors: Chong-Yang Qiao
Abstract:
Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators with interactive visualization of data and procedures, making potential equipment and systems understandable. Operators remotely communicate and coordinate with each other for continuous tasks and for information and data exchange between the control room and the work site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is the improvement of user experience in cooperative work through applying augmented reality in the traditional industrial field. The purpose of this exploratory study is to find the cognitive model for multiple-task performance with MAR. In particular, the focus will be on the comparison between different tasks and environment factors which influence information processing. Three experiments use interface and interaction design, with the content of start-up, maintenance, and stop procedures embedded in the mobile application. With the evaluation criteria of time demands and human errors, and analysis of the mental process and behavioral actions during the multiple tasks, heuristic evaluation was used to find the operators' performance under different situational factors and to record the information processing in recognition, interpretation, judgment, and reasoning. The research will identify the functional properties of MAR and constrain the development of the cognitive model. Conclusions can be drawn that suggest MAR is easy to use and useful for operators in remote collaborative work. Keywords: mobile augmented reality, remote collaboration, user experience, cognition model
Procedia PDF Downloads 197
25455 Portrayal of Foreign Culture in Pakistani Newspapers
Authors: Ghulam Shabir, Masood Nadeem
Abstract:
This research examined the portrayal of foreign culture, including film, art, and drama, in Pakistani English newspapers (Dawn and The News). For this purpose, the weekly newspapers of three months (January to March) of the years 1990, 1995, 2000, 2005, and 2010 were analyzed. Content analysis was employed for data interpretation and to draw inferences. The study explored to what extent foreign culture has been depicted in our print media in the form of film, art, and drama in comparison to the Pakistani cultural context. The qualitative analysis revealed that Pakistani English newspapers gave more coverage to foreign culture. Issues related to Pakistani film, art, and drama have been portrayed less in the form of stories, columns, pictures, and news about music, fashion, ceremonies, programs, and shows. However, most of the space has been occupied by Western and Indian pictures and news about music, fashion, ceremonies, programs, and shows on the cultural pages of these English newspapers. Keywords: newspapers, portrayal of foreign culture, qualitative analysis, Pakistani English newspapers
Procedia PDF Downloads 511