Search results for: imbalanced data with class overlap
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26404

25594 Evaluating the Impact of Expansion on Urban Thermal Surroundings: A Case Study of Lahore Metropolitan City, Pakistan

Authors: Usman Ahmed Khan

Abstract:

Urbanization directly affects existing infrastructure, landscape modification, environmental contamination, and traffic pollution, especially where urban planning is lacking. Recently, rapid urban sprawl has reduced green areas and has had devastating environmental consequences. This study aimed to quantify past urban expansion rates and to measure LST from satellite data. Land use land cover (LULC) maps for the years 1996, 2010, 2013, and 2017 were generated from Landsat satellite images. Four main classes, i.e., water, urban, bare land, and vegetation, were identified using unsupervised classification with the iterative self-organizing data analysis (ISODATA) technique. LST can be derived from satellite thermal data through several procedures: atmospheric and radiometric calibration, surface emissivity correction, and classification of spatial variability in land cover. Different methods and formulas were combined in an algorithm that successfully retrieves the land surface temperature, allowing the thermal environment of the ground surface to be studied. To verify the algorithm, the retrieved land surface temperature was compared with the near-surface air temperature. The results showed that, from 1996 to 2017, urban areas increased considerably, by about 48%. A few areas of the city also showed a reduction in LST over 1996-2017; these were areas that had just begun their transition from rural to urban LULC. The mean temperature of the city increased by about 1 ºC per year on average for the month of October. Green and vegetated areas decreased in extent, while the number of pixels assigned to the urban class increased.
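The single-channel retrieval chain summarized above (radiometric calibration, brightness temperature, emissivity correction) can be sketched as follows. The gain/offset and K1/K2 values in the example call are illustrative placeholders taken from typical Landsat 5 TM band 6 metadata, not figures from the study:

```python
import math

def lst_from_dn(dn, ml, al, k1, k2, emissivity, wavelength_um=11.45):
    """Single-channel LST retrieval from a thermal-band digital number (DN).

    ml, al     -- radiometric rescaling gain/offset from the scene metadata
    k1, k2     -- sensor-specific thermal conversion constants
    emissivity -- surface emissivity, often assigned per LULC class
    """
    radiance = ml * dn + al                    # DN -> TOA spectral radiance
    bt = k2 / math.log(k1 / radiance + 1.0)    # brightness temperature (K)
    rho = 1.438e-2                             # h*c / Boltzmann constant, in m*K
    lam = wavelength_um * 1e-6                 # band-centre wavelength (m)
    lst_kelvin = bt / (1.0 + (lam * bt / rho) * math.log(emissivity))
    return lst_kelvin - 273.15                 # Kelvin -> degrees Celsius

# Example with nominal Landsat 5 TM band 6 constants (K1 = 607.76, K2 = 1260.56)
example_lst = lst_from_dn(130, 0.055375, 1.18243, 607.76, 1260.56, 0.95)
```

Because the emissivity correction divides by a factor below one, the retrieved LST always sits slightly above the raw brightness temperature.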

Keywords: LST, LULC, isodata, urbanization

Procedia PDF Downloads 99
25593 Gender Bias and the Role It Plays in Student Evaluation of Instructors

Authors: B. Garfolo, L. Kelpsh, R. Roak, R. Kuck

Abstract:

Often, student ratings of instructors play a significant role in the career path of an instructor in higher education. So then, how does a student judge the effectiveness of an instructor's teaching? This question has been addressed by literally thousands of studies in the literature. Yet, why does this question still persist? A literature review reveals that while it is true that student evaluations of instructors can be biased, there is still a considerable amount of work to be done in understanding why. As student evaluations of instructors can be used in a variety of settings (formative or summative), it is critical to understand the nature of the bias. The authors believe that not only is some bias possible in student evaluations, it should be expected, for the simple reason that a student evaluation is a human activity and, as such, relies upon perception and interpersonal judgment. Consequently, student ratings are affected by the same factors that can potentially affect any rater’s judgment, such as stereotypes based on gender, culture, race, etc. Previous study findings suggest that student evaluations of teacher effectiveness differ between male and female raters. However, even though studies have shown that instructor gender does play an important role in influencing student ratings, the exact nature and extent of that role remains the subject of debate. Researchers, in their attempt to define good teaching, have looked for differences in student evaluations based on a variety of characteristics such as course type, class size, ability level of the student, and grading practices, in addition to instructor and student characteristics (gender, age, etc.), with inconsistent results. If a student evaluation reflects more than an instructor’s teaching ability, for example, a characteristic such as gender, then this information must be taken into account if the evaluation is to have meaning with respect to instructor assessment.
While the authors concede that it is difficult or nearly impossible to separate gender from student perception of teaching practices in person, it is possible to shield an instructor’s gender identity in an online teaching experience. The online teaching modality presents a unique opportunity to experiment directly with gender identity. Analyzing the differences in individuals' online behavior when they perceive that they are interacting with a male or a female could provide a wealth of data on how gender influences student perceptions of teaching effectiveness. Given the importance of the role student ratings play in hiring, retention, promotion, tenure, and salary deliberations in academic careers, this question warrants further attention, as it is important to be aware of possible bias in student evaluations if they are to be used at all in academic considerations. For experimental purposes, the authors constructed an online class in which each instructor operated under two different gender identities. In this study, each instructor taught multiple sections of the same class using both a male identity and a female identity. The study examined student evaluations of teaching based on certain student and instructor characteristics in order to determine if and where male and female students might differ in their ratings of instructors based on instructor gender. Additionally, the authors examined whether there are differences between undergraduate and graduate students' ratings with respect to the experimental criteria.

Keywords: gender bias, ethics, student evaluations, student perceptions, online instruction

Procedia PDF Downloads 262
25592 Tip60’s Novel RNA-Binding Function Modulates Alternative Splicing of Pre-mRNA Targets Implicated in Alzheimer’s Disease

Authors: Felice Elefant, Akanksha Bhatnaghar, Keegan Krick, Elizabeth Heller

Abstract:

Context: The severity of Alzheimer’s Disease (AD) progression involves an interplay of genetics, age, and environmental factors orchestrated by histone acetyltransferase (HAT) mediated neuroepigenetic mechanisms. While disruption of Tip60 HAT action in neural gene control is implicated in AD, alternative mechanisms underlying Tip60 function remain unexplored. Altered RNA splicing has recently been highlighted as a widespread hallmark of the AD transcriptome that is implicated in the disease. Research Aim: The aim of this study was to identify a novel RNA binding/splicing function for Tip60 in the human hippocampus and to determine whether this function is impaired in brains from AD fly models and AD patients. Methodology/Analysis: The authors performed RNA immunoprecipitation using RNA isolated from 200 pooled wild-type Drosophila brains for each of the 3 biological replicates. To identify Tip60’s RNA targets, they performed genome sequencing (DNB-SequencingTM technology, BGI genomics) on 3 replicates for input RNA and RNA IPs by Tip60. Findings: The authors' transcriptomic analysis of RNA bound to Tip60 by Tip60-RNA immunoprecipitation (RIP) revealed Tip60 RNA targets enriched for critical neuronal processes implicated in AD. Remarkably, 79% of Tip60’s RNA targets overlap with its chromatin gene targets, supporting a model by which Tip60 orchestrates bi-level transcriptional regulation at both the chromatin and RNA level, a function unprecedented for any HAT to date. Since RNA splicing occurs co-transcriptionally and splicing defects are implicated in AD, the authors investigated whether Tip60-RNA targeting modulates splicing decisions and whether this function is altered in AD. Replicate multivariate analysis of transcript splicing (rMATS) of RNA-Seq data sets from wild-type and AD fly brains revealed a multitude of mammalian-like alternative splicing (AS) defects.
Strikingly, over half of these altered RNAs were bona fide Tip60 RNA targets enriched in the curated AD-gene database, with some AS alterations prevented by increasing Tip60 levels in the fly brain. Importantly, human orthologs of several Tip60-modulated spliced genes in Drosophila are well-characterized aberrantly spliced genes in human AD brains, implicating disruption of Tip60’s splicing function in AD pathogenesis. Theoretical Importance: The authors' findings support a novel RNA interaction and splicing regulatory function for Tip60 that may underlie the AS impairments that hallmark AD etiology. Data Collection: The authors collected data from RNA immunoprecipitation experiments using RNA isolated from 200 pooled wild-type Drosophila brains for each of the 3 biological replicates. They also performed genome sequencing (DNB-SequencingTM technology, BGI genomics) on 3 replicates for input RNA and RNA IPs by Tip60. Questions: The question addressed by this study was whether Tip60 has a novel RNA binding/splicing function in the human hippocampus and whether this function is impaired in brains from AD fly models and AD patients. Conclusions: The authors' findings support a novel RNA interaction and splicing regulatory function for Tip60 that may underlie the AS impairments that hallmark AD etiology.

Keywords: Alzheimer's disease, cognition, aging, neuroepigenetics

Procedia PDF Downloads 70
25591 Optimal Data Selection in Non-Ergodic Systems: A Tradeoff between Estimator Convergence and Representativeness Errors

Authors: Jakob Krause

Abstract:

The past financial crisis has shown that contemporary risk management models provide an unjustified sense of security and fail miserably in the situations in which they are needed the most. In this paper, we start from the assumption that risk is a notion that changes over time and that past data points therefore have only limited explanatory power for the current situation. Our objective is to derive the optimal amount of representative information by optimizing between the two adverse forces of estimator convergence, which incentivizes us to use as much data as possible, and the aforementioned non-representativeness, which does the opposite. In this endeavor, the cornerstone assumption of having access to identically distributed random variables is weakened and substituted by the assumption that the law of the data generating process changes over time. Hence, in this paper, we give a quantitative theory of how to perform statistical analysis in non-ergodic systems. As an application, we discuss the impact of a paragraph in the last iteration of proposals by the Basel Committee on Banking Regulation. We start from the premise that the severity of assumptions should correspond to the robustness of the system they describe. Hence, in the formal description of physical systems, the level of assumptions can be much higher. It follows that every concept that is carried over from the natural sciences to economics must be checked for its plausibility in the new surroundings. Most of probability theory has been developed for the analysis of physical systems and is based on the independent and identically distributed (i.i.d.) assumption. In economics, both parts of the i.i.d. assumption are inappropriate. However, only dependence has, so far, been weakened to a sufficient degree. In this paper, an appropriate class of non-stationary processes is used, and their law is tied to a formal object measuring representativeness.
Subsequently, the data set is identified that, on average, minimizes the estimation error stemming from both insufficient and non-representative data. Applications are far-reaching across a variety of fields. In the paper itself, we apply the results to analyze a paragraph in the Basel III framework on banking regulation with severe implications for financial stability. Beyond the realm of finance, other potential applications include the reproducibility crisis in the social sciences (but not in the natural sciences) and modeling limited understanding and learning behavior in economics.
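The convergence-versus-representativeness tradeoff can be illustrated with a toy simulation (our own construction, not the paper's model): a rolling mean estimates a slowly drifting mean, and the window length that minimizes mean-squared error is neither the smallest nor the largest:

```python
import random

def window_mse(data, true_means, n):
    """Mean-squared error of an n-point rolling mean as an estimate of the
    *current* (time-varying) mean -- a toy stand-in for the paper's tradeoff."""
    errs = []
    for t in range(n, len(data)):
        est = sum(data[t - n:t]) / n
        errs.append((est - true_means[t]) ** 2)
    return sum(errs) / len(errs)

random.seed(0)
T = 3000
true_means = [0.002 * t for t in range(T)]            # slowly drifting law
data = [m + random.gauss(0, 1.0) for m in true_means]  # noisy observations

# Small windows suffer estimator noise; large windows average over an
# outdated regime. An intermediate window minimizes total error.
mses = {n: window_mse(data, true_means, n) for n in (5, 50, 500)}
```

With these drift and noise levels, the 50-point window beats both the 5-point window (too noisy) and the 500-point window (too unrepresentative), mirroring the optimization the paper formalizes.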

Keywords: banking regulation, non-ergodicity, risk management, semimartingale modeling

Procedia PDF Downloads 139
25590 The Impact of Resettlement Challenges in Seeking Employment on the Mental Health and Well-Being of African Refugee Youth in South Australia

Authors: Elvis Munyoka

Abstract:

While the number of African refugees settling in Australia has significantly increased since the mid-1990s, the marginalisation and exclusion of young people from refugee backgrounds in employment remain a critical challenge. Unemployment or underemployment can negatively impact refugees in multiple areas, such as income, housing, life satisfaction, and social status. Higher rates of unemployment among refugees are linked in part to the intersection of pre-migration and daily challenges like trauma, racism, gender identity, and English language competency, all of which generate multiple employability disadvantages. However, the intersection of gender, race, social class, and age in shaping African refugee youth’s access to employment has received less attention. Using a qualitative case study approach, the presentation will explore how gender, race, social class, and age influence African refugee youth graduates’ access to employment in South Australia. Intersectionality theory and the capability approach to social justice are used to explore intersecting factors impacting African refugee youth’s access to employment in South Australia. Participants were 16 African refugee graduates aged 18-30 living in South Australia who took part in the study for one year. Based on the trends in the data, the results suggest that long-term unemployment and underemployment, coupled with ongoing racism and marginalisation, have the potential to make refugees more vulnerable to several mental disorders such as depression, hopelessness, and suicidal thoughts. The analysis also reveals that resettlement challenges may limit refugees’ ability to recover from pre-migration trauma. The impact of resettlement challenges on refugee mental health highlights the need for comprehensive policy interventions to address the barriers refugees face in finding employment in resettlement communities.
With African refugees constituting such an important part of Australian society, they should have equal access to meaningful employment, as decent work promotes good mental health, successful resettlement, hope, and self-sufficiency.

Keywords: African refugees, employment, mental health, Australia, underemployment

Procedia PDF Downloads 94
25589 Estimation of Heritability and Repeatability for Pre-Weaning Body Weights of Domestic Rabbits Raised in Derived Savanna Zone of Nigeria

Authors: Adewale I. Adeolu, Vivian U. Oleforuh-Okoleh, Sylvester N. Ibe

Abstract:

Heritability and repeatability estimates are needed for the genetic evaluation of livestock populations and, consequently, for upgrading or improvement programmes. Pooled data on 604 progeny from three consecutive parities of purebred rabbit breeds (Chinchilla, Dutch and New Zealand White) raised in the Derived Savanna Zone of Nigeria were used to estimate heritability and repeatability for pre-weaning body weights between the 1st and 8th week of age. Traits studied include individual kit weight at birth (IKWB) and at the 2nd week (IK2W), 4th week (IK4W), 6th week (IK6W) and 8th week (IK8W). Nested random effects analysis of (co)variances, as described by the Statistical Analysis System (SAS), was employed in the estimation. Respective heritability estimates from the sire component (h2s) and repeatability (R), as intra-class correlations of repeated measurements from the three parities, for IKWB, IK2W, IK4W, IK6W and IK8W are 0.59±0.24, 0.55±0.24, 0.93±0.31, 0.28±0.17, 0.64±0.26 and 0.12±0.14, 0.05±0.14, 0.58±0.02, 0.60±0.11, 0.20±0.14. Heritability and repeatability estimates (except R for IKWB and IK2W) are moderate to high. In conclusion, since pre-weaning body weights in the present study tended to be moderately to highly heritable and repeatable, improvement of rabbits raised in the Derived Savanna Zone can be realized through genetic selection criteria.
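Repeatability as an intra-class correlation of repeated parity records can be sketched from one-way ANOVA variance components. This is a generic illustration of the estimator, not the authors' SAS nested analysis:

```python
def repeatability(records):
    """Intra-class correlation of repeated measurements, one inner list per
    animal with one record per parity, via one-way ANOVA mean squares.

    R = s2_between / (s2_between + s2_within), where the between-animal
    variance component is recovered as (MSB - MSW) / k.
    """
    k = len(records[0])                 # records per animal (parities)
    a = len(records)                    # number of animals
    grand = sum(sum(r) for r in records) / (a * k)
    # between-animal mean square
    msb = k * sum((sum(r) / k - grand) ** 2 for r in records) / (a - 1)
    # within-animal mean square
    msw = sum((x - sum(r) / k) ** 2 for r in records for x in r) / (a * (k - 1))
    s2_between = (msb - msw) / k
    return s2_between / (s2_between + msw)
```

On records that differ far more between animals than between parities of the same animal, R approaches 1; when parity-to-parity scatter dominates, R falls toward 0, which is the pattern behind the low R values reported for IKWB and IK2W.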

Keywords: heritability, nested design, parity, pooled data, repeatability

Procedia PDF Downloads 140
25588 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques

Authors: Tosin Ige

Abstract:

Ethics must be a condition of the world, like logic. (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of their data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques while at the same time highlighting and providing a new technique as a solution to an existing privacy-preserving data mining technique. This paper also bridges the wide gap between data mining and the Web Application Programming Interface (web API), an area where research is urgently needed to add a layer of security to data mining while at the same time introducing a more seamless and efficient way of mining data.

Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique

Procedia PDF Downloads 162
25587 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make network access widely available. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.
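The processing model that Hadoop popularized can be illustrated in miniature. This plain-Python word count mimics the map, shuffle, and reduce phases without any actual Hadoop dependency:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # map: emit (word, 1) pairs, as a Hadoop mapper would
    return [(w.lower(), 1) for w in doc.split()]

def reduce_phase(pairs):
    # shuffle + reduce: group pairs by key and sum the counts
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

docs = ["big data in the public sector", "public sector big data analytics"]
result = reduce_phase(chain.from_iterable(map_phase(d) for d in docs))
```

In a real Hadoop job, the mapper and reducer run on separate nodes and the framework handles the shuffle; the functional shape of the computation is the same.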

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 302
25586 Reliability and Validity for Measurement of Body Composition: A Field Method

Authors: Ahmad Hashim, Zarizi Ab Rahman

Abstract:

Field methods for measuring body composition rely on several popular instruments used to estimate the percentage of body fat. Among the instruments used are the Body Mass Index, Bio Impedance Analysis, and the Skinfold Test. None of these three instruments involves high costs or demands advanced technical skills; all are portable, save time, and are suitable for use in large populations. Since all three instruments can estimate the percentage of body fat, it is important to identify the most appropriate instrument with the highest reliability. Hence, this study was conducted to determine the reliability and convergent validity of the instruments. A total of 40 students, male and female, aged between 13 and 14 years, participated in this study. The study found that the test-retest Pearson correlation coefficient of reliability for the three instruments is very high, r = .99. The inter-class reliability is also high, with r = .99 for the Body Mass Index and Bio Impedance Analysis and r = .96 for the Skinfold Test. The intra-class reliability coefficient for the three instruments is likewise high: Body Mass Index r = .99, Bio Impedance Analysis r = .97, and Skinfold Test r = .90. However, the Standard Error of Measurement values for the three instruments indicate that the Body Mass Index is the most appropriate instrument, with a mean value of .000672, compared with the other instruments. The findings show that the Body Mass Index is the most accurate and reliable instrument for estimating body fat percentage in the population studied.

Keywords: reliability, validity, body mass index, bio impedance analysis and skinfold test

Procedia PDF Downloads 326
25585 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA. Accordingly, identifying the tissue of origin of these DNA fragments from the plasma can result in more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions are important epigenetic features of DNA that reflect the cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to a huge amount of cost and time. To overcome these limitations, the idea of predicting OCRs from WGS data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to the proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The percentage of overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As expected, OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and the human gene TSS regions obtained from refTSS; it showed concordance of around 52.04% with all genes and ~78% with the housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about the distribution of OCRs, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing data set for efficient liquid biopsy-related analysis.
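A heavily simplified, single-feature version of the depth-based pipeline might look as follows. The graph construction, linear-programming cut, and correlation clustering of the actual method are collapsed here into a DFT low-pass step and a sign-based two-way split, so this is an illustration of the idea, not the authors' algorithm:

```python
import numpy as np

def predict_ocr(depth, keep_frac=0.05):
    """Toy depth -> DFT -> two-cluster pipeline.

    depth     : 1-D array of local sequencing depth along a region
    keep_frac : fraction of low-frequency DFT coefficients to retain
    Returns a boolean mask; True marks candidate open chromatin positions
    (regions of depleted cfDNA coverage, since nucleosome-free DNA is
    under-protected in plasma).
    """
    x = (depth - depth.mean()) / depth.std()   # count normalization
    f = np.fft.rfft(x)
    cutoff = max(1, int(len(f) * keep_frac))
    f[cutoff:] = 0                             # low-pass: keep slow structure
    smooth = np.fft.irfft(f, n=len(x))
    # two clusters by sign of the smoothed signal
    return smooth < 0
```

On a synthetic depth track with periodic coverage dips, the mask recovers the dips; the real method replaces the sign split with a graph cut optimized by linear programming.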

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 143
25584 Constructability Driven Engineering in Oil and Gas Projects

Authors: Srikanth Nagarajan, P. Parthasarathy, Frits Lagers

Abstract:

Lower crude oil prices have increased the pressure on oil and gas projects. Being competitive has become very important and critical for success in any industry, and an increase in the size of a project multiplies the magnitude of the issue. Timely completion of projects within budget and schedule is essential for any project to succeed, and a simple idea can make a large impact on the total cost of the plant. In this demanding environment, the phases of engineering, right from licensing technology and FEED through the different phases of detail engineering, procurement, and construction, have been so compressed that they overlap with each other. Hence, constructability techniques have become very important. In this paper, the focus is on how these techniques can be implemented to reduce cost, with the help of a case study. Constructability is a process driven by the need to influence a project’s construction phase, resulting in improved project delivery, costs, and schedule. In the construction phase of one of our fast-track mega projects, it was noticed that there was an opportunity to reduce a significant amount of cost and schedule by implementing constructability study processes. In this case study, the actual methodology adopted during engineering and construction, and how it could be done better by implementing constructability techniques with collaborative engineering efforts, will be explained.

Keywords: being competitive, collaborative engineering, constructability, cost reduction

Procedia PDF Downloads 411
25583 The Impact of Resettlement Challenges in Seeking Employment on the Mental Health and Well-Being of African Refugee Youth in South Australia

Authors: Elvis Munyoka

Abstract:

While the number of African refugees settling in Australia has significantly increased since the mid-1990s, the marginalisation and exclusion of young people from refugee backgrounds in employment remain a critical challenge. Unemployment or underemployment can negatively impact refugees in multiple areas, such as income, housing, life satisfaction, and social status. Higher rates of unemployment among refugees are linked in part to the intersection of pre-migration and daily challenges like trauma, racism, gender identity, and English language competency, all of which generate multiple employability disadvantages. However, the intersection of gender, race, social class, and age in shaping African refugee youth’s access to employment has received less attention. Using a qualitative case study approach, the paper will explore how gender, race, social class, and age influence African refugee youth graduates’ access to employment in South Australia. Intersectionality theory and the capability approach to social justice are used to explore intersecting factors impacting African refugee youth’s access to employment in South Australia. Participants were 16 African refugee graduates aged 18-30 living in South Australia who took part in the study for one year. Based on the trends in the data, the results suggest that long-term unemployment and underemployment, coupled with ongoing racism and marginalisation, have the potential to make refugees more vulnerable to several mental disorders such as depression, hopelessness, and suicidal thoughts. The analysis also reveals that resettlement challenges may limit refugees’ ability to recover from pre-migration trauma. The impact of resettlement challenges on refugee mental health highlights the need for comprehensive policy interventions to address the barriers refugees face in finding employment in resettlement communities.
With African refugees constituting such an important part of Australian society, they should have equal access to meaningful employment, as decent work promotes good mental health, successful resettlement, hope, and self-sufficiency.

Keywords: African refugee youth, mental health, employment, resettlement, racism

Procedia PDF Downloads 58
25582 Semantic Data Schema Recognition

Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia

Abstract:

The work covered in this paper aims at assisting users in their data quality approach. The goal is to better extract, mix, interpret, and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning each column to a category and possibly a sub-category; secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
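A minimal sketch of the first step, assigning a column to a semantic category, is shown below. The categories and regular expressions are hypothetical stand-ins for the paper's richer use of data plus metadata:

```python
import re

# Illustrative category patterns (a real profiler would combine dictionaries,
# value patterns, and metadata such as column names)
PATTERNS = {
    "email":   re.compile(r"^[^@\s]+@[^@\s]+\.[a-zA-Z]{2,}$"),
    "date":    re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "numeric": re.compile(r"^-?\d+(\.\d+)?$"),
}

def categorize_column(values, threshold=0.8):
    """Assign a column to the category matched by most of its values,
    falling back to 'unknown' when no category is dominant enough."""
    best, best_score = "unknown", 0.0
    for cat, pat in PATTERNS.items():
        score = sum(bool(pat.match(v)) for v in values) / len(values)
        if score > best_score:
            best, best_score = cat, score
    return best if best_score >= threshold else "unknown"
```

Once each column has a category, inter-column relations (e.g. a date column that should never precede another) can be checked against category-level rules, which is where the semantic anomaly detection described above comes in.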

Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns

Procedia PDF Downloads 413
25581 Cumulative Pressure Hotspot Assessment in the Red Sea and Arabian Gulf

Authors: Schröde C., Rodriguez D., Sánchez A., Abdul Malak, Churchill J., Boksmati T., Alharbi, Alsulmi H., Maghrabi S., Mowalad, Mutwalli R., Abualnaja Y.

Abstract:

Formulating a strategy for the sustainable development of the Kingdom of Saudi Arabia’s coastal and marine environment is at the core of the “Marine and Coastal Protection Assessment Study for the Kingdom of Saudi Arabia Coastline (MCEP)”, which was set up in the context of Vision 2030 by the Saudi Arabian government and aimed at providing a first comprehensive ‘Status Quo Assessment’ of the Kingdom’s marine environment to inform a sustainable development strategy and serve as a baseline for future monitoring activities. This baseline assessment relied on scientific evidence of the drivers, pressures, and their impact on the environments of the Red Sea and Arabian Gulf. A key element of the assessment was the cumulative pressure hotspot analysis developed for both national waters of the Kingdom, following the principles of the Driver-Pressure-State-Impact-Response (DPSIR) framework and using the cumulative pressure and impact assessment (CPIA) methodology. The ultimate goals of the analysis were to map and assess the main hotspots of environmental pressures and to identify priority areas for further field surveillance and urgent management actions. The study identified maritime transport, fisheries, aquaculture, oil, gas, energy, coastal industry, coastal and maritime tourism, and urban development as the main drivers of pollution in Saudi Arabian marine waters. For each of these drivers, pressure indicators were defined to spatially assess the potential influence of the drivers on the coastal and marine environment. The assessment identified 90 hotspot locations; after spatial grouping, these were reduced to 10 hotspot areas, two in the Arabian Gulf and eight in the Red Sea. The hotspot mapping revealed clear spatial patterns of drivers, pressures, and hotspots within the marine environment of waters under KSA’s maritime jurisdiction in the Red Sea and Arabian Gulf.
The cascading assessment approach based on the DPSIR framework ensured that the root causes of the hotspot patterns, i.e., the human activities and other drivers, could be identified. The adapted CPIA methodology allowed the available data to be combined to spatially assess cumulative pressure in a consistent manner and to identify the most critical hotspots by determining the overlap of cumulative pressure with areas of sensitive biodiversity. Further improvements are expected from enhancing the data sources for drivers and pressure indicators, fine-tuning the decay factors and distances of the pressure indicators, and including trans-boundary pressures across the regional seas.
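The core of a cumulative pressure and impact score can be sketched as a weighted overlay of pressure layers intersected with sensitive-habitat cells. The layers, weights, and threshold below are invented for illustration and omit the method's decay factors and distances:

```python
import numpy as np

# Hypothetical pressure layers on a tiny 2x2 grid (intensities in [0, 1]),
# standing in for the study's driver-specific pressure indicators
shipping = np.array([[0.9, 0.1], [0.4, 0.0]])
fishing  = np.array([[0.2, 0.8], [0.1, 0.0]])
coastal  = np.array([[0.7, 0.0], [0.3, 0.1]])
weights  = {"shipping": 1.0, "fishing": 0.8, "coastal": 1.2}

# 1 = sensitive biodiversity present in the cell
sensitive = np.array([[1, 0], [1, 0]])

# cumulative pressure = weighted sum of the pressure layers per cell
cumulative = (weights["shipping"] * shipping
              + weights["fishing"] * fishing
              + weights["coastal"] * coastal)

# hotspot: high cumulative pressure coinciding with sensitive habitat
hotspots = (cumulative > 1.0) & (sensitive == 1)
```

The same overlay logic, run over thousands of grid cells with decay-weighted pressure footprints, yields the kind of hotspot map that was then spatially grouped into the ten hotspot areas described above.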

Keywords: Arabian Gulf, DPSIR, hotspot, Red Sea

Procedia PDF Downloads 133
25580 Filtering Intrusion Detection Alarms Using Ant Clustering Approach

Authors: Ghodhbani Salah, Jemili Farah

Abstract:

With the growth of cyber attacks, information safety has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered the last line of defense in securing a network and play a very important role in detecting large numbers of attacks. However, the main problem with today’s most popular commercial IDSs is that they generate a high volume of alerts and a huge number of false positives. This drawback has become the main motivation for many research papers in the IDS area. Hence, in this paper we present a data mining technique to assist network administrators in analyzing and reducing the false positive alarms produced by an IDS while increasing detection accuracy. Our technique is an unsupervised clustering method based on a hybrid ant algorithm. The algorithm discovers clusters of intruders’ behavior without prior knowledge of the number of classes; we then apply the K-means algorithm to improve the convergence of the ant clustering. Experimental results on a real dataset show that our proposed approach is efficient, with a high detection rate and a low false alarm rate.
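The refinement stage can be sketched as follows: centroids discovered by an ant-based pass are handed to K-means, which tightens them. The ant stage itself is not reproduced; rough seed points stand in for its output, and the alarm feature vectors are synthetic:

```python
# Sketch of the K-means refinement described above. Seed centroids stand in
# for what an ant-clustering pass might output; data are synthetic alarms.
import numpy as np

def kmeans_refine(X, seed_centroids, iters=20):
    C = np.asarray(seed_centroids, dtype=float)
    for _ in range(iters):
        # assign each alarm vector to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its members
        for k in range(len(C)):
            if np.any(labels == k):
                C[k] = X[labels == k].mean(axis=0)
    return C, labels

rng = np.random.default_rng(42)
# two synthetic behaviour clusters (e.g. scan-like alarms vs. normal traffic)
X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)),
               rng.normal([4, 4], 0.3, (50, 2))])
# rough centroids, as an ant pass might produce them
C, labels = kmeans_refine(X, seed_centroids=[[1.0, 1.0], [3.0, 3.0]])
```

The point of the hybrid is that the ant stage supplies the number of clusters and rough positions, so K-means starts near a good solution instead of from random seeds.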

Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms

Procedia PDF Downloads 400
25579 A Study on Ideals and Prime Ideals of Sub-Distributive Semirings and Its Applications to Symmetric Fuzzy Numbers

Authors: Rosy Joseph

Abstract:

From an algebraic point of view, semirings provide the most natural generalization of group theory and ring theory. In the absence of an additive inverse in a semiring, one has to impose a weaker condition, the additive cancellative law, to study interesting structural properties. In many practical situations, fuzzy numbers are used to model imprecise observations derived from uncertain measurements or linguistic assessments. In this connection, a special class of fuzzy numbers whose shape is symmetric with respect to a vertical line, called symmetric fuzzy numbers, is suitable: for α ∈ (0, 1] the α-cuts have a constant mid-point, the upper end of the interval is a non-increasing function of α, and the lower end is the mirror image of this function. Based on this description, arithmetic operations and a ranking technique for ordering symmetric fuzzy numbers were dealt with in detail. It was observed that the class of symmetric fuzzy numbers forms a commutative semigroup with the cancellative property under addition. It also forms a multiplicative monoid satisfying the sub-distributive property. In this paper, we introduce the algebraic structure of sub-distributive semirings and discuss their various properties, viz., ideals and prime ideals of sub-distributive semirings, the sub-distributive ring of differences, etc., in detail. Symmetric fuzzy numbers are visualized as an illustration.
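In their simplest (triangular) form, the symmetric fuzzy numbers described above reduce to a constant mid-point m and a half-width s, with α-cut [m − s(1 − α), m + s(1 − α)]. The sketch below is a hedged illustration of that case; the product rule and the ranking key (mid-point first, then spread) are illustrative assumptions, not the paper's definitions:

```python
# Symmetric triangular fuzzy number: constant mid-point m, half-width s at
# alpha = 0, alpha-cut [m - s*(1-a), m + s*(1-a)]. Addition is component-wise,
# giving the commutative cancellative semigroup mentioned in the abstract.
from dataclasses import dataclass

@dataclass(frozen=True)
class SymTriFuzzy:
    m: float  # mid-point, constant over all alpha-cuts
    s: float  # half-width at alpha = 0

    def alpha_cut(self, a):
        w = self.s * (1.0 - a)   # non-increasing in alpha
        return (self.m - w, self.m + w)

    def __add__(self, other):
        # component-wise: commutative and cancellative
        return SymTriFuzzy(self.m + other.m, self.s + other.s)

    def __mul__(self, other):
        # approximate product that keeps the symmetric triangular shape
        return SymTriFuzzy(self.m * other.m,
                           abs(self.m) * other.s + abs(other.m) * self.s)

    def rank_key(self):
        return (self.m, self.s)

x = SymTriFuzzy(3.0, 1.0)
y = SymTriFuzzy(5.0, 0.5)
z = x + y   # mid-points and spreads add
```

Under this approximate product, distributivity holds only as a containment of spreads, which is the sub-distributive behaviour the paper studies.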

Keywords: semirings, subdistributive ring of difference, subdistributive semiring, symmetric fuzzy numbers

Procedia PDF Downloads 204
25578 Access Control System for Big Data Application

Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud

Abstract:

Access control systems (ACs) are among the most important components in safety-critical areas. Inaccuracies in regulatory frameworks make personalized policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software for controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the dissemination of Big Data, since it is crucial to secure the data provided to data consumers (DC). We present a general access control circulation strategy for the Big Data domain, describing the benefits of using designated access control for BD units and taking into consideration the performance needs of BD and AC systems. We then present a generic Big Data access control system to improve the dissemination of Big Data.

Keywords: access control, security, Big Data, domain

Procedia PDF Downloads 129
25577 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment

Authors: Michael Gidey Gebru

Abstract:

Most Data Envelopment Analysis models operate in a static environment with input and output parameters chosen as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity. Therefore, this work aims to extend crisp Data Envelopment Analysis to a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The Data Envelopment Analysis model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed model is illustrated with an application to real data from 50 educational institutions.
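The paper's fuzzy multi-objective model is not reproduced here; as a point of reference, this is a sketch of the underlying crisp CCR DEA model (input-oriented, multiplier form) that fuzzy extensions typically build on, evaluated at the mid-points of the triangular data. The data values are illustrative:

```python
# Crisp CCR DEA (input-oriented multiplier form) as a linear program:
# maximise u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all DMUs j.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """X: (n_dmu, n_in) inputs, Y: (n_dmu, n_out) outputs -> efficiencies."""
    n, m = X.shape
    _, s = Y.shape
    eff = []
    for o in range(n):
        # variables: [u (output weights), v (input weights)]
        c = np.concatenate([-Y[o], np.zeros(m)])        # minimise -u.y_o
        A_eq = [np.concatenate([np.zeros(s), X[o]])]    # v.x_o = 1
        A_ub = np.hstack([Y, -X])                       # u.y_j - v.x_j <= 0
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        eff.append(-res.fun)
    return np.array(eff)

# mid-points of triangular inputs/outputs for 3 toy institutions
X = np.array([[2.0], [4.0], [8.0]])   # e.g. budget
Y = np.array([[4.0], [6.0], [8.0]])   # e.g. graduates
eff = ccr_efficiency(X, Y)
```

A fuzzy variant would solve this at several α-cuts of the triangular data (lower, mid, upper), yielding an efficiency interval per unit rather than a single score.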

Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output

Procedia PDF Downloads 48
25576 Algorithms Inspired from Human Behavior Applied to Optimization of a Complex Process

Authors: S. Curteanu, F. Leon, M. Gavrilescu, S. A. Floria

Abstract:

Optimization algorithms inspired by human behavior were applied in this approach, associated with neural network models. The algorithms belong to two classes: human behaviors of learning and cooperation, and human competitive behavior. For the first class, the main strategies include random learning, individual learning, and social learning, and the selected algorithms are simplified human learning optimization (SHLO), social learning optimization (SLO), and teaching-learning based optimization (TLBO). For the second class, the concept of learning is associated with competitiveness, and the selected algorithms are sports-inspired algorithms (the Football Game Algorithm, FGA, and Volleyball Premier League, VPL) and the Imperialist Competitive Algorithm (ICA). A real process, the synthesis of polyacrylamide-based multicomponent hydrogels, where some parameters are difficult to obtain experimentally, is considered as a case study. Reaction yield and swelling degree are predicted as a function of reaction conditions (acrylamide concentration, initiator concentration, crosslinking agent concentration, temperature, reaction time, and amount of inclusion polymer, which could be starch, poly(vinyl alcohol) or gelatin). The experimental data set contains 175 points. Artificial neural networks are obtained in optimal form with the biologically inspired algorithms, the optimization being performed at two levels: structural and parametric. Feedforward neural networks with one or two hidden layers and no more than 25 neurons in the intermediate layers were obtained, with correlation coefficients in the validation phase over 0.90. The best results were obtained with the TLBO algorithm, the correlation coefficient being 0.94 for an MLP(6:9:20:2), a feedforward neural network with two hidden layers of 9 and 20 intermediate neurons, respectively. The good results obtained prove the efficiency of the optimization algorithms.
Beyond the good results, what is important in this approach is the simulation methodology, combining neural networks with biologically inspired optimization algorithms, which provides satisfactory results. In addition, the methodology developed here is general and flexible, so it can be easily adapted to other processes in association with different types of models.
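TLBO, the best performer above, has a simple two-phase structure that can be sketched directly. In the study the objective is the network's validation error over structure and weights; here a toy sphere function stands in for it, and all parameter choices are illustrative:

```python
# Minimal TLBO sketch: teacher phase moves the class toward the best learner,
# learner phase lets individuals learn pairwise from random classmates.
import numpy as np

def tlbo(f, bounds, pop=20, iters=100, rng=np.random.default_rng(0)):
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    X = rng.uniform(lo, hi, (pop, len(lo)))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        # teacher phase
        teacher = X[fit.argmin()]
        TF = rng.integers(1, 3)               # teaching factor, 1 or 2
        Xn = np.clip(X + rng.random(X.shape) * (teacher - TF * X.mean(axis=0)), lo, hi)
        fn = np.apply_along_axis(f, 1, Xn)
        better = fn < fit
        X[better], fit[better] = Xn[better], fn[better]
        # learner phase
        for i in range(pop):
            j = rng.integers(pop)
            if j == i:
                continue
            step = (X[i] - X[j]) if fit[i] < fit[j] else (X[j] - X[i])
            xi = np.clip(X[i] + rng.random(len(lo)) * step, lo, hi)
            fi = f(xi)
            if fi < fit[i]:
                X[i], fit[i] = xi, fi
    return X[fit.argmin()], fit.min()

sphere = lambda x: float(np.sum(x ** 2))      # stand-in objective
best_x, best_f = tlbo(sphere, bounds=([-5, -5], [5, 5]))
```

A notable design point, consistent with TLBO's appeal in the study, is that it has no algorithm-specific tuning parameters beyond population size and iteration count.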

Keywords: artificial neural networks, human behaviors of learning and cooperation, human competitive behavior, optimization algorithms

Procedia PDF Downloads 104
25575 Faculty Attendance Management System (FAMS)

Authors: G. C. Almiranez, J. Mercado, L. U. Aumentado, J. M. Mahaguay, J. P. Cruz, M. L. Saballe

Abstract:

This research project focused on the development of an application that aids university administrators in establishing an efficient and effective system for managing faculty attendance and discouraging unnecessary absences. The Faculty Attendance Management System (FAMS) is a web-based and mobile application proven to be efficient and effective in handling and recording data and in generating the updated reports and analytics needed to manage faculty attendance. FAMS facilitates not only a convenient and faster way of gathering and recording data but also provides data analytics, an immediate feedback mechanism, and analysis. The software database architecture uses MySQL for the web application and SQLite for the mobile application. The system includes modules that capture the daily attendance of faculty members, generate faculty attendance reports and analytics, notify faculty members, chairpersons, and deans about absences, and provide immediate communication concerning the absences incurred. Quantitative and qualitative evaluation showed that the system satisfactorily meets the stakeholders’ requirements; functionality, usability, reliability, performance, and security all turned out to be above average. System testing, integration testing, and user acceptance testing were conducted, and the results showed that the system performs very satisfactorily and functions as designed. Performance is also affected by the Internet infrastructure and connectivity of the university. The faculty analytics generated by the system may be used not only by deans and chairpersons in evaluating faculty performance but also by individual faculty members to increase awareness of their attendance in class. Hence, the system facilitates effective communication between stakeholders through the FAMS feedback mechanism and up-to-date posting of information.
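The mobile-side store and the absence-analytics query described above can be illustrated with the stdlib sqlite3 module. Table and column names here are assumptions for illustration, not the actual FAMS schema:

```python
# Illustrative sketch of an attendance store and an absence-analytics query,
# of the kind FAMS would run to drive its notification module.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE faculty (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    dept TEXT NOT NULL
);
CREATE TABLE attendance (
    faculty_id INTEGER REFERENCES faculty(id),
    day        TEXT NOT NULL,          -- ISO date
    status     TEXT CHECK (status IN ('present', 'absent', 'late'))
);
""")
con.executemany("INSERT INTO faculty VALUES (?, ?, ?)",
                [(1, "A. Cruz", "Math"), (2, "B. Reyes", "Physics")])
con.executemany("INSERT INTO attendance VALUES (?, ?, ?)",
                [(1, "2024-06-03", "present"), (1, "2024-06-04", "absent"),
                 (2, "2024-06-03", "present"), (2, "2024-06-04", "present")])

# absence count per faculty member, feeding the notification system
rows = con.execute("""
    SELECT f.name, COUNT(*) AS absences
    FROM attendance a JOIN faculty f ON f.id = a.faculty_id
    WHERE a.status = 'absent'
    GROUP BY f.id
""").fetchall()
```

On the web side the same query would run against MySQL with essentially identical SQL, which is one practical benefit of the dual MySQL/SQLite architecture.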

Keywords: faculty attendance management system, MySQL, SQLite, FAMS, analytics

Procedia PDF Downloads 432
25574 A QoE-driven Cross-layer Resource Allocation Scheme for High Traffic Service over Open Wireless Network Downlink

Authors: Liya Shan, Qing Liao, Qinyue Hu, Shantao Jiang, Tao Wang

Abstract:

In this paper, a Quality of Experience (QoE)-driven cross-layer resource allocation scheme for high traffic services over an Open Wireless Network (OWN) downlink is proposed, which covers all users in a cell, including those in the overlap region of different cells. A method for calculating the Mean Opinion Score (MOS) value of high traffic services is introduced, adopting assessment models for the best-effort service and a no-reference assessment algorithm for the video service. The cross-layer architecture jointly considers parameters in the application layer, the media access control layer, and the physical layer. Based on this architecture and the MOS value, the Binary Constrained Particle Swarm Optimization (B_CPSO) algorithm is used to solve the cross-layer resource allocation problem. Simulation results show that the proposed scheme significantly outperforms other schemes in maximizing the average MOS value of users across the whole system while maintaining fairness among users.
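The full cross-layer MOS model is not reproduced here; the sketch below only illustrates the binary constrained PSO mechanic on a toy allocation problem: choose resource blocks (bits) to maximize a MOS-like utility under a budget, with infeasible particles penalized. Values, costs, and parameters are illustrative assumptions:

```python
# Binary PSO with a sigmoid transfer function and a hard-constraint penalty,
# as a stand-in for the B_CPSO resource allocation step. Toy data only.
import numpy as np

def bcpso(values, costs, budget, pop=30, iters=200, rng=np.random.default_rng(1)):
    n = len(values)
    X = rng.integers(0, 2, (pop, n)).astype(float)
    V = np.zeros((pop, n))
    def fitness(x):                       # constrained: infeasible -> 0
        return float(values @ x) if costs @ x <= budget else 0.0
    pbest, pfit = X.copy(), np.array([fitness(x) for x in X])
    g = pbest[pfit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((pop, n)), rng.random((pop, n))
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (g - X)
        # sigmoid transfer turns velocities into bit probabilities
        X = (rng.random((pop, n)) < 1 / (1 + np.exp(-V))).astype(float)
        fit = np.array([fitness(x) for x in X])
        improved = fit > pfit
        pbest[improved], pfit[improved] = X[improved], fit[improved]
        g = pbest[pfit.argmax()].copy()
    return g, pfit.max()

values = np.array([10, 7, 5, 3, 1, 6, 8, 4], float)   # per-block MOS gain
costs  = np.array([ 4, 3, 2, 1, 1, 3, 4, 2], float)   # per-block resource cost
best, best_fit = bcpso(values, costs, budget=10.0)
```

In the paper the fitness would instead be the system-wide average MOS computed through the cross-layer model, with fairness handled in the constraint set rather than by a simple budget.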

Keywords: high traffic service, cross-layer resource allocation, QoE, B_CPSO, OWN

Procedia PDF Downloads 536
25573 Measuring Digital Literacy in the Chilean Workforce

Authors: Carolina Busco, Daniela Osses

Abstract:

The development of digital literacy has become a fundamental element that allows for citizen inclusion, access to quality jobs, and a labor market capable of responding to the digital economy. No methodological instruments are available in Chile to measure the workforce’s digital literacy and improve national policies on this matter. Thus, the objective of this research is to develop a survey to measure digital literacy in a sample of 200 Chilean workers. The dimensions considered in the instrument are sociodemographics, access to infrastructure, digital education, digital skills, and the ability to use e-government services. To achieve the research objective of developing a digital literacy model of indicators and a research instrument for this purpose, along with an exploratory analysis of the data using factor analysis, we used an empirical, quantitative-qualitative, exploratory, non-probabilistic, and cross-sectional research design. The research instrument is a survey created to measure the variables that make up the conceptual map prepared from the bibliographic review. Before applying the survey, a pilot test was implemented, resulting in several adjustments to the phrasing of some items. A validation test was also applied with six experts, whose observations were incorporated into the final instrument. The survey contained 49 items divided into three sets of questions: i) sociodemographic data; ii) a Likert scale of four values ranked according to the level of agreement; and iii) multiple choice questions complementing the dimensions. Data collection occurred between January and March 2022. For the factor analysis, we used the answers to the 12 Likert-scale items. The KMO statistic showed a value of 0.626, indicating a medium level of correlation, whereas Bartlett’s test yielded a significance value of less than 0.05, and Cronbach’s alpha was 0.618.
Taking all factor selection criteria into account, we decided to include and analyze four factors that together explain 53.48% of the accumulated variance. We identified the following factors: i) access to infrastructure and opportunities to develop digital skills at the workplace or educational establishment (15.57%), ii) ability to solve everyday problems using digital tools (14.89%), iii) online tools used to stay connected with others (11.94%), and iv) residential Internet access and speed (11%). Quantitative results were discussed within six focus groups selected using heterogeneous criteria related to the most relevant variables identified in the statistical analysis: upper-class school students, middle-class university students, Ph.D. professors, low-income working women, elderly individuals, and a group of rural workers. The digital divide and its social and economic correlations are evident in the results of this research. In Chile, the items that explain the acquisition of digital tools focus on access to infrastructure, which ultimately puts the first filter on the development of digital skills. Therefore, as expressed in the literature review, the advance of these skills differs radically when sociodemographic variables are considered. This increases socioeconomic distances and exclusion criteria, putting those who do not have these skills at a disadvantage and forcing them to seek the assistance of others.
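Cronbach's alpha, one of the reliability statistics quoted above, has a short closed form that can be computed directly from the item matrix. The matrices below are synthetic stand-ins for the 12 Likert items, not the survey data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# perfectly consistent items -> alpha of exactly 1
perfect = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]])
a1 = cronbach_alpha(perfect)

# noisy but correlated items -> alpha strictly between 0 and 1
rng = np.random.default_rng(7)
trait = rng.normal(size=(200, 1))               # shared latent trait
noisy = trait + 0.8 * rng.normal(size=(200, 12))
a2 = cronbach_alpha(noisy)
```

An alpha of 0.618, as reported, is below the conventional 0.7 threshold, which is consistent with the exploratory framing of the instrument.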

Keywords: digital literacy, digital society, workforce digitalization, digital skills

Procedia PDF Downloads 66
25572 Fast Bayesian Inference of Multivariate Block-Nearest Neighbor Gaussian Process (NNGP) Models for Large Data

Authors: Carlos Gonzales, Zaida Quiroz, Marcos Prates

Abstract:

Several spatial variables collected at the same locations that share a common spatial distribution can be modeled simultaneously through a multivariate geostatistical model that takes into account both the correlation between these variables and the spatial autocorrelation. The main goal of this model is to perform spatial prediction of these variables in the region of study. Here we focus on a multivariate geostatistical formulation that relies on shared spatial random effect terms. In particular, the first response variable is modeled by a mean that incorporates a shared random spatial effect, while the other response variables depend on this shared spatial term in addition to specific random spatial effects. Each spatial random effect is defined through a Gaussian process with a valid covariance function, but in order to improve computational efficiency when the data are large, each Gaussian process is approximated by a Gaussian Markov random field (GMRF), specifically the block nearest neighbor Gaussian process (Block-NNGP). This approach involves dividing the spatial domain into several dependent blocks under certain constraints, where the cross blocks capture the spatial dependence on a large scale, while each individual block captures the spatial dependence on a smaller scale. The multivariate geostatistical model belongs to the class of latent Gaussian models; thus, to achieve fast Bayesian inference, the integrated nested Laplace approximation (INLA) method is used. The good performance of the proposed model is shown through simulations and applications to massive data.
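A notational sketch of the shared-effect formulation described in words above may help; the symbols (loadings λ_r, shared effect w, specific effects v_r) are illustrative, not the paper's own notation:

```latex
% first response carries the shared spatial effect w(s)
y_1(\mathbf{s}) = \mathbf{x}(\mathbf{s})^\top \boldsymbol{\beta}_1
  + w(\mathbf{s}) + \varepsilon_1(\mathbf{s}),
% remaining responses load on w(s) plus their own spatial effect v_r(s)
y_r(\mathbf{s}) = \mathbf{x}(\mathbf{s})^\top \boldsymbol{\beta}_r
  + \lambda_r\, w(\mathbf{s}) + v_r(\mathbf{s}) + \varepsilon_r(\mathbf{s}),
  \quad r = 2, \dots, q,
% each GP is replaced by its Block-NNGP (GMRF) approximation for scalability
w(\cdot) \sim \mathrm{GP}(0, C_w) \approx \text{Block-NNGP}, \qquad
v_r(\cdot) \sim \mathrm{GP}(0, C_r) \approx \text{Block-NNGP}.
```

Because the latent field (w, v_r) is jointly Gaussian given the hyperparameters, the model fits the latent Gaussian class that INLA requires.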

Keywords: Block-NNGP, geostatistics, Gaussian process, GMRF, INLA, multivariate models

Procedia PDF Downloads 91
25571 Quantitative Analysis of the Quality of Housing and Land Use in the Built-up area of Croatian Coastal City of Zadar

Authors: Silvija Šiljeg, Ante Šiljeg, Branko Cavrić

Abstract:

Housing is considered a basic human need and an important component of the quality of life (QoL) in urban areas worldwide. In contemporary housing studies, the concept of the quality of housing (QoH) is treated as a multi-dimensional and multi-disciplinary field. It emphasizes the connection between various aspects of the QoL, which can be measured by quantitative and qualitative indicators at different spatial levels (e.g., local, city, metropolitan, regional). The main goal of this paper is to examine the QoH and compare the results of the quantitative analysis with the clutter land use categories derived for selected local communities in the Croatian coastal city of Zadar. The qualitative housing analysis, based on four housing indicators (out of a total of 24 QoL indicators), identified the three local communities of Zadar with the highest estimated QoH ranking. Furthermore, by using GIS overlay techniques, the QoH was merged with the urban environment analysis, introducing spatial metrics based on three categories: the element, the class, and the environment as a whole. In terms of semantic-content analysis, the research also generated a set of indexes suitable for evaluating the ‘housing state of affairs’ and for future decision making aimed at improving the QoH in the selected local communities.

Keywords: housing, quality, indicators, indexes, urban environment, GIS, element, class

Procedia PDF Downloads 404
25570 Accelerating Malaysian Technology Startups: Case Study of Malaysian Technology Development Corporation as the Innovator

Authors: Norhalim Yunus, Mohamad Husaini Dahalan, Nor Halina Ghazali

Abstract:

Building technology start-ups from ground zero into world-class companies in form and substance presents a rare opportunity for government-affiliated institutions in Malaysia. The challenge of building such start-ups becomes tougher when their core businesses involve the commercialization of unproven technologies for the mass market. These simple truths, while difficult to execute, will go a long way in getting a business off the ground and flying high. Malaysian Technology Development Corporation (MTDC), a company founded to facilitate the commercial exploitation of R&D findings from research institutions and universities and eventually help translate these findings into applications in the marketplace, is an excellent case in point. The purpose of this paper is to examine MTDC as an institution as it explores the concept of ‘it takes a village to raise a child’ in an effort to create and nurture start-ups into established world-class Malaysian technology companies. With MTDC at the centre of Malaysia's innovative start-ups, the analysis seeks to answer two specific questions: how has the concept been applied in MTDC, and what can we learn from this successful case? A key aim is to elucidate how MTDC's journey as a private limited company can help leverage reforms and achieve transformation, a process that might be suitable for other small, open, third-world or developing countries. This paper employs a single case study, designed to acquire an in-depth understanding of how MTDC has developed and grown technology start-ups into world-class technology companies. The case study methodology is employed as the focus is on a contemporary phenomenon within a real business context; it also explains the causal links in real-life situations that a single survey or experiment is unable to unearth.
The findings show that MTDC applies the concept of ‘it takes a village to raise a child’ in totality, as MTDC itself assumes the role of the innovator to ‘raise’ start-up companies to world-class stature. As the innovator, MTDC creates shared value and leadership, introduces innovative programmes ahead of the curve, mobilises talent for optimum results, and aggregates knowledge for personnel advancement. The success of the company's efforts is attributed largely to leadership, vision, adaptability, commitment to innovation, partnership and networking, and entrepreneurial drive. The findings of this paper are, however, limited by the single case study of MTDC. Future research is required to study more cases of success or failure where the concept of ‘it takes a village to raise a child’ has been explored and applied.

Keywords: start-ups, technology transfer, commercialization, technology incubator

Procedia PDF Downloads 140
25569 The Study of Hydro Physical Complex Characteristic of Clay Soil-Ground of Colchis Lowland

Authors: Paata Sitchinava

Abstract:

The water-physical properties (hydrophysical characteristics, mineralogical composition, and specific hydrophysical behavior) of the heavy clay soils of the Colchis lowland have been studied according to the various categories and forms of pore water; these will form the basis of the methods used in engineering practice and in evaluating reclamation effectiveness. Based on the clay-ground data, three research base sections were chosen in the central part of the lowland, where investigation works were implemented under a special program. It has been established that the three cuts are largely identical, and that the layers separated on morphological grounds differ in quality. Suitable laboratory experimental research was carried out on the samples taken from the cuts, and on this basis a classification of their physical-technical characteristics was created, which underlies the corresponding hydrophysical calculations.

Keywords: Colchis lowland, drainage, water, soil-ground

Procedia PDF Downloads 176
25568 Quantum Sieving for Hydrogen Isotope Separation

Authors: Hyunchul Oh

Abstract:

One of the challenges in modern separation science and technology is the separation of hydrogen isotope mixtures, since D2 and H2 have almost identical size, shape, and thermodynamic properties. Recently, quantum sieving of isotopes by confinement in narrow spaces has been proposed as an alternative technique. Despite many theoretical suggestions, however, it has been difficult to discover a feasible microporous material up to now. Among various porous materials, the novel class of microporous framework materials (COFs, ZIFs, and MOFs) is considered promising for isotope sieving due to its ultra-high porosity and uniform pore size, which can be tailored. Hence, we investigate experimentally the fundamental correlation between the D2/H2 molar ratio and pore size at optimized operating conditions using different ultramicroporous frameworks. The D2/H2 molar ratio depends strongly on pore size, pressure, and temperature. The experimentally determined optimum pore diameter for quantum sieving lies between 3.0 and 3.4 Å, which can serve as an important guideline for designing and developing feasible microporous frameworks for isotope separation. We then report a novel strategy for efficient hydrogen isotope separation at technologically relevant operating pressures through quantum sieving exploited by pore aperture engineering. The strategy involves installing flexible components in the pores of the framework to tune the pore surface.

Keywords: gas adsorption, hydrogen isotope, metal organic frameworks (MOFs), quantum sieving

Procedia PDF Downloads 260
25567 Decision Support System for Fetus Status Evaluation Using Cardiotocograms

Authors: Oyebade K. Oyedotun

Abstract:

The cardiotocogram is a technical recording of the heartbeat rate and uterine contractions of a fetus during pregnancy. During pregnancy, several complications can occur to both the mother and the fetus; hence it is crucial that medical experts have technical means to check the health of the mother and especially the fetus. It is very important that the fetus develops as expected through the stages of pregnancy; however, monitoring the health status of the fetus is not easily achieved, as the fetus is not wholly physically available to medical experts for inspection. Hence, doctors resort to tests that can give an indication of the status of the fetus. One such diagnostic test is to obtain cardiotocograms of the fetus. From the analysis of the cardiotocograms, medical experts can determine the status of the fetus and, therefore, the necessary medical interventions. Generally, medical experts classify examined cardiotocograms as ‘normal’, ‘suspect’, or ‘pathological’. This work presents an artificial neural network based decision support system that can classify cardiotocogram data, producing the corresponding statuses of the fetuses. The capability of artificial neural networks to explore cardiotocogram data and learn features that distinguish one class from the others has been exploited in this research. Feedforward and radial basis neural networks were trained on a publicly available database to classify the processed cardiotocogram data into one of the three classes: ‘normal’, ‘suspect’, or ‘pathological’. Classification accuracies of 87.8% and 89.2% were achieved during the test phase for the trained feedforward and radial basis neural networks, respectively.
It is the hope that, while the system described in this work may not be a complete replacement for a medical expert in fetus status evaluation, it can significantly reinforce confidence in the diagnoses reached by experts.
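The radial basis classification step can be sketched in a few lines: RBF features centred on class prototypes with a least-squares readout. The paper's actual network and the CTG database are not reproduced; synthetic three-class data stand in for the feature vectors:

```python
# Hedged sketch of a radial basis function classifier for three classes
# ('normal', 'suspect', 'pathological'), on synthetic stand-in features.
import numpy as np

def rbf_features(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

def fit_rbf(X, y, n_classes, width=1.0):
    # one prototype centre per class; readout weights by least squares
    centers = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
    Phi = rbf_features(X, centers, width)
    T = np.eye(n_classes)[y]                      # one-hot targets
    W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
    return centers, W

def predict(X, centers, W, width=1.0):
    return (rbf_features(X, centers, width) @ W).argmax(axis=1)

rng = np.random.default_rng(3)
# synthetic 'normal' / 'suspect' / 'pathological' clusters in feature space
means = np.array([[0.0, 0.0], [3.0, 3.0], [0.0, 4.0]])
X = np.vstack([rng.normal(m, 0.5, (60, 2)) for m in means])
y = np.repeat([0, 1, 2], 60)
centers, W = fit_rbf(X, y, n_classes=3)
acc = (predict(X, centers, W) == y).mean()
```

Real CTG features would need many more centres (e.g. chosen by clustering) and a held-out test split, which is where accuracies like the reported 89.2% come from.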

Keywords: decision support, cardiotocogram, classification, neural networks

Procedia PDF Downloads 325
25566 Knowledge, Hierarchy and Decision-Making: Analysis of Documentary Filmmaking Practices in India

Authors: Nivedita Ghosh

Abstract:

In his critique of Lefebvre’s view that ‘technological capacities’ are class-dependent, Francois Hetman argues that technology today is participatory, allowing the entry of individuals from different levels of social stratification. As a result, we are entering an era of technology operators or ‘clerks’ who become the new decision-makers because of the knowledge they possess of the use of technologies. In response to Hetman’s thesis, this paper argues that knowledge of technology, while indeed providing a momentary space for decision-making, does not necessarily restructure social hierarchies. Through case studies from the world of Indian documentary filmmaking, this paper puts forth the view that Hetman’s clerks, despite being technologically advanced, do not break into the filmmaking hierarchical order. This remains true even in situations where technical knowledge rests mostly with those on the lowest rungs of the filmmaking ladder. Instead, technological knowledge provides the space for other kinds of relationships to evolve, such as ‘trusting the technician’ or ‘admiration for the technician’s work’. Furthermore, what continues to define the documentary filmmaking hierarchy is the conceptualization capacity of the practitioners, which is influenced by a similarity in socio-cultural backgrounds and by film school training accessible primarily to the filmmakers rather than the technicians. Accordingly, the paper concludes with the argument that more than ‘technological capacities’, it is ‘conceptualization capacities’ that are class-dependent, especially when we study the field of documentary filmmaking.

Keywords: documentary filmmaking, India, technology, knowledge, hierarchy

Procedia PDF Downloads 256
25565 Pibid and Experimentation: A High School Case Study

Authors: Chahad P. Alexandre

Abstract:

PIBID (Institutional Program of Scholarships to Encourage Teaching) is a Brazilian government program that today counts 48,000 students. Its goal is to motivate students to stay in teaching undergraduate programs and to help fill the gap of 100,000 teachers needed today in basic education schools. The major lack of teachers today is in physics, chemistry, mathematics, and biology. At IFSP-Itapetininga, we formatted our physics PIBID around practical activities. Our students are divided between two São Paulo state high schools in the same city. The project proposes class activities based on experimentation, observation, and understanding of physical phenomena. The didactic experiments always relate to the content that the teacher, who is the supervisor of the program in the school, is working on. Before each experiment, a short questionnaire is given to probe the students' preconceptions, and another is filled in afterwards to evaluate whether new concepts have been formed; this procedure compares their previous knowledge with how it changed after the experiment. The primary goal of our project is to make physics classes more attractive, to develop in high school students an interest in learning physics, and to show the relation of physics to everyday life and the technological world. The objective of the experimental activities is to facilitate the understanding of the concepts worked on in class, because through experimentation the PIBID scholarship student stimulates the curiosity of the high school students, who can then develop the capacity to understand and identify physical phenomena through concrete examples. Knowing how to identify these phenomena and where they appear in everyday life makes the learning process more significant and pleasant.
This proposal makes it achievable for students to practice science and to appropriate concepts that remain complex in traditional classes, overcoming the common preconception that physics is something distant, present only in books. This preconception is extremely harmful to the process of scientific knowledge construction. Through this kind of learning, by experimentation, the students not only accumulate knowledge but also appropriate it, along with experimental procedures and even the space provided by the school. The PIBID scholarship students, as future teachers, also have the opportunity to try experimentation classes, to intervene in classes, and to have contact with their future career. This opportunity allows them to reflect meaningfully on the practices realized and, consequently, on the learning methods. Through this project, we found that high school students stay focused longer during the experiments than during traditional expository classes; as a participative activity, the students became more involved. We also found that the dropout percentage among physics undergraduate students is smaller in our Institute than before the PIBID program started.

Keywords: innovation, projects, PIBID, physics, pre-service teacher experiences

Procedia PDF Downloads 336