Search results for: European Court of Human Rights
1431 Comparison of Anti-Shadoo Antibodies – Where is the Endogenous Shadoo protein?
Authors: Eszter Tóth, Ervin Welker
Abstract:
The Shadoo protein (Sho) was described in 2003 as the newest member of the prion protein superfamily [1]. Sho shares structural motifs with the prion protein (PrP), which is known for its central role in transmissible spongiform encephalopathies. Although a great number of functions have been proposed, the exact physiological function of PrP is not yet known. Investigation of the function and localization of Sho may help us to understand the function of the prion protein superfamily. Analyzing the subcellular localization of YFP-tagged forms of Sho, we detected the protein in the plasma membrane and in the nucleus of various cell lines. To reveal the localization of the endogenous protein, we generated antibodies against Shadoo and also employed commercially available anti-Shadoo antibodies: i) EG62 anti-mouse Shadoo antibody generated by Eurogentec Ltd.; ii) S-12 anti-human Shadoo antibody by Santa Cruz Biotechnology Inc.; iii) R-12 anti-mouse Shadoo antibody by Santa Cruz Biotechnology Inc.; iv) SPRN antibody against human Shadoo by Abgent Inc. We carried out immunocytochemistry on non-transfected HeLa, Zpl 2-1, Zw 3-5, GT1-1, GT1-7 and SHSY5Y cells as well as on YFP-Sho, Sho-YFP, and YFP-GPI transfected HeLa cells. Their specificity (in an antibody-peptide competition assay) and co-localization (with the YFP signal) were assessed.
Keywords: Shadoo, prion protein, immunocytochemistry, antibody-peptide competition assay, antibody.
1430 On the Need to have an Additional Methodology for the Psychological Product Measurement and Evaluation
Authors: Corneliu Sofronie, Roxana Zubcov
Abstract:
Cognitive science appeared about 40 years ago, following the challenge of artificial intelligence, as a common territory for several scientific disciplines: IT, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newborn science was justified by the complexity of the problems related to human knowledge on the one hand, and on the other by the fact that none of the above-mentioned sciences could explain mental phenomena alone. Based on the data supplied by experimental sciences such as psychology or neurology, models of the operation of the human mind are built in cognitive science. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction utterly necessary for the interplay of the sciences mentioned. The general problematic of the cognitive approach comprises two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be the result of the interaction between all the composing (included) systems. In the field of psychology, measurements in the computational register use classical inquiries and psychometric tests, generally based on calculus methods. Viewing things from both sides representing cognitive science, we can notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality–quantity whole. In such an approach, measurement by calculus proves to be inefficient. Our research, carried out for more than 20 years, leads to the conclusion that measuring by forms properly fits the laws and principles of connectionism.
Keywords: complementary methodology, connection approach, networks without scaling, quantum psychology.
1429 Model for Knowledge Representation using Sample Problems and Designing a Program for Automatically Solving Algebraic Problems
Authors: Nhon Do, Hien Nguyen
Abstract:
Nowadays there are many methods for representing knowledge, such as semantic networks, neural networks, and conceptual graphs. Nonetheless, these methods are not sufficiently efficient when applied to represent and reason over knowledge domains that support general education, such as algebra, analysis, or plane geometry. This motivates the introduction of the computational network, a useful tool for representing knowledge bases, especially computational knowledge in domains related to general education. However, when dealing with a practical problem, we often do not immediately find a new solution; instead, we search for related problems that have been solved before and then propose an appropriate solution for the problem at hand. Besides that, when related problems are found, we have to determine whether their results can be used to solve the practical problem or not. In this paper, an extension model of the computational network is presented. In this model, sample problems, which are related solved problems, are used like human experience with practical problems to simulate the way humans think and to produce good solutions to practical problems faster and more effectively; a rough sketch of this retrieval idea is given below. This extension model is applied to construct an automatic system for solving algebraic problems in middle school.
Keywords: educational software, artificial intelligence, knowledge base system, knowledge representation.
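As a rough, hypothetical illustration of the sample-problem idea described above (not the paper's computational network model), the following Python sketch reuses a stored, already-solved problem when its givens are covered by the new problem's givens and it reaches the same goal; the flat set-of-facts representation and the toy geometry entries are invented for the example.

```python
# Rough sketch of the sample-problem idea: a stored, already-solved problem is reused
# when its givens are available in the new problem and its goal matches. The flat
# set-of-facts representation is a simplification invented for illustration.
SAMPLE_PROBLEMS = [
    {"given": {"a", "b", "angle_C"}, "goal": "area", "solution": "area = a*b*sin(C)/2"},
    {"given": {"a", "b", "c"}, "goal": "area", "solution": "Heron's formula"},
]

def find_sample_problem(given, goal):
    """Return the first stored problem whose givens are available and whose goal matches."""
    for sample in SAMPLE_PROBLEMS:
        if sample["goal"] == goal and sample["given"] <= set(given):
            return sample
    return None

match = find_sample_problem(given={"a", "b", "c", "angle_C"}, goal="area")
print(match["solution"] if match else "no related sample problem; search from scratch")
```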
1428 Systematic Identification and Quantification of Substrate Specificity Determinants in Human Protein Kinases
Authors: Manuel A. Alonso-Tarajano, Roberto Mosca, Patrick Aloy
Abstract:
Protein kinases participate in a myriad of cellular processes of major biomedical interest. The in vivo substrate specificity of these enzymes is determined by several factors and, despite several years of research on the topic, is still far from being fully understood. In the present work, we have quantified the contributions to kinase substrate specificity of i) the phosphorylation sites and their surrounding residues in the sequence and ii) the association of kinases with adaptor or scaffold proteins. We have used position-specific scoring matrices (PSSMs) to represent the stretches of sequence phosphorylated by 93 families of kinases. We have found negative correlations between the number of sequences from which a PSSM is generated and both the statistical significance and the performance of that PSSM. Using a subset of 22 statistically significant PSSMs, we have identified specificity determinant residues (SDRs) for 86% of the corresponding kinase families. Our results suggest that different SDRs can function as positive or negative elements of substrate recognition by the different families of kinases. Additionally, we have found that human proteins with known function as adaptors or scaffolds (kAS) tend to interact with a significantly large fraction of the substrates of the kinases to which they associate. Based on this characteristic, we have identified a set of 279 potential adaptors/scaffolds (pAS) for human kinases, which is enriched in Pfam domains and functional terms tightly related to the proposed function. Moreover, our results show that for 74.6% of the kinase–pAS associations found, the pAS colocalize with the substrates of the kinases they are associated with. Finally, we have found evidence suggesting that the association of kinases with adaptors and scaffolds may contribute significantly to diminishing the in vivo substrate cross-specificity of protein kinases. In general, our results indicate the relevance of several SDRs for both the positive and negative selection of phosphorylation sites by kinase families and also suggest that the association of kinases with pAS proteins may be an important factor for the localization of the enzymes with their sets of substrates.
Keywords: Kinase, phosphorylation, substrate specificity, adaptors, scaffolds, cellular colocalization.
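As a rough illustration of how a position-specific scoring matrix can score candidate phosphorylation sites (a generic sketch, not the authors' pipeline), the following Python snippet builds a log-odds PSSM from aligned phosphosite sequences and scores a query peptide; the toy sequences, the uniform background, and the pseudocount are assumptions made for the example.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def build_pssm(sites, pseudocount=1.0):
    """Build a log-odds PSSM from equally long phosphosite sequences."""
    length = len(sites[0])
    counts = np.full((len(AMINO_ACIDS), length), pseudocount)
    for site in sites:
        for pos, aa in enumerate(site):
            counts[AA_INDEX[aa], pos] += 1
    freqs = counts / counts.sum(axis=0, keepdims=True)
    background = 1.0 / len(AMINO_ACIDS)   # uniform background for simplicity
    return np.log2(freqs / background)

def score_peptide(pssm, peptide):
    """Sum the per-position log-odds scores of a candidate site."""
    return sum(pssm[AA_INDEX[aa], pos] for pos, aa in enumerate(peptide))

# Toy phosphosites centred on the phosphorylated residue (hypothetical data).
known_sites = ["RRASLG", "RKASIG", "RRPSLG", "KRASVG"]
pssm = build_pssm(known_sites)
print(score_peptide(pssm, "RRASLG"))   # high score: matches the motif
print(score_peptide(pssm, "DEWCYH"))   # low score: unrelated sequence
```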
1427 Anticancer Effect of Doxorubicin Loaded Heparin based Super-paramagnetic Iron oxide Nanoparticles against the Human Ovarian Cancer Cells
Authors: Amaneh Javid, Shahin Ahmadian, Ali A. Saboury, Saeed Rezaei-Zarchi
Abstract:
This study determines the effect of naked and heparin-based super-paramagnetic iron oxide nanoparticles (SPIO-NPs) on the human ovarian cancer cell line A2780. Doxorubicin (DOX) was used as the anticancer drug, entrapped in the SPIO-NPs. The study aimed to decorate nanoparticles with heparin, a molecular ligand for 'active' targeting of cancerous cells, and to apply the modified nanoparticles in cancer treatment. The nanoparticles containing the anticancer drug DOX were prepared by a solvent evaporation and emulsification cross-linking method. The physicochemical properties of the nanoparticles were characterized by various techniques, and uniform nanoparticles with an average particle size of 110±15 nm and high encapsulation efficiencies (EE) were obtained. Additionally, a sustained release of DOX from the SPIO-NPs was achieved. Cytotoxicity tests showed that SPIO-DOX-HP had higher cell toxicity than HP alone, and confocal microscopy analysis confirmed excellent cellular uptake efficiency. These results indicate that HP-based SPIO-NPs have potential as anticancer drug carriers and also have an enhanced anticancer effect.
Keywords: Heparin, A2780 cells, ovarian cancer, nanoparticles, doxorubicin.
1426 Planning of Road Infrastructure Financing: Computational Finance Viewpoint
Authors: Ornst J., Voracek J., Allouache A., Allouache D.
Abstract:
Lack of resources for road infrastructure financing is a problem that currently affects not only eastern European economies but also many other countries, especially in relation to the impact of the global financial crisis. In this context, we speak about the so-called short-investment problem resulting from a long-term lack of investment resources. Based on an analysis of road infrastructure financing in the Czech Republic, this article points out weaknesses of the current system and proposes a long-term planning methodology supported by a systems approach. Within this methodology, and using the system dynamics model created, the article predicts the development of the short-investment problem in the country and, in reaction to the downward trend of certain sources, presents various scenarios resulting from changes in the structure of financial sources. In the discussion, the article focuses more closely on the possibility of introducing a tax on vehicles instead of taxes with declining revenue streams and estimates its approximate level in relation to reaching various solutions of the short-investment problem in time.
Keywords: Road financing, road infrastructure development, system dynamics.
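To illustrate the kind of stock-and-flow reasoning a system dynamics model of the short-investment problem implies (a minimal sketch only; all figures, growth rates, and the two funding sources are invented and are not the paper's calibrated model), a toy simulation might look like this:

```python
# Minimal stock-and-flow sketch of a growing "short-investment" gap.
# All figures are invented for illustration, not the authors' calibrated model.
years = range(2012, 2022)
investment_need = 50.0     # annual need, billions CZK (hypothetical)
fuel_tax_revenue = 30.0    # declining source (hypothetical)
eu_funds = 15.0            # roughly constant source (hypothetical)
cumulative_gap = 0.0

for year in years:
    funding = fuel_tax_revenue + eu_funds
    cumulative_gap += max(investment_need - funding, 0.0)
    fuel_tax_revenue *= 0.97          # assumed 3% annual decline in revenue
    investment_need *= 1.02           # assumed 2% annual growth of need
    print(f"{year}: funding={funding:.1f}, cumulative gap={cumulative_gap:.1f}")
```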
1425 Using Mixtures of Waste Frying Oil and Pork Lard to Produce Biodiesel
Authors: Joana M. Dias, Conceição A. Ferraz, Manuel F. Almeida
Abstract:
Studying alternative raw materials for biodiesel production is of major importance. The use of mixtures incorporating wastes is an environmentally friendly alternative and might reduce biodiesel production costs. The objectives of the present work were: (i) to study biodiesel production using waste frying oil mixed with pork lard and (ii) to understand how mixture composition influences biodiesel quality. Biodiesel was produced by transesterification, and quality was evaluated through the determination of several parameters according to EN 14214. The weight fraction of lard in the mixture varied from 0 to 1 in 0.2 intervals. Biodiesel production yields varied from 81.7 to 88.0 wt%, the lowest yields being the ones obtained using waste frying oil and lard alone as raw materials. The obtained products fulfilled most of the determined quality specifications according to the European biodiesel quality standard EN 14214. Minimum purity (96.5 wt%) was closely reached when waste frying oil was used alone and when a lard weight fraction of 0.2 was incorporated in the raw material (96.3 wt%); however, purity ranged from 93.9 to 96.3 wt%, always being close to the limit. From the evaluation of the influence of mixture composition on biodiesel quality, it was possible to establish a model for predicting some parameters of biodiesel resulting from mixtures of waste frying oil with lard when different lard contents are used.
Keywords: Biodiesel, mixtures, transesterification, waste.
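The abstract does not state the form of the predictive model, so as a purely hypothetical sketch of how a mixture-composition predictor could be used, the snippet below linearly interpolates a quality parameter between the two pure raw materials; the function, its coefficients, and the end-member values are assumptions, not the paper's fitted model.

```python
# Hypothetical linear-blend predictor for a biodiesel quality parameter as a
# function of the lard weight fraction (illustrative only; not the paper's model).
def predict_parameter(lard_fraction, value_oil_only, value_lard_only):
    """Linearly interpolate a property between the two pure raw materials."""
    return (1.0 - lard_fraction) * value_oil_only + lard_fraction * value_lard_only

# Example: an ester-purity-like parameter (wt%) with illustrative end-member values.
for w in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0):
    print(w, round(predict_parameter(w, 96.3, 93.9), 2))
```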
1424 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating
Authors: A. Amrani, O. Allali, A. Ben Hamida, F. Defrance, S. Morland, E. Pineau, T. Lacroix
Abstract:
The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We built our tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and, finally, the evaluation of scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory's energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.
Keywords: Climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city.
1423 Wasting Human and Computer Resources
Authors: Mária Csernoch, Piroska Biró
Abstract:
The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited and formatted documents. Consequently, end-users do not know whether their methods and results are correct or not. They are not aware of their ignorance; indeed, their ignorance prevents them from realizing their lack of knowledge. (2) The end-users' problem-solving methods. We have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which are proven less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical documents, text-based documents. We have provided a definition of correctly edited text and, based on this definition, adapted the debugging method known from programming. According to the method, before real text editing takes place, a thorough debugging of already existing texts and a categorization of errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. The advantages of the method are that real text handling requires much less human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.
Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.
1422 Mapping Crime against Women in India: Spatio-Temporal Analysis, 2001-2012
Authors: Ritvik Chauhan, Vijay Kumar Baraik
Abstract:
Women are most vulnerable to crime despite occupying a central position in shaping society as the first teachers of children. In India too, despite equal rights and constitutional safeguards, the incidences of crime against them are large and grave. In this context, crime against women, especially rape, has been increasing over time. This paper explores the spatial and temporal aspects of crime against women in India with special reference to rape. It also examines crime against women together with its spatial, socio-economic, and demographic associates, using related data obtained from the National Crime Records Bureau, the Indian Census, and other sources of the Government of India. Simple statistics, choropleth mapping, and other cartographic representation methods have been used to examine crime rates, the spatio-temporal patterns of crime, and the association of crime with its correlates. The major findings are visible spatial variations across the country and rising trends in terms of both incidence and rates over the reference period. The study also indicates that some geographical associations are observed. However, the selected indicators of socio-economic factors seem to have no significant bearing on crime against women at this level.
Keywords: Crime against women, crime mapping, trend analysis.
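The choropleth maps described above rest on simple rate statistics; as a minimal sketch of the kind of computation involved (the state names and figures below are placeholders, not NCRB data), a crime rate per 100,000 women can be derived as follows:

```python
# Crime rate per 100,000 women: the simple statistic behind the choropleth maps.
# State names and figures are placeholders, not NCRB or Census data.
incidents = {"State A": 3200, "State B": 870}
female_population = {"State A": 34_000_000, "State B": 9_500_000}

for state, count in incidents.items():
    rate = count / female_population[state] * 100_000
    print(f"{state}: {rate:.2f} incidents per 100,000 women")
```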
1421 Protection of Cultural Heritage against the Effects of Climate Change Using Autonomous Aerial Systems Combined with Automated Decision Support
Authors: Artur Krukowski, Emmanouela Vogiatzaki
Abstract:
The article presents ongoing work in research projects such as SCAN4RECO and ARCH, both funded by the European Commission under the Horizon 2020 program. The former concerns multimodal and multispectral scanning of Cultural Heritage assets for their digitization and conservation via spatiotemporal reconstruction and 3D printing, while the latter aims to better preserve areas of cultural heritage from hazards and risks. It co-creates tools that help pilot cities save cultural heritage from the effects of climate change. It develops a disaster risk management framework for assessing and improving the resilience of historic areas to climate change and natural hazards. Tools and methodologies are designed for local authorities and practitioners, the urban population, as well as national and international expert communities, aiding authorities in knowledge-aware decision making. In this article, we focus on 3D modelling of object geometry using primarily photogrammetric methods to achieve very high model accuracy with consumer-grade devices, attractive to professionals and hobbyists alike.
Keywords: 3D modeling, UAS, cultural heritage, preservation.
1420 Automatic Distance Compensation for Robust Voice-based Human-Computer Interaction
Authors: Randy Gomez, Keisuke Nakamura, Kazuhiro Nakadai
Abstract:
A distant-talking voice-based HCI system suffers from performance degradation due to the mismatch between the acoustic speech (runtime) and the acoustic model (training). The mismatch is caused by the change in the power of the speech signal as observed at the microphones. This change is greatly influenced by the change in distance, affecting the speech dynamics inside the room before the signal reaches the microphones. Moreover, as the speech signal is reflected, its acoustical characteristics are also altered by the room properties. In general, power mismatch due to distance is a complex problem. This paper presents a novel approach to dealing with distance-induced mismatch by intelligently sensing instantaneous voice power variation and compensating the model parameters. First, the distant-talking speech signal is processed through microphone array processing, and the corresponding distance information is extracted. Distance-sensitive Gaussian Mixture Models (GMMs), pre-trained to capture both speech power and room properties, are used to predict the optimal distance of the speech source. Consequently, pre-computed statistical priors corresponding to the optimal distance are selected to correct the statistics of the generic model, which was frozen during training. Thus, model combinatorics are post-conditioned to match the power of the instantaneous speech acoustics at runtime. This results in an improved likelihood of predicting the correct speech command at farther distances. We experimented using real data recorded inside two rooms. Experimental evaluation shows that voice recognition performance using our method is more robust to the change in distance compared to the conventional approach. In our experiment, under the most acoustically challenging environment (i.e., Room 2 at 2.5 meters), our method achieved a 24.2% improvement in recognition performance over the best-performing conventional method.
Keywords: Human Machine Interaction, Human Computer Interaction, Voice Recognition, Acoustic Model Compensation, Acoustic Speech Enhancement.
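To make the distance-selection idea above concrete (a minimal sketch, not the authors' implementation), the snippet below trains one Gaussian Mixture Model per training distance on power-related features and, at runtime, picks the distance whose GMM best explains the observed features so that the matching pre-computed priors can be applied; the feature generation and the "priors" are placeholders.

```python
# Sketch of distance selection: one GMM per training distance scores the runtime
# features; the statistics tied to the best-scoring distance would then be applied.
# Feature extraction and priors are placeholders, not the authors' models.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
distances = [0.5, 1.5, 2.5]  # metres (hypothetical training conditions)

# Pretend training data: power-related features recorded at each distance.
gmms = {}
for i, d in enumerate(distances):
    feats = rng.normal(loc=-10.0 * i, scale=2.0, size=(200, 3))
    gmms[d] = GaussianMixture(n_components=2, random_state=0).fit(feats)

def select_distance(runtime_features):
    """Pick the training distance whose GMM best explains the runtime features."""
    return max(gmms, key=lambda d: gmms[d].score(runtime_features))

runtime = rng.normal(loc=-20.0, scale=2.0, size=(50, 3))   # resembles the 2.5 m case
best = select_distance(runtime)
print(f"selected distance: {best} m -> apply the pre-computed priors for that distance")
```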
1419 Validating Condition-Based Maintenance Algorithms Through Simulation
Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile
Abstract:
Industrial end users are currently facing an increasing need to reduce the risk of unexpected failures and optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle those two issues using both Machine Learning and First Principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate physics, systems and humans – including asset maintenance operations – in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives.
Keywords: Degradation models, ageing, anomaly detection, soft sensor, incremental learning.
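As a generic illustration of "statistically significant short-term deviation" detection (a sketch under the assumption of a simple residual z-score test, not Schneider Electric's models), measurements can be compared against model predictions and flagged when the residual exceeds a threshold:

```python
# Minimal sketch of short-term deviation detection: compare measurements against a
# model's expected values and flag statistically significant residuals.
# The 3-sigma threshold and the data are illustrative only.
import numpy as np

expected = np.array([20.0, 20.1, 20.2, 20.1, 20.3, 20.2])   # model prediction
measured = np.array([20.1, 20.0, 20.3, 20.2, 23.9, 20.1])   # sensor reading
residuals = measured - expected

sigma = residuals[:4].std(ddof=1)           # spread estimated from "normal" history
z_scores = residuals / sigma
anomalies = np.where(np.abs(z_scores) > 3.0)[0]
print("anomalous samples:", anomalies)      # index 4 stands out
```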
1418 Studying Implication of Globalization on Engineering Education
Authors: S. Sharafi, G. Bassak Harouni, S. Torfi, H. Makenalizadeh, A. Sayahi
Abstract:
The primary purpose of this article is to find the implications of globalization for education. Globalization plays an important role as a process in the economic, political, cultural and technological dimensions of contemporary human life. Education has its effects on this process: while influencing it by educating global citizens with universal human features and characteristics, it has also been influenced by this phenomenon. Nowadays, the role of education is not just to develop in students the knowledge and skills necessary for new kinds of jobs. If education wants to help students be prepared for the new global society, it has to make them engaged, productive and critical citizens for the global era, so that they can reflect on their roles as key actors in a dynamic, often uneven, matrix of economic and cultural exchanges. If education wants to reinforce and raise the national identity and the value system of children and teenagers, it should make them ready for living in the global era of this century. The method used in this research is documentary, analyzing the documents. Studies in this field show that globalization influences the processes of the production, distribution and consumption of knowledge. The occurrence of this phenomenon in the information era has not only provided the necessary opportunities for educational exchanges worldwide but also offers advantages to developing countries, enabling them to strengthen the educational bases of their societies and take an important step toward their future.
Keywords: Globalization, education, global era.
1417 Operational Risk – Scenario Analysis
Authors: Milan Rippel, Petr Teply
Abstract:
This paper focuses on operational risk measurement techniques and on economic capital estimation methods. A data sample of operational losses provided by an anonymous Central European bank is analyzed using several approaches. The Loss Distribution Approach and the scenario analysis method are considered. Custom plausible loss events defined in a particular scenario are merged with the original data sample, and their impact on capital estimates and on the financial institution is evaluated. Two main questions are assessed: What is the most appropriate statistical method to measure and model the operational loss data distribution? And what is the impact of hypothetical plausible events on the financial institution? The g&h distribution was evaluated to be the most suitable one for operational risk modeling. The method based on the combination of historical loss event modeling and scenario analysis provides reasonable capital estimates and allows for measuring the impact of extreme events on banking operations.
Keywords: operational risk, scenario analysis, economic capital, loss distribution approach, extreme value theory, stress testing.
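As a rough sketch of the Loss Distribution Approach mentioned above (not the paper's g&h-based model), the snippet below simulates annual aggregate losses from a frequency and a severity distribution and reads off a high quantile as the economic capital estimate; a lognormal severity is used as a simple stand-in for the g&h distribution, and all parameters are invented.

```python
# Loss Distribution Approach sketch: simulate annual aggregate losses and take a
# high quantile as the economic capital estimate. Lognormal severity is a simple
# stand-in for the g&h distribution; all parameters are invented.
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000
annual_losses = np.empty(n_years)

for i in range(n_years):
    n_events = rng.poisson(lam=25)                       # loss frequency
    severities = rng.lognormal(mean=10.0, sigma=2.0, size=n_events)
    annual_losses[i] = severities.sum()

var_999 = np.quantile(annual_losses, 0.999)              # 99.9% Value-at-Risk
expected_loss = annual_losses.mean()
print(f"capital (VaR 99.9%): {var_999:,.0f}, expected loss: {expected_loss:,.0f}")

# A scenario event could be merged in by appending its plausible loss to the sample
# before re-estimating the quantile.
```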
1416 Issues in the User Interface Design of a Content Rich Vocational Training Application for Digitally Illiterate Users
Authors: Jamie Otelsberg, Nagarajan Akshay, Rao R. Bhavani
Abstract:
This paper discusses our preliminary experiences in the design of the user interface of computerized, content-rich vocational training courseware meant for users with little or no computer experience. In targeting a growing population with limited access to skills training of any sort, we faced numerous challenges, including language and cultural differences, resource limits, gender boundaries and, in many cases, a simple lack of trainee motivation. With the size of the unskilled population increasing much more rapidly than the number of sufficiently skilled teachers, there is little choice but to develop teaching techniques that take advantage of emerging computer-based training technologies. However, in striving to serve populations with minimal computer literacy, one must carefully design the user interface to accommodate their cultural, social, educational, motivational and other differences. Our work, which uses computer-based and haptic simulation technologies to deliver training to these populations, has provided some useful insights into potential user interface design approaches.
Keywords: User interface design, digitally illiterate, vocational training, navigation issues, computer human interaction, human factors.
1415 An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure
Authors: Fiona Browne, Huiru Zheng, Haiying Wang, Francisco Azuaje
Abstract:
Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple "omic" datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of the prediction of PPI networks. This paper addresses these limitations by proposing a Bayesian approach to integrate multiple datasets, some of which encode the same type of "omic" data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the Naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
Keywords: Bayesian network, classification, data integration, protein interaction networks.
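To illustrate the general evidence-integration idea behind such approaches (a sketch of the baseline naive Bayes combination, not the authors' fully-connected Bayesian network), per-dataset likelihood ratios for a candidate protein pair can be multiplied with a prior; the likelihood-ratio values and the prior odds below are invented, not taken from the heart failure datasets.

```python
# Sketch of evidential integration for a candidate protein pair: under a naive Bayes
# assumption, per-dataset likelihood ratios multiply into a combined score.
# The numbers are invented for illustration.
from math import prod

PRIOR_ODDS = 1 / 600          # assumed prior odds that a random pair interacts

def combined_posterior_odds(likelihood_ratios, prior=PRIOR_ODDS):
    """Multiply per-dataset likelihood ratios with the prior odds."""
    return prior * prod(likelihood_ratios)

# Evidence for one candidate pair from three gene expression datasets.
lrs = [12.0, 3.5, 0.8]
odds = combined_posterior_odds(lrs)
probability = odds / (1 + odds)
print(f"posterior odds: {odds:.3f}, probability of interaction: {probability:.3f}")
```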
1414 A Supplier-Manufacturer Relationship Model for Teak Forest Carbon Sequestration and Teak Log Demand Fulfillment with Sustainability Consideration
Authors: Ririn Dewi Cahyani, Muh. Hisjam, Wahyudi Sutopo, Kuncoro Harto Widodo
Abstract:
The availability of raw materials is important for Indonesia as a furniture-exporting country. Teak logs are supplied as raw material to the furniture industry by Perum Perhutani (PP). PP needs to be involved in carbon trading for nature conservation and also has obligations under its Corporate Social Responsibility program. Both PP and the furniture industry must comply with regulations related to ecological issues and labor rights. This study has the objective of creating a relationship model between supplier and manufacturer to fulfill teak log demand while involving teak forest carbon sequestration. A model is formulated as a goal program to obtain a favorable solution for teak log procurement that supports carbon sequestration and considers the economic, ecological, and social aspects of both supplier and manufacturer. The results show that the proposed model can be used to determine the teak log quantity, involving carbon trading, that satisfies the seven goals representing the sustainability considerations.
Keywords: Availability of teak log, support carbon sequestration, goal programming, sustainability consideration.
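As a tiny, hypothetical illustration of how goal programming works (the paper's model has seven goals; the two goals, targets, and weights here are invented), each goal gets deviation variables whose weighted sum is minimized as a linear program:

```python
# Tiny goal-programming sketch solved as a linear program with SciPy: one decision
# variable (teak log quantity) and two invented goals with deviation variables.
from scipy.optimize import linprog

demand_target = 10_000      # m3 requested by the manufacturer (hypothetical)
harvest_ceiling = 8_000     # m3 compatible with carbon sequestration (hypothetical)
w_under_demand, w_over_harvest = 1.0, 2.0

# Variables: [x, d_demand_minus, d_demand_plus, d_carbon_minus, d_carbon_plus]
c = [0, w_under_demand, 0, 0, w_over_harvest]   # penalize unmet demand and over-harvest
A_eq = [
    [1, 1, -1, 0, 0],    # x + d- - d+ = demand target
    [1, 0, 0, 1, -1],    # x + d- - d+ = harvest ceiling
]
b_eq = [demand_target, harvest_ceiling]

result = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 5)
x = result.x[0]
print(f"teak log quantity: {x:.0f} m3, unmet demand: {result.x[1]:.0f} m3")
```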
1413 How to Integrate Sustainability in Technological Degrees: Robotics at UPC
Authors: Antoni Grau, Yolanda Bolea, Alberto Sanfeliu
Abstract:
Embedding sustainability in technological curricula has become a crucial factor for educating engineers with competences in sustainability. In 2008, the Technical University of Catalonia (UPC) designed the Sustainable Technology Excellence Program STEP 2015 in order to ensure successful sustainability embedding. This program takes advantage of the opportunity offered by the redesign of all Bachelor and Master Degrees in Spain by 2010 under the European Higher Education Area (EHEA) framework. The STEP program goals are: to design compulsory courses in each degree; to develop the conceptual base and identify reference models in sustainability for all specialties at UPC; to create an internal interdisciplinary network of faculty from all the schools; to initiate new transdisciplinary research activities in technology-sustainability-education; to spread the know-how attained; to achieve international scientific excellence in technology-sustainability-education; and to graduate the first engineers/architects of the new EHEA bachelors with sustainability as a generic competence. Specifically, in this paper the authors explain their experience in leading the STEP program, and two examples are presented: the Industrial Robotics subject and the curriculum for the School of Architecture.
Keywords: Sustainability, curricula improvement, robotics.
1412 Content-Based Image Retrieval Using HSV Color Space Features
Authors: Hamed Qazanfari, Hamid Hassanpour, Kazem Qazanfari
Abstract:
In this paper, a method is provided for content-based image retrieval. A content-based image retrieval system searches for a query image based on its visual content in an image database to retrieve similar images. In this paper, with the aim of simulating the sensitivity of the human visual system to image edges and color features, the concept of the color difference histogram (CDH) is used. The CDH encodes the perceptual color difference between two neighboring pixels with regard to colors and edge orientations. Since the HSV color space is close to the human visual system, the CDH is calculated in this color space. In addition, to improve the color features, the color histogram in HSV color space is also used as a feature. Among the extracted features, efficient features are selected using entropy and correlation criteria. The final features extract the content of images most efficiently. The proposed method has been evaluated on three standard databases: Corel 5k, Corel 10k and UKBench. Experimental results show that the accuracy of the proposed image retrieval method is significantly improved compared to recently developed methods.
Keywords: Content-based image retrieval, color difference histogram, efficient features selection, entropy, correlation.
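As a minimal sketch of the plainer of the two features described above, a quantized HSV color histogram compared with an L1 distance (the CDH itself and the entropy/correlation feature selection are not reproduced; the bin counts and toy images are arbitrary):

```python
# Sketch of an HSV color histogram retrieval feature compared with an L1 distance.
import numpy as np

def rgb_to_hsv(img):
    """Convert an RGB image in [0, 1] to HSV (all channels scaled to [0, 1])."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc, minc = img.max(axis=-1), img.min(axis=-1)
    v = maxc
    delta = maxc - minc
    s = np.where(maxc > 0, delta / np.maximum(maxc, 1e-12), 0.0)
    h = np.zeros_like(maxc)
    mask = delta > 0
    rmax = mask & (maxc == r)
    gmax = mask & (maxc == g) & ~rmax
    bmax = mask & ~rmax & ~gmax
    h[rmax] = ((g - b)[rmax] / delta[rmax]) % 6
    h[gmax] = (b - r)[gmax] / delta[gmax] + 2
    h[bmax] = (r - g)[bmax] / delta[bmax] + 4
    return np.stack([h / 6.0, s, v], axis=-1)

def hsv_histogram(img, bins=(8, 3, 3)):
    """Quantized, normalized HSV histogram used as the retrieval feature."""
    hsv = rgb_to_hsv(img)
    hist, _ = np.histogramdd(hsv.reshape(-1, 3), bins=bins, range=((0, 1),) * 3)
    return hist.ravel() / hist.sum()

def l1_distance(f1, f2):
    return np.abs(f1 - f2).sum()

# Toy query/database comparison with random images.
rng = np.random.default_rng(1)
query, candidate = rng.random((32, 32, 3)), rng.random((32, 32, 3))
print(l1_distance(hsv_histogram(query), hsv_histogram(candidate)))
```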
1411 A Method for Iris Recognition Based on 1D Coiflet Wavelet
Authors: Agus Harjoko, Sri Hartati, Henry Dwiyasa
Abstract:
There have been numerous implementations of security systems using biometrics, especially for identification and verification cases. An example of a pattern used in biometrics is the iris pattern in the human eye. The iris pattern is considered unique for each person. The use of the iris pattern poses problems in encoding the human iris. In this research, an efficient iris recognition method is proposed. In the proposed method, iris segmentation is based on the observation that the pupil has lower intensity than the iris, and the iris has lower intensity than the sclera. By detecting the boundary between the pupil and the iris and the boundary between the iris and the sclera, the iris area can be separated from the pupil and sclera. A step is taken to reduce the effect of eyelashes and the specular reflection of the pupil. Then the four-level Coiflet wavelet transform is applied to the extracted iris image. The modified Hamming distance is employed to measure the similarity between two irises. This research yields an identification success rate of 84.25% for the CASIA version 1.0 database. The method gives an accuracy of 77.78% for the left eyes of the MMU 1 database and 86.67% for the right eyes. The time required for the encoding process, from segmentation until the iris code is generated, is 0.7096 seconds. These results show that the accuracy and speed of the method are better than those of many other methods.
Keywords: Biometric, iris recognition, wavelet transform.
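As a rough sketch of the encoding and matching steps described above (segmentation and the authors' modified Hamming distance are not reproduced), the snippet below takes a pre-segmented, normalized iris image, applies a four-level Coiflet wavelet decomposition with PyWavelets, binarizes the coefficients by sign into an iris code, and compares two codes with a plain Hamming distance; the random "iris" images are placeholders.

```python
# Sketch: Coiflet wavelet coefficients binarized into an iris code, compared with a
# plain Hamming distance. Segmentation and the modified distance are not reproduced.
import numpy as np
import pywt

def iris_code(normalized_iris):
    """Binarize 4-level Coiflet ('coif1') wavelet coefficients by their sign."""
    coeffs = pywt.wavedec2(normalized_iris, wavelet="coif1", level=4)
    arrays = [coeffs[0]] + [band for level in coeffs[1:] for band in level]
    flat = np.concatenate([a.ravel() for a in arrays])
    return flat > 0

def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two iris codes."""
    return np.count_nonzero(code_a != code_b) / code_a.size

rng = np.random.default_rng(0)
iris_a = rng.random((64, 256))   # stands in for a normalized iris strip
iris_b = rng.random((64, 256))
print(hamming_distance(iris_code(iris_a), iris_code(iris_b)))   # ~0.5 for unrelated irises
print(hamming_distance(iris_code(iris_a), iris_code(iris_a)))   # 0.0 for the same iris
```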
1410 Cell Phone: A Vital Clue
Authors: Meenakshi Mahajan, Arun Sharma, Navendu Sharma
Abstract:
The increasing use of the cell phone as a medium of human interaction is playing a vital role in solving riddles of crime as well. A young girl went missing from her home late in the evening in August 2008, when her enraged relatives and villagers physically assaulted and chased her fiancé, who often frequented her home. Two years later, her mother lodged a complaint against the relatives and the villagers, alleging that after abduction her daughter had either been sold or killed, as she had failed to trace her. On investigation, a rusted cell phone with a partially visible IMEI number, clothes, bangles, a human skeleton, etc., recovered from an abandoned well in May 2011, were examined in the lab. All hopes were pinned on the identity of the cell phone, the only linking evidence to fix the scene of occurrence, supported by the call detail record (CDR), and to dispel doubts about the mode of sudden disappearance or death, as DNA technology did not help in establishing the identity of the deceased. Conventional scientific methods were used without success, and the international mobile equipment identity number of the cell phone could be generated by using statistical analysis followed by online verification.
Keywords: Call detail record, Luhn algorithm, stereomicroscope.
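The keywords name the Luhn algorithm: the fifteenth digit of an IMEI is a Luhn check digit over the first fourteen, so reconstructed candidates for a partially visible IMEI can be screened before online verification. A minimal sketch (the IMEI body below is a standard textbook example, not the case exhibit):

```python
# Luhn check digit for an IMEI: the 15th digit validates the first 14.
def luhn_check_digit(digits14: str) -> int:
    """Compute the Luhn check digit for the first 14 digits of an IMEI."""
    total = 0
    for i, ch in enumerate(digits14):
        d = int(ch)
        if i % 2 == 1:          # double every second digit (0-based odd positions)
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def is_valid_imei(imei15: str) -> bool:
    return luhn_check_digit(imei15[:14]) == int(imei15[14])

candidate = "49015420323751"                          # example 14-digit body
print(candidate + str(luhn_check_digit(candidate)))   # full 15-digit IMEI
print(is_valid_imei("490154203237518"))               # True
```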
1409 Gold-Mediated Modification of Apoferritin Surface with Targeting Antibodies
Authors: Simona Dostalova, Pavel Kopel, Marketa Vaculovicova, Vojtech Adam, Rene Kizek
Abstract:
To ensure targeting of an apoferritin nanocarrier with encapsulated doxorubicin, we used a peptide linker based on protein G, whose N-terminus has affinity towards the Fc region of antibodies. To connect the peptide to the surface of apoferritin, the C-terminus of the peptide was made of cysteine, with affinity to gold. The surface of apoferritin with encapsulated doxorubicin (APODOX) was coated either with gold nanoparticles (APODOX-Nano) or with gold(III) chloride hydrate reduced with sodium borohydride (APODOX-HAu). The reduction with sodium borohydride caused a loss of the fluorescent properties of doxorubicin, probably accompanied by a loss of its biological activity. The fluorescent properties of APODOX-Nano were similar to those of unmodified APODOX; therefore, it was better suited for the intended use. To evaluate the specificity of apoferritin modified with antibodies, an ELISA-like method was used, with the surface of microtitration plate wells coated with the antigen (goat anti-human IgG antibodies). To these wells, the nanocarrier was applied. APODOX without the modification showed 5× lower affinity to the antigen than APODOX-Nano modified with gold and targeting antibodies (human IgG antibodies).
Keywords: Antibody targeting, apoferritin, doxorubicin, nanocarrier.
1408 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment
Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane
Abstract:
Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in the digital investigation are not keeping up with criminal developments. Therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. This research paper aims at developing a multi-agent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5% of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
Keywords: Artificial intelligence, computer science, criminal investigation, digital forensics.
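To give a concrete sense of the kind of task the Hash Set Agent's name suggests (a generic sketch, not the framework's agent code), files under an evidence directory can be hashed and checked against a known-hash set; the directory path and the hash set are placeholders, not the Lone Wolf dataset.

```python
# Sketch of a hash-set style check: hash every file under an evidence directory and
# flag matches against a set of known hashes. Path and hash set are placeholders.
import hashlib
from pathlib import Path

KNOWN_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # empty file
}

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(evidence_dir: str) -> None:
    for path in Path(evidence_dir).rglob("*"):
        if path.is_file() and sha256_of(path) in KNOWN_SHA256:
            print(f"match: {path}")

scan("./evidence")   # hypothetical evidence mount point
```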
1407 Investigating the UAE Residential Valuation System: A Framework for Analysis
Authors: Simon Huston, Ebraheim Lahbash, Ali Parsa
Abstract:
The development of the United Arab Emirates (UAE) into a regional trade, tourism, finance and logistics hub has transformed its real estate markets. However, speculative activity and price volatility remain concerns. UAE residential market values (MV) are exposed to fluctuations in capital flows and migration which, in turn, are affected by geopolitical uncertainty, oil price volatility and global investment market sentiment. Internally, a complex interplay between administrative boundaries, land tenure, building quality and evolving location characteristics fragments UAE residential property markets. In short, the UAE Residential Valuation System (UAE-RVS) confronts multiple challenges in collecting, filtering and analyzing relevant information in complex and dynamic spatial and capital markets. A robust RVS can mitigate the risk of unhelpful volatility, speculative excess or investment mistakes. The research outlines the institutional, ontological, dynamic and epistemological issues at play. We highlight the importance of system capabilities, valuation standard salience and stakeholder trust.
Keywords: Valuation, property rights, information, institutions, trust, salience.
1406 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius
Authors: M. A. S. Fahim, J. Sužiedelytė Visockienė
Abstract:
With the growing concern over air pollution (AP), it is clear that this issue has gained more prominence than ever before. The level of consciousness has increased, and knowledge now has to be passed on as a duty by those enlightened enough to disseminate it to others. This realization often comes after an understanding of how poor air quality indices (AQI) damage human health. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines concentrations of particulate matter 10 micrometers or less in diameter (PM10). Utilizing Gaussian Process Regression (GPR), Regression Tree Ensemble, and Regression Tree methodologies, predictive forecasting models are validated and tested using hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing.
Keywords: Air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter.
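As a minimal sketch of the GPR workflow named above (synthetic data and an assumed RBF-plus-noise kernel, not the Vilnius dataset or the study's exact configuration), a Gaussian process regressor can be fit on hourly features and scored by RMSE:

```python
# Sketch of the GPR workflow: fit on hourly features, predict PM10, report RMSE.
# Data are synthetic and the kernel is an assumption; only the pattern mirrors the study.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((500, 3))                                       # e.g. wind speed, temperature, hour
y = 20 + 15 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 2, 500)    # synthetic PM10, µg/m3

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X_train, y_train)

rmse = np.sqrt(mean_squared_error(y_test, gpr.predict(X_test)))
print(f"test RMSE: {rmse:.2f} µg/m3")
```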
1405 Real-time 3D Feature Extraction without Explicit 3D Object Reconstruction
Authors: Kwangjin Hong, Chulhan Lee, Keechul Jung, Kyoungsu Oh
Abstract:
For communication between human and computer in an interactive computing environment, gesture recognition is studied vigorously. Therefore, a lot of studies have proposed efficient recognition algorithms using images captured by 2D cameras. However, there is a limitation to these methods: the extracted features cannot fully represent the object in the real world. Although many studies have used 3D features instead of 2D features for more accurate gesture recognition, problems such as the processing time needed to generate 3D objects are still unsolved in related research. Therefore, we propose a method to extract the 3D features combined with the 3D object reconstruction. This method uses a modified GPU-based visual hull generation algorithm which disables unnecessary processes, such as texture calculation, to generate three kinds of 3D projection maps as the 3D features: the nearest boundary, the farthest boundary, and the thickness of the object projected onto the base plane. In the experimental results section, we present results of the proposed method on eight human postures: T shape, both hands up, right hand up, left hand up, hands front, stand, sit, and bend, and compare the computational time of the proposed method with that of previous methods.
Keywords: Fast 3D Feature Extraction, Gesture Recognition, Computer Vision.
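To make the three projection maps concrete (a sketch from a boolean occupancy volume such as a visual hull would produce; the GPU-based generation itself is not reproduced and the volume is a toy), for each column over the base plane one can take the nearest occupied height, the farthest one, and their difference:

```python
# Sketch of the three 3D projection maps: nearest boundary, farthest boundary, and
# thickness per (x, y) column of a boolean occupancy volume. Toy data only.
import numpy as np

def projection_maps(volume):
    """volume[x, y, z] is True where the object occupies that voxel."""
    occupied = volume.any(axis=2)
    z_index = np.arange(volume.shape[2])
    nearest = np.where(occupied, np.where(volume, z_index, volume.shape[2]).min(axis=2), 0)
    farthest = np.where(occupied, np.where(volume, z_index, -1).max(axis=2), 0)
    thickness = np.where(occupied, farthest - nearest + 1, 0)
    return nearest, farthest, thickness

# Toy volume: a 10x10x16 grid with a box occupying z = 4..9 in one corner.
vol = np.zeros((10, 10, 16), dtype=bool)
vol[2:6, 2:6, 4:10] = True
near, far, thick = projection_maps(vol)
print(near[3, 3], far[3, 3], thick[3, 3])   # -> 4 9 6
```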
1404 A Sociological Study of Rural Women Attitudes toward Education, Health and Work outside Home in Beheira Governorate, Egypt
Authors: A. A. Betah
Abstract:
This research was performed to evaluate the attitudes of rural women towards education, health, and work outside the home. The study was based on a random sample of 147 rural women; Kafr-Rahmaniyah village was chosen for the study because its life expectancy at birth for females, female education, and percentage of females in the labor force were the highest in the district. The study data were collected from rural female respondents using a face-to-face questionnaire. In addition, the study estimated several factors such as age, main occupation, family size, monthly household income, geographic cosmopoliteness, and degree of social participation for the rural women respondents. Using the Statistical Package for the Social Sciences (SPSS), the data were analyzed by non-parametric statistical methods. The main finding of this study was a significant relationship between each of the previous variables and each of the rural women's attitudes toward education, health, and work outside the home. The study concluded with some recommendations, the most important being to ensure attention to rural women's needs, requirements, and rights by raising their health awareness, education, and contributions to their society.
Keywords: Attitudes, education, health, rural women, work outside the home.
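As a minimal sketch of one non-parametric test of the kind implied above (the abstract does not name the specific test; a chi-square test of association is assumed here, and the contingency table is invented, not the survey data):

```python
# Sketch: chi-square test of association between a respondent factor (education
# level) and attitude toward work outside the home. The table is invented.
from scipy.stats import chi2_contingency

#                 positive  neutral  negative   (attitude)
table = [[30, 10, 8],    # no formal education
         [38, 12, 6],    # basic education
         [35, 5, 3]]     # secondary or higher (totals 147 respondents)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```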
1403 Appraisal of Relativistic Effects on GNSS Receiver Positioning
Authors: I. Yakubu, Y. Y. Ziggah, E. A. Gyamera
Abstract:
The Global Navigation Satellite System (GNSS) era started with the launch of the United States Department of Defense Global Positioning System (GPS). GNSS has grown over the years to include GLONASS (Russia), Galileo (European Union), and BeiDou (China). Any GNSS architecture consists of three major segments: the space, control, and user segments. Errors such as multipath, ionospheric and tropospheric effects, satellite clocks, receiver noise, and orbit errors (relativistic effects) have significant effects on GNSS positioning. To obtain centimeter-level accuracy, the impacts of the relative motion of the satellites and the Earth need to be taken into account. This paper discusses the relevance of the theory of relativity as a source of error for GNSS receivers in position fixing, based on the available relevant literature. A review of the relevant literature reveals that, due to relativity, time dilation, gravitational frequency shift, and the Sagnac effect cause a significant influence on the use of GNSS receivers for positioning, with an error range of ±2.5 m based on pseudo-range computation.
Keywords: GNSS, relativistic effects, pseudo-range, accuracy.
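A back-of-the-envelope calculation of the two dominant effects named above, using standard constants and assuming a circular GPS-like orbit, reproduces the well-known net satellite clock offset of roughly +38 microseconds per day; the ±2.5 m figure cited in the review is a separate pseudo-range error estimate and is not re-derived here.

```python
# Time dilation and gravitational frequency shift for a GPS-like orbit
# (standard constants; circular orbit assumed).
import math

C = 299_792_458.0          # speed of light, m/s
GM = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m
R_ORBIT = 2.656e7          # GPS orbital radius, m

v = math.sqrt(GM / R_ORBIT)                           # orbital speed, ~3.87 km/s
time_dilation = -(v**2) / (2 * C**2)                  # satellite clock runs slower
grav_shift = GM * (1 / R_EARTH - 1 / R_ORBIT) / C**2  # satellite clock runs faster

net_per_day = (time_dilation + grav_shift) * 86_400   # seconds per day
print(f"net clock offset: {net_per_day * 1e6:.1f} microseconds/day")
print(f"equivalent range error if uncorrected: {net_per_day * C / 1000:.0f} km/day")
```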
1402 Bearing Capacity of Sheet Hanger Connection to the Trapezoidal Metal Sheet
Authors: Kateřina Jurdová
Abstract:
Hanging from trapezoidal sheeting by means of a decking hanger is a very widespread solution used in civil engineering to route energy, sanitary, and air distribution systems under the roof or floor structure. The trapezoidal decking hanger is usually part of a whole installation system for a specific distribution medium. The leading companies offer installation systems for each specific distribution, e.g., pipe rings, sprinkler systems, installation channels, etc. Every specific part is connected to the base connector, which is the decking hanger. The connection itself has three main components: the decking hanger, a threaded bar with nuts, and the web of the trapezoidal sheet. The aim of this contribution is to determine the failure mechanism of each component in the connection. The load-bearing capacity of most components in the connection can be calculated by formulas in the European codes. This contribution focuses on the bearing resistance of the threaded bar in the web of the trapezoidal sheet. The issue is studied by experimental research and numerical modelling. This contribution presents the initial results of the experiment, which are compared with a numerical model of the specimen.
Keywords: Decking hanger, concentrated load, connection, load bearing capacity, trapezoidal metal sheet.