Search results for: learning structure

342 Identifying Teachers’ Perception of Integrity in School-Based Assessment Practice: A Case Study

Authors: Abd Aziz Bin Abd Shukor, Eftah Binti Moh Hj Abdullah

Abstract:

This case study aims to identify teachers’ perceptions of integrity in School-Based Assessment (PBS) practice. The descriptive study involved 9 teachers from 4 secondary schools in 3 districts in the state of Perak. The respondents were interviewed on integrity in PBS practice using a focus group discussion method. The overall findings showed that the teachers believed integrity in PBS practice could be achieved by aligning teaching methods with the learning objectives and the students’ characteristics. Many teachers, parents and students did not understand best practice in PBS, which affected the integrity of PBS practice. Teachers did not emphasise principles and ethics. Their integrity as innovative public servants may also be affected by the frequently changing assessment system, lack of training and the absence of prior action research. The analysis of the findings showed that the teachers viewed organizational integrity, including the integrity of PBS, as difficult to implement according to the expectations set by the Malaysian Ministry of Education (KPM). Elements that assisted in achieving PBS integrity were training, the students’ understanding, the parents’ understanding of PBS, and the environment (involving human resources such as support and appreciation, and non-human resources such as technology infrastructure readiness and media). The implications of this study show that teachers, as the PBS implementers, have a strong influence on the integrity of PBS. However, transforming behaviour related to PBS integrity among teachers requires stable support and infrastructure to enable them to implement PBS in an ethical manner.

Keywords: Assessment integrity, integrity, perception, school-based assessment.

341 Morphological Characteristics and Development of the Estuary Area of Lam River, Vietnam

Authors: Hai Nguyen Tien

Abstract:

On the basis of the structure of alluvial sediments interpreted from echo sounding data and remote sensing images, the following results can be given. The estuary of the Lam River from Ben Thuy Bridge (original word: Bến Thủy) to Cua Hoi (original word: Cửa Hội) is divided into three channels (locations are given with respect to the river bank on the Nghe An Province side, original word: Nghệ An): i) channel I (from Ben Thuy Bridge to Hung Hoa, original word: Hưng Hòa) is the branching river; ii) channel II (from Hung Hoa to Nghi Thai, original word: Nghi Thái) is a channel that develops in a meandering direction with its concave side toward Ha Tinh Province (Hà Tĩnh); iii) channel III (from Nghi Thai to Cua Hoi) is a channel that develops in a meandering direction with its concave side toward Nghe An Province. This estuary area was formed in the period from after the sea level dropped below 0 m (the current water level) to the present: i) channel II developed moving towards Ha Tinh Province; ii) channel III developed moving towards Nghe An Province; iii) in channel I, a second river branch formed because the river flow cut through the Hong Lam - Hong Nhat mudflat (original word: Hồng Lam - Hồng Nhất), at the same time creating an island. The morphological characteristics of the estuary area of the Lam River are mainly the result of erosion and deposition corresponding to two water levels: a level about 2 m lower than the current water level, and the current water level. The characteristics of the sediment layers on the riverbed in the estuary can be used to determine sea levels from the Late Holocene to the present.

Keywords: Lam River, development, Cua Hoi, river morphology

340 FPGA Hardware Implementation and Evaluation of a Micro-Network Architecture for Multi-Core Systems

Authors: Yahia Salah, Med Lassaad Kaddachi, Rached Tourki

Abstract:

This paper presents the design, implementation and evaluation of a micro-network, or Network-on-Chip (NoC), based on a generic pipeline router architecture. The router is designed to efficiently support the traffic generated by multimedia applications on embedded multi-core systems. It employs a simple routing mechanism and implements a round-robin scheduling strategy to resolve output port contentions and minimize latency. Virtual channel flow control is applied to avoid the head-of-line blocking problem and enhance performance in the NoC. The hardware design of the router architecture has been implemented at the register transfer level; its functionality is evaluated for the two-dimensional Mesh/Torus topology, and performance results are derived from the ModelSim simulator and the Xilinx ISE 9.2i synthesis tool. An example of a multi-core image processing system utilizing the NoC structure has been implemented and validated to demonstrate the capability of the proposed micro-network architecture. To reduce the complexity of the image compression and decompression architecture, the system uses an image processing algorithm based on the classical discrete cosine transform with an efficient zonal processing approach. The experimental results confirm that both the proposed image compression scheme and the NoC architecture can achieve reasonable image quality with low processing time.
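
As one illustration of the zonal DCT approach mentioned above, the sketch below keeps only a low-frequency zone of an 8x8 block's DCT coefficients; the block size, zone size, and use of SciPy are assumptions for illustration, not the authors' hardware implementation.

```python
import numpy as np
from scipy.fft import dctn, idctn

def zonal_dct_compress(block: np.ndarray, zone: int = 4) -> np.ndarray:
    """Keep only the top-left `zone` x `zone` (low-frequency) DCT coefficients."""
    coeffs = dctn(block, norm="ortho")          # 2-D DCT of the image block
    mask = np.zeros_like(coeffs)
    mask[:zone, :zone] = 1.0                    # zonal mask: retain low frequencies only
    return idctn(coeffs * mask, norm="ortho")   # reconstruct from the retained zone

# Example: compress a random 8x8 block (stand-in for real image data)
block = np.random.rand(8, 8)
approx = zonal_dct_compress(block, zone=4)
print(np.abs(block - approx).mean())            # mean reconstruction error
```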

Keywords: Generic Pipeline Network-on-Chip Router Architecture, JPEG Image Compression, FPGA Hardware Implementation, Performance Evaluation.

339 Estimation of Hysteretic Damping in Steel Dual Systems with Buckling Restrained Brace and Moment Resisting Frame

Authors: Seyed Saeid Tabaee, Omid Bahar

Abstract:

Nowadays, energy dissipation devices are commonly used in structures. The benefit of such devices is a high rate of energy absorption during earthquakes, which results in reduced damage to structural elements, specifically columns. The hysteretic damping capacity of energy dissipation devices is a key issue, as it may complicate the analysis and design process. This effect is generally represented by an Equivalent Viscous Damping (EVD), which may be obtained from the expected hysteretic behavior at the design or maximum considered displacement of a structure. In this paper, the hysteretic damping coefficient of a steel Moment Resisting Frame (MRF) whose performance is enhanced by a Buckling Restrained Brace (BRB) system is evaluated. Knowing in advance how damping is apportioned between the BRB and the MRF is essential for seismic design procedures such as the Direct Displacement-Based Design (DDBD) method. This paper presents an approach to calculate the damping fraction for such systems by carrying out dynamic nonlinear time history analysis (NTHA) under harmonic loading tuned to the natural frequency of the system. Two MRF structures, one equipped with a BRB and the other without, are studied simultaneously. Extensive analysis shows that the damping fraction of each system may be calculated from its share of the story shear. In this way, the contribution of each BRB in the floors, and their overall contribution to the structural performance, may be clearly recognized in advance.
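
The equivalent viscous damping referred to above is commonly estimated from the hysteresis loop with the classical (Jacobsen-type) expression below; this is the standard textbook form, not necessarily the exact formulation used by the authors:

\[
\xi_{eq} = \xi_0 + \frac{E_D}{4\pi E_{S0}}
\]

where E_D is the energy dissipated per cycle of harmonic response at the design displacement, E_{S0} is the corresponding elastic strain energy stored at that displacement, and ξ_0 is the inherent viscous damping of the bare frame.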

Keywords: Buckling restrained brace, Direct displacement based design, Dual systems, Hysteretic damping, Moment resisting frames.

338 Teaching Ethical Behaviour: Conversational Analysis in Perspective

Authors: Nikhil Kewal Krishna Mehta

Abstract:

In the past, researchers have questioned the effectiveness of ethics training in higher education. There are also observations supporting the view that the ethical behaviour (range of actions) and ethical decision making models used in the past rely on vignettes to explain ethical behaviour, and that such vignettes play a limited role: they capture individual intentions rather than actions. Some authors have also agreed that one’s intentions and actions may differ. This paper attempts to fill those gaps by evaluating real actions rather than intentions. In effect, this study suggests using an experiential methodology to explore Berlo’s model of communication as an action, along with the orchestration of various principles. To this end, conversational analysis was used to evaluate ethical decision making behaviour among students and middle level managers. The process was repeated six times with groups of, on average, 15 participants. Similarities were observed in the behaviour of students and middle level managers, suggesting that both groups had no cognizance of their actual actions. The deliberations derived from the conversations were then taken a step further into meta-ethical evaluation to portray a clearer picture of ethical behaviour among participants. This study provides insights into demonstrated unconscious human behaviour, which may fortuitously be termed both ethical and unethical.

Keywords: Berlo’s action model of communication, Conversational Analysis, Ethical behaviour, Ethical decision making, experiential learning, Intentions and Actions.

337 Assessment of the Illustrated Language Activities of the Portage Guide to Early Education

Authors: Ofelia A. Damag

Abstract:

The study focused on the development and assessment of the illustrated language activities of the 1996 Edition of the Portage Guide to Early Education. It determined the extent of appropriateness, applicability, time efficiency and aesthetics of the illustrated language activities to be used as instructional material not only by teachers, but by parents and caregivers as well. An eclectic research design was applied in this study, using qualitative and quantitative methods. To determine the applicability and time efficiency of the activities, a try-out was done. Since an eclectic research design was used, the study made use of a researcher-made survey questionnaire and focus group discussion. Analysis of the data was done through weighted means and ANOVA. The respondents of the study were Special Education (SPED) teachers, caregivers and parents of special-needs children, particularly children with difficulties in learning basic language skills. The results show that a large number of respondents are SPED teachers and caregivers and are mostly college graduates, many of whom have earned units towards Master’s studies. Moreover, a majority of the respondents have not attended seminars or in-service training in early intervention that would make them more competent in their area of specialization. It is concluded that the illustrated language activities under review are appropriate, applicable, time efficient and aesthetic for use as a teaching tool. The recommendations focus on advocacy for SPED teachers, caregivers and parents of special-needs children to be more consistent in implementing the new instructional materials as an aid in an intervention program.

Keywords: Illustrated language activities, inclusion, portage guide to early education, special educational needs.

336 Analysis and Control of Camera Type Weft Straightener

Authors: Jae-Yong Lee, Gyu-Hyun Bae, Yun-Soo Chung, Dae-Sub Kim, Jae-Sung Bae

Abstract:

In general, fabric is heat-treated using a stenter machine in order to dry it and fix its shape. It is important to shape the fabric before the heat treatment because it is difficult to revert once the fabric is formed. To produce a product of the right shape, a camera type weft straightener has recently been applied to capture and process fabric images quickly; it is more effective than a photo-sensor in determining the final textile quality. Positioned in front of a stenter machine, the weft straightener helps to spread the fabric evenly and to keep the angle between warp and weft constant at a right angle by handling the skew and bow rollers. To handle this tricky procedure, a structural analysis should be carried out in advance, from which the control technology can be derived. The structural analysis determines the specific contact/slippage characteristics between the fabric and the rollers. We have already examined the applicability of the camera type weft straightener to plain weave fabric and established its feasibility and the specific working conditions of the machine and rollers. In this research, we aimed to explore a further application of the camera type weft straightener, namely whether it can also be used for special fabrics. To find the optimum condition, we increased the number of rollers. The analysis is done in ANSYS software using the Finite Element Analysis method, and the control function is demonstrated by experiment. In conclusion, the structural analysis of the weft straightener is done to identify the specific characteristics between the rollers and the fabrics, and the control of the skew and bow rollers is done to decrease the error in the angle between warp and weft. Finally, it is shown that the camera type straightener can also be used for special fabrics.

Keywords: Camera type weft straightener, structure analysis, control, skew and bow roller.

335 Markov Chain Based QoS Support for Wireless Body Area Network Communication in Health Monitoring Services

Authors: R. A. Isabel, E. Baburaj

Abstract:

Wireless Body Area Networks (WBANs) are essential for real-time health monitoring of patients and for the diagnosis of many diseases. WBANs comprise many sensors that monitor a large range of ambient conditions. Quality of Service (QoS) is a key challenge in WBANs, because the different state information of the neighboring nodes has to be monitored accurately. However, energy consumption increases while predicting and maintaining exact information in highly dynamic environments. In order to reduce energy consumption and end-to-end delay, a Markov Chain Based Quality of Service Support (MC-QoSS) method is designed for the health monitoring services of WBAN communication. Energy consumption is reduced by forming a Markov chain of high energy nodes in the sensor network communication path, and low energy sensor nodes are removed using transition probabilities in order to reduce end-to-end delay. High energy nodes are formed into the chain structure of the corresponding path to enhance communication. After choosing the communication path through high energy nodes, packets are sent from the source node to the sink node with a higher Packet Delivery Ratio. The simulation results show that the MC-QoSS method improves the packet delivery ratio and reduces energy consumption with minimum end-to-end delay, compared to existing methods.
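
A minimal sketch of the node-selection idea described above: transition probabilities are formed over candidate relay nodes and low-energy nodes are pruned so that only the high-energy chain remains. The energy model, threshold, and function names are illustrative assumptions, not the MC-QoSS specification.

```python
import numpy as np

def prune_low_energy_nodes(energies, threshold=0.1):
    """Form transition probabilities proportional to residual energy and drop weak nodes."""
    energies = np.asarray(energies, dtype=float)
    probs = energies / energies.sum()          # transition probability toward each neighbour
    keep = probs >= threshold                  # remove nodes with low transition probability
    return [i for i, k in enumerate(keep) if k], probs

# Example: five candidate relay nodes with residual energies (arbitrary units)
nodes, probs = prune_low_energy_nodes([0.9, 0.05, 0.6, 0.02, 0.7])
print(nodes)   # indices of high-energy nodes kept in the chain
```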

Keywords: Wireless body area networks, quality of service, Markov chain, health monitoring services.

334 Self-Help Adaptation to Flooding in Low-Income Settlements in Chiang Mai, Thailand

Authors: Nachawit Tikul

Abstract:

This study aimed to determine low-income housing adaptations to flooding, which causes living problems and housing damage, and the results of those improvements. Three low-income settlements in Chiang Mai which experienced different flood types, i.e. flash floods in Samukeepattana, drainage floods in Bansanku, and river floods in Kampangam, were chosen for the study. Almost all of the residents improved their houses to protect their property from flood damage by switching to flood-resistant building materials for walls, floors, and other parts of the structure below the base of the annual flood elevation. They could only build some parts of their homes themselves, so hiring skilled workers or contractors was still important. Building materials that require no special tools, are easy to obtain and use for construction, and are low in cost were selected. The residents of the three slums faced living problems only for a short time and were able to cope with them. This may be due to the location of the three slums near the city, where assistance is readily available. However, the housing and way of life in the slums can endure only regular floods, and residents still have problems in unusual floods, which have occurred 1-2 times during the past 10 years. The residents accept the need for evacuations and prepare for them. When faced with extreme floods, residents have evacuated to the nearest safe place, such as schools and public buildings, and returned to repair their houses after the flood. These are the distinguishing characteristics of low-income living, which can withstand serious situations due to its simple lifestyle. Therefore, the preparation of living areas for use during severe floods and the encouragement of affordable flood-resistant material production should be areas of concern when formulating disaster assistance policies for low-income people.

Keywords: Flooding, low-income settlement, housing, adaptation.

333 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing

Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill

Abstract:

In order to survive on the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company’s internal research and development department. However, a new approach has been taking place for some years now, involving external knowledge in the innovation process. This approach is called open innovation and identifies customer knowledge as the most important source in the innovation process. This paper presents a concept of using social media posts as an external source to support the open innovation approach in its initial phase, the Ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology and the authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use the findings of Natural Language Processing, e.g. Named Entity Recognition, specific dictionaries, Triple Tagger and Part-of-Speech-Tagger. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain an improved insight into external sources such as customer needs.
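
A minimal sketch of the Named Entity Recognition and part-of-speech steps mentioned above, using spaCy as one possible NLP toolkit; the library choice, model name, and example post are assumptions for illustration only.

```python
import spacy

# Assumes the small English model has been installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

post = "I wish BrandX made a waterproof version of their fitness tracker."
doc = nlp(post)

# Entities and noun-like tokens that could populate an idea ontology
print([(ent.text, ent.label_) for ent in doc.ents])
print([(tok.text, tok.pos_) for tok in doc if tok.pos_ in ("NOUN", "PROPN")])
```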

Keywords: Idea ontology, innovation management, open innovation, semantic search.

332 Microbubbles Enhanced Synthetic Phorbol Ester Degradation by Ozonolysis

Authors: Kuvshinov, D., Siswanto, A., Zimmerman, W. B.

Abstract:

A phorbol-12-myristate-13-acetate (TPA) is a synthetic analogue of phorbol ester (PE), a natural toxic compound of Euphorbiaceae plants. The oil extracted from plants of this family is a useful source primarily for biofuel. However, this oil might also be used as a foodstuff due to its significant nutritional content. The limitations on utilizing the oil as a foodstuff are mainly due to the toxicity of PE. Currently, the majority of PE detoxification processes are expensive, as they include a multi-step alcohol extraction sequence.

Ozone is a strong oxidative agent. It reacts with PE by attacking the carbon-carbon double bond of PE. This modification of the PE molecular structure yields a non-toxic ester with a high lipid content.

This report presents data on the development of a simple and cheap PE detoxification process using water as a buffer and ozone as the reactive component. The core of the new technique is the application of a new microscale plasma unit for ozone production, and the technology permits ozone injection into the water-TPA mixture in the form of microbubbles.

The efficacy of a heterogeneous process depends on the diffusion coefficient, which can be controlled by the contact time and the interfacial area. The low rise velocity of microbubbles and their high surface-to-volume ratio allow efficient mass transfer to be achieved during the process. Direct injection of ozone is the most efficient way to work with such a highly reactive and short-lived chemical.

Data on the behavior of the plasma unit are presented, and the influence of gas oscillation technology on the microbubble production mechanism is discussed. Data on the overall process efficacy for TPA degradation are shown.

Keywords: Microbubble, ozonolysis, synthetic phorbol ester.

331 The Effect of Motor Learning Based Computer-Assisted Practice for Children with Handwriting Deficit – Comparing with the Effect of Traditional Sensorimotor Approach

Authors: Shao-Hsia Chang, Nan-Ying Yu

Abstract:

The objective of this study was to test how advanced digital technology enables more effective handwriting training for children with handwriting deficits. The study implemented graphomotor apparatuses in a computer-assisted instruction system, and the experiments for verifying the intervention effect were conducted as a randomized controlled trial. Forty-two children with handwriting deficits were assigned to a computer-assisted instruction group, a sensorimotor training group, or a control (no intervention) group. Handwriting performance was measured using the Elementary reading/writing test and a computerized handwriting evaluation before and after 6 weeks of intervention. Analyses of variance of the change scores were conducted to test for statistically significant differences across the three groups. A significant difference was found among the three groups, with the computer group differing significantly from the other two. Significance was found in near-point copy, far-point copy, the dictation test, and writing from phonetic symbols. Writing speed and mean stroke velocity in near-point, far-point and short paragraph copy were also found to differ significantly among the three groups, with the computer group showing significant improvement over the other groups. For clinicians and school teachers, the results of this study provide a motor control based insight into the improvement of handwriting difficulties.
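
A minimal sketch of an analysis of variance on change scores for the three groups, as described above; the values are placeholders for illustration, not the study's data.

```python
from scipy import stats

# Placeholder change scores (post - pre) for the three groups -- illustrative only
computer_group     = [12.1, 10.4, 11.8, 13.0, 9.7]
sensorimotor_group = [7.2, 6.8, 8.1, 7.5, 6.9]
control_group      = [1.1, 0.4, 1.9, 0.8, 1.3]

f_stat, p_value = stats.f_oneway(computer_group, sensorimotor_group, control_group)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # difference is significant if p < 0.05
```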

Keywords: Dysgraphia, computerized handwriting evaluation, sensorimotor program, computer assisted program.

330 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end user development methods, programming could become available to everyone, enabling end users to program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also in Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel (since they selected the cells directly with the mouse) than in ISPM. Using natural language for end user software engineering in this way helps overcome the present bottleneck of relying on professional developers.
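
A minimal sketch of mapping a natural language command onto table operations over a known schema, in the spirit of the ISPM described above; the keyword-based parsing, column names, and use of pandas are illustrative assumptions, not the actual system.

```python
import pandas as pd

df = pd.DataFrame({"Product": ["A", "B", "C"], "Sales": [120, 90, 150]})

def run_command(command: str, table: pd.DataFrame):
    """Very small keyword-based dispatcher from natural language to table operations."""
    tokens = command.lower().split()
    column = next((c for c in table.columns if c.lower() in tokens), None)
    if "sum" in tokens and column:
        return table[column].sum()
    if "sort" in tokens and column:
        return table.sort_values(column)
    raise ValueError("command not understood")

print(run_command("sum the Sales column", df))   # 360
print(run_command("sort by Sales", df))
```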

Keywords: Natural language processing, end user development, natural language interfaces, human computer interaction, data recognition, dialog systems, spreadsheet.

329 RV-YOLOX: Object Detection on Inland Waterways Based on Optimized YOLOX through Fusion of Vision and 3+1D Millimeter Wave Radar

Authors: Zixian Zhang, Shanliang Yao, Zile Huang, Zhaodong Wu, Xiaohui Zhu, Yong Yue, Jieming Ma

Abstract:

Unmanned Surface Vehicles (USVs) hold significant value for their capacity to undertake hazardous and labor-intensive operations over aquatic environments, and object detection tasks are significant in these applications. Nonetheless, the efficacy of USVs in object detection is impeded by several intrinsic challenges, including the intricate dispersal of obstacles, reflections emanating from coastal structures, and the presence of fog over water surfaces, among others. To address these problems, this paper provides a fusion method for USVs to effectively detect objects in the inland surface environment, utilizing vision sensors and 3+1D millimeter-wave (MMW) radar. The MMW radar is a complementary tool to vision sensors, offering reliable environmental data. The approach involves the conversion of the radar’s 3D point cloud into a 2D radar pseudo-image, thereby standardizing the format for radar and vision data by leveraging a point transformer. Furthermore, this paper proposes a multi-source object detection network, named RV-YOLOX, which leverages radar-vision integration specifically tailored for inland waterway environments. The performance is evaluated on our self-recorded waterways dataset. Compared with the YOLOX network, our fusion network significantly improves detection accuracy, especially for objects under poor lighting conditions.
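
A minimal sketch of converting a 3D radar point cloud into a 2D bird's-eye pseudo-image, one common way to standardize radar and camera inputs before fusion; the grid size, resolution, and intensity channel are illustrative assumptions, not the RV-YOLOX pipeline.

```python
import numpy as np

def radar_to_pseudo_image(points, grid=(128, 128), cell=0.5):
    """points: (N, 4) array of x, y, z, intensity. Returns an (H, W) intensity map."""
    img = np.zeros(grid, dtype=np.float32)
    h, w = grid
    for x, y, _, intensity in points:
        col = int(x / cell + w / 2)            # lateral position -> column
        row = int(h - 1 - y / cell)            # forward range -> row (origin at bottom)
        if 0 <= row < h and 0 <= col < w:
            img[row, col] = max(img[row, col], intensity)
    return img

points = np.array([[2.0, 10.0, 0.3, 0.8], [-5.0, 25.0, 0.1, 0.4]])
pseudo = radar_to_pseudo_image(points)
print(pseudo.shape, pseudo.max())
```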

Keywords: Inland waterways, object detection, YOLO, sensor fusion, self-attention, deep learning.

328 Numerical Investigation of Nozzle Shape Effect on Shock Wave in Natural Gas Processing

Authors: Esam I. Jassim, Mohamed M. Awad

Abstract:

Natural gas flow contains undesirable solid particles, liquid condensation, and/or oil droplets and requires reliable removal equipment to perform filtration. Recent natural gas processing applications demand compactness and reliability of process equipment. Since conventional means are sophisticated in design, poor in efficiency, and lack robustness, a supersonic nozzle has been introduced as an alternative means to meet such demands. A 3-D convergent-divergent nozzle is simulated using a commercial code for nozzle pressure ratios (NPR) varying from 1.2 to 2. Six different nozzle shapes are numerically examined to illustrate the position of the shock wave, since this location can be considered a benchmark for particle separation. Rectangular, triangular, circular, elliptical, pentagonal, and hexagonal nozzles, all with the same cross-sectional area, are simulated using the Fluent code. The simple one-dimensional inviscid theory does not describe the actual features of the fluid flow precisely, as it ignores the impact of the nozzle configuration on the flow properties. The CFD simulation results, however, show that the nozzle geometry influences the flow structures, including the location of the shock wave. The CFD analysis predicts shock appearance when p01/pa > 1.2 for almost all geometries and locates the shock at the lower area ratio (Ae/At). The simulation results show that, at relatively small NPR, the shock wave in the elliptical nozzle is the farthest from the throat among the shapes considered; as NPR increases, the hexagonal nozzle becomes the farthest. The numerical results are compared with available experimental data and show good agreement in terms of shock location and flow structure.
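
For reference, the one-dimensional isentropic relation that the abstract contrasts with the CFD results ties the local area ratio to the Mach number and, as noted above, ignores the effect of cross-sectional shape:

\[
\frac{A}{A^{*}} = \frac{1}{M}\left[\frac{2}{\gamma+1}\left(1+\frac{\gamma-1}{2}M^{2}\right)\right]^{\frac{\gamma+1}{2(\gamma-1)}}
\]

where A* is the throat area, M the local Mach number, and γ the ratio of specific heats (roughly 1.3 for methane-rich natural gas).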

Keywords: CFD, Particle Separation, Shock wave, Supersonic Nozzle.

327 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer has become a pressing issue in medical science, and its rising incidence is affecting the health and well-being of the global village. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, since the stored image contains irregularities, for example around the center. The approach first locates the foreground of the extracted skin image, and image partitioning (segmentation) is applied to remove disturbances in the picture. Results: After partitioning, feature extraction is performed using a genetic algorithm (GA), and finally classification is carried out between the training and test data so that images can be evaluated at scale, helping doctors make the right prediction. To improve on the existing system, we set our objectives with an analysis; the efficiency of the natural selection process and histogram enrichment are essential in that respect, and the GA is applied to reduce the false-positive rate. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes its task of bringing down the false-positive rate. The paper also draws on the combination of deep learning and medical image processing, which provides superior accuracy, and proportional handling of the processing stages supports reusability without errors.
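
A minimal sketch of the histogram-enhancement step discussed above, using OpenCV's global equalization and CLAHE; the parameter values and file names are illustrative assumptions, not the authors' pipeline.

```python
import cv2

gray = cv2.imread("skin_lesion.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image

equalized = cv2.equalizeHist(gray)                            # global histogram equalization

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))   # contrast-limited adaptive version
enhanced = clahe.apply(gray)

cv2.imwrite("skin_lesion_enhanced.png", enhanced)
```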

Keywords: Computer-aided system, detection, image segmentation, morphology.

326 Evolutionary Origin of the αC Helix in Integrins

Authors: B. Chouhan, A. Denesyuk, J. Heino, M. S. Johnson, K. Denessiouk

Abstract:

Integrins are a large family of multidomain α/β cell signaling receptors. Some integrins contain an additional inserted I domain, whose earliest expression appears to be with the chordates, since it is observed in the urochordates Ciona intestinalis (vase tunicate) and Halocynthia roretzi (sea pineapple), but not in integrins of earlier diverging species. The domain's presence is viewed as a hallmark of integrins of higher metazoans; however, in vertebrates there are clearly three structurally different classes: integrins without I domains, and two groups of integrins with I domains, separable by the presence or absence of an additional αC helix. For example, the αI domains in collagen-binding integrins from Osteichthyes (bony fish) and all higher vertebrates contain the specific αC helix, whereas the αI domains in non-collagen-binding integrins from vertebrates and the αI domains from earlier diverging urochordate integrins, i.e. tunicates, do not. Unfortunately, within the early chordates there is an evolutionary gap due to extinctions between the tunicates and cartilaginous fish. This, coupled with a knowledge gap due to the lack of complete genomic data from surviving species, means that the origin of collagen-binding αC-containing αI domains remains unknown. Here, we analyzed two available genomes from Callorhinchus milii (ghost shark/elephant shark; Chondrichthyes – cartilaginous fish) and Petromyzon marinus (sea lamprey; Agnathostomata), and several available Expressed Sequence Tags from two Chondrichthyes species, Raja erinacea (little skate) and Squalus acanthias (dogfish shark), and from Eptatretus burgeri (inshore hagfish; Agnathostomata), which evolutionarily reside between the urochordates and Osteichthyes. In P. marinus, we observed several fragments coding for the αC-containing αI domain, allowing us to shed more light on the evolution of the collagen-binding integrins.

Keywords: Integrin αI domain, integrin evolution, collagen binding, structure, αC helix

325 Knowledge Transfer among Cross-Functional Teams as a Continual Improvement Process

Authors: Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander

Abstract:

The culture of continuous improvement in organizations is very important as it represents a source of competitive advantage. This article discusses the transfer of knowledge between companies which formed cross-functional teams and used a dynamic model for knowledge creation as a framework. In addition, the article discusses the structure of cognitive assets in companies and the concept of "stickiness" (which is defined as an obstacle to the transfer of knowledge). The purpose of this analysis is to show that an improvement in the attitude of individual members of an organization creates opportunities, and that an exchange of information and knowledge leads to generating continuous improvements in the company as a whole. This article also discusses the importance of creating the proper conditions for sharing tacit knowledge. By narrowing gaps between people, mutual trust can be created and thus contribute to an increase in sharing. The concept of adapting knowledge to new environments will be highlighted, as it is essential for companies to translate and modify information so that such information can fit the context of receiving organizations. Adaptation will ensure that the transfer process is carried out smoothly by preventing "stickiness". When developing the transfer process on cross-functional teams (as opposed to working groups), the team acquires the flexibility and responsiveness necessary to meet objectives. These types of cross-functional teams also generate synergy due to the array of different work backgrounds of their individuals. When synergy is established, a culture of continuous improvement is created.

Keywords: Knowledge transfer, continuous improvement, teamwork, cognitive assets.

324 Localizing and Recognizing Integral Pitches of Cheque Document Images

Authors: Bremananth R., Veerabadran C. S., Andy W. H. Khong

Abstract:

Automatic reading of handwritten cheques is a computationally complex process and plays an important role in financial risk management. Machine vision and learning provide a viable solution to this problem. Research effort has mostly focused on recognizing diverse pitches of cheques and demand drafts with an identical outline. However, most of these methods employ template matching to localize the pitches, and such schemes can fail when applied to the different types of outline maintained by different banks. In this paper, the so-called outline problem is resolved by a cheque information tree (CIT), which generalizes the localization method to extract active regions of entities. In addition, a weight based density plot (WBDP) is used to isolate text entities and read complete pitches. Recognition is based on texture features using neural classifiers, and the legal amount is subsequently recognized by both texture and perceptual features. A post-processing phase is invoked to detect incorrect readings with a Type-2 grammar using a Turing machine. The performance of the proposed system was evaluated using cheques and demand drafts of 22 different banks. The test data consist of a collection of 1540 leaves obtained from 10 different account holders from each bank. Results show that this approach can easily be deployed without significant design amendments.

Keywords: Cheque reading, Connectivity checking, Text localization, Texture analysis, Turing machine, Signature verification.

323 Dynamic Threshold Adjustment Approach For Neural Networks

Authors: Hamza A. Ali, Waleed A. J. Rasheed

Abstract:

The use of neural networks for recognition applications is generally constrained by the inflexibility of their parameters after the training phase: no adaptation is accommodated for input variations that influence the network parameters. In this work we design a neural network that includes an additional mechanism to adjust the threshold values according to input pattern variations. The new approach is based on splitting the whole network into two subnets: a main traditional net and a supportive net. The first deals with the required output of trained patterns with predefined settings, while the second handles output generation dynamically, with a tuning capability for any newly applied input. This tuning comes in the form of an adjustment to the threshold values. Two levels of supportive net were studied: one implements an extended additional layer with an adjustable neuronal threshold-setting mechanism, while the second implements an auxiliary net with a traditional architecture that dynamically adjusts the threshold values of the main net, which is constructed in a dual-layer architecture. Experimental results and analysis of the proposed designs are quite satisfactory. The supportive layer approach achieved over 90% recognition rate, while the multiple network technique shows a more effective and acceptable level of recognition; however, this is achieved at the price of network complexity and computation time. Recognition generalization may also be improved by combining these structural capabilities with further advanced learning phases.
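
A minimal sketch of the threshold-adjustment idea: a main net fires against an explicit threshold while a supportive component nudges that threshold toward the statistics of each new input. The update rule, weights, and numbers are illustrative assumptions, not the authors' design.

```python
import numpy as np

def forward(x, weights, threshold):
    """Main net: binary neuron output with an explicit threshold term."""
    return float(weights @ x - threshold > 0)

def adjust_threshold(threshold, x, weights, rate=0.1):
    """Supportive net: shift the threshold toward the current activation level."""
    activation = weights @ x
    return threshold + rate * (activation - threshold)

weights = np.array([0.4, 0.7, -0.2])
threshold = 0.5
for x in (np.array([1.0, 0.2, 0.1]), np.array([0.9, 0.9, 0.0])):   # input variations
    threshold = adjust_threshold(threshold, x, weights)            # dynamic tuning
    print(forward(x, weights, threshold), round(threshold, 3))
```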

Keywords: Classification, Recognition, Neural Networks, Pattern Recognition, Generalization.

322 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the detection and segmentation task challenging. Although a number of open-source software tools and artificial intelligence (AI) methods have been designed for analyzing mitochondrial images, the scarcity of the combined medical and AI expertise required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV libraries, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using a fluorescence mitochondrial dataset. Ground truth labels generated using Labkit were also used to evaluate the performance of our detection and segmentation model using precision, recall and the Rand index. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion on the methods and future perspectives of fully automated frameworks concludes the paper.
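
A minimal sketch of the three-stage Python/OpenCV pipeline described above (pre-processing, binarization, coarse segmentation); the parameter values and file name are illustrative assumptions, not the exact framework.

```python
import cv2

img = cv2.imread("mito_frame.png", cv2.IMREAD_GRAYSCALE)      # hypothetical fluorescence frame

# 1) Pre-processing: contrast-limited adaptive histogram equalization and denoising
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
pre = cv2.GaussianBlur(clahe.apply(img), (5, 5), 0)

# 2) Binarization: Otsu threshold separates bright mitochondria from background
_, binary = cv2.threshold(pre, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3) Coarse segmentation: contours of connected bright regions, small speckles dropped
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
mitochondria = [c for c in contours if cv2.contourArea(c) > 20]
print(f"{len(mitochondria)} candidate mitochondria detected")
```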

Keywords: 2D, Binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation.

321 The Impact of Digital Inclusive Finance on the High-Quality Development of China's Export Trade

Authors: Yao Wu

Abstract:

In the context of financial globalization, China has put forward the policy goal of high-quality development, and the digital economy, with its advantage of information resources, is driving China's export trade to achieve high-quality development. Due to the long-standing financing constraints of small and medium-sized export enterprises, how to expand the export scale of small and medium-sized enterprises has become a major threshold for the development of China's export trade. This paper firstly adopts the hierarchical analysis method to establish the evaluation system of high-quality development of China's export trade; secondly, the panel data of 30 provinces in China from 2011 to 2018 are selected for empirical analysis to establish the impact model of digital inclusive finance on the high-quality development of China's export trade; based on the analysis of the heterogeneous enterprise trade model, a mediating effect model is established to verify the mediating role of credit constraint in the development of high-quality export trade in China. Based on the above analysis, this paper concludes that inclusive digital finance, with its unique digital and inclusive nature, alleviates the credit constraint problem among SMEs, enhances the binary marginal effect of SMEs' exports, optimizes their export scale and structure, and promotes the high-quality development of regional and even national export trade. Finally, based on the findings of this paper, we propose insights and suggestions for inclusive digital finance to promote the high-quality development of export trade.

Keywords: Digital inclusive finance, high-quality development of export trade, fixed effects, binary marginal effects.

320 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to the failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important ones.
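
A minimal sketch of ranking feature importance with the SHAP linear explainer on a trained classifier, as discussed above; the model type, synthetic data, and feature count are placeholders, not the paper's setup.

```python
import numpy as np
import shap
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                       # placeholder sensor features
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)       # placeholder failure label

model = LogisticRegression().fit(X, y)

explainer = shap.LinearExplainer(model, X)          # linear explainer, as named in the abstract
shap_values = explainer.shap_values(X)

importance = np.abs(shap_values).mean(axis=0)       # global importance per feature
print(np.argsort(importance)[::-1])                 # features ranked by impact
```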

Keywords: Automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation.

319 Novel Adaptive Channel Equalization Algorithms by Statistical Sampling

Authors: János Levendovszky, András Oláh

Abstract:

In this paper, novel statistical sampling based equalization techniques and CNN based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interference which severely deteriorates the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods to combat interference by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients, providing higher performance than traditional Minimum Mean Square Error equalization. Since the calculation of BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate the gradient, which yields fast equalization and performance superior to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds, and a simple mechanism is derived to identify the dominant samples in real time for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER. The near-optimal performance of the new algorithms is demonstrated by extensive simulations. The paper also develops a Cellular Neural Network (CNN) based approach to detection, in which fast quadratic optimization is carried out by the CNN, whereas the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of the method has also been analyzed by simulations.
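
A minimal sketch of estimating the BER of a linear equalizer by statistical sampling, the quantity the proposed algorithms minimize; the channel taps, noise level, and plain Monte Carlo sampling (rather than stratified sampling with the Li-Silvester bounds) are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_ber(w, channel, snr_db=10.0, n_samples=20000):
    """Monte Carlo estimate of the bit error rate for equalizer coefficients w."""
    bits = rng.integers(0, 2, n_samples) * 2 - 1                 # BPSK symbols +/-1
    rx = np.convolve(bits, channel)[:n_samples]                  # causal channel with ISI
    rx = rx + rng.normal(scale=10 ** (-snr_db / 20), size=n_samples)
    eq = np.convolve(rx, w)[:n_samples]                          # linear equalizer output
    return np.mean(np.sign(eq) != bits)

channel = np.array([1.0, 0.4, 0.2])                              # illustrative channel taps
print(estimate_ber(np.array([1.0, -0.3, 0.05]), channel))        # BER for one coefficient set
```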

Keywords: Cellular Neural Network, channel equalization, communication over fading channels, multiuser communication, spectral efficiency, statistical sampling.

318 A Questionnaire-Based Survey: Therapist’s Response towards the Upper Limb Disorder Learning Tool

Authors: Noor Ayuni Che Zakaria, Takashi Komeda, Cheng Yee Low, Kaoru Inoue, Fazah Akhtar Hanapiah

Abstract:

Previous studies have shown that there are arguments regarding the reliability and validity of the Ashworth and Modified Ashworth Scales for evaluating patients diagnosed with upper limb disorders, as these evaluations depend on the raters’ experience. This motivated us to develop an upper limb disorder part-task trainer that is able to simulate consistent upper limb disorder signs, such as spasticity and rigidity, based on the Modified Ashworth Scale, in order to reduce the variability occurring between raters and within raters themselves. By providing consistent signs, novice therapists would be able to increase their training frequency and exposure to the various levels of signs. A total of 22 physiotherapists and occupational therapists participated in the study. The majority of the therapists agreed that, with current therapy education, they still face problems with inter-rater and intra-rater variability (strongly agree 54%; n = 12/22, agree 27%; n = 6/22) in evaluating patients’ conditions. The therapists strongly agreed (72%; n = 16/22) that therapy trainees need to increase their frequency of training, and therefore believe that our initiative to develop an upper limb disorder training tool will help improve clinical education (strongly agree and agree 63%; n = 14/22).

Keywords: Upper limb disorders, Clinical education tool, Inter/intra-raters variability, Spasticity, Modified Ashworth Scale.

317 Bayes Net Classifiers for Prediction of Renal Graft Status and Survival Period

Authors: Jiakai Li, Gursel Serpen, Steven Selman, Matt Franchetti, Mike Riesen, Cynthia Schneider

Abstract:

This paper presents the development of a Bayesian belief network classifier for the prediction of graft status and survival period in renal transplantation using patient profile information prior to the transplantation. The objective was to explore the feasibility of developing a decision making tool for identifying the most suitable recipient among the candidate pool members. The dataset was compiled from University of Toledo Medical Center Hospital patients as reported to the United Network for Organ Sharing, and had 1228 patient records for the period covering 1987 through 2009. The Bayes net classifiers were developed using the Weka machine learning software workbench. Two separate classifiers were induced from the data set: one to predict the status of the graft as either failed or living, and a second classifier to predict the graft survival period. The classifier for graft status prediction performed very well, with a prediction accuracy of 97.8% and true positive rates of 0.967 and 0.988 for the living and failed classes, respectively. The second classifier, for the graft survival period, yielded a prediction accuracy of 68.2% and a true positive rate of 0.85 for the class representing those instances with kidneys failing during the first year following transplantation. Simulation results indicated that it is feasible to develop a successful Bayesian belief network classifier for prediction of graft status, but not the graft survival period, using the information in the UNOS database.

Keywords: Bayesian network classifier, renal transplantation, graft survival period, United Network for Organ Sharing

316 Numerical and Experimental Analyses of a Semi-Active Pendulum Tuned Mass Damper

Authors: H. Juma, F. Al-hujaili, R. Kashani

Abstract:

Modern structures such as floor systems, pedestrian bridges and high-rise buildings have become lighter in mass and more flexible, with negligible damping, and thus prone to vibration. In this paper, a semi-actively controlled pendulum tuned mass damper (PTMD) is presented that uses air springs as both the restoring (resilient) and energy dissipating (damping) elements; the tuned mass damper (TMD) uses no passive dampers. The proposed PTMD can readily be fine-tuned and re-tuned, via software, without changing any hardware. Almost all existing semi-active systems have the three elements that passive TMDs have, i.e., inertia, resilient, and dissipative elements, with some adjustability built into one or two of these elements. The proposed semi-active air suspended TMD, on the other hand, is made up of only inertia and resilience elements. A notable feature of this TMD is the absence of a physical damping element in its make-up. The required viscous damping is introduced into the TMD using a semi-active control scheme residing in a micro-controller, which actuates a high-speed proportional valve regulating the flow of air in and out of the air springs. In addition to introducing damping into the TMD, the semi-active control scheme adjusts the stiffness of the TMD. The focus of this work has been the synthesis and analysis of the control algorithms and strategies that vary the tuning accuracy, introduce damping into the air suspended PTMD, and enable the PTMD to tune itself. The accelerations of the main structure and the PTMD, as well as the pressure in the air springs, are used as feedback signals in the control strategies. Numerical simulation and experimental evaluation of the proposed tuned damping system are presented in this paper.
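
For orientation, the classical passive-TMD tuning rules (Den Hartog) that a semi-active scheme of this kind can adjust around are, for a mass ratio μ = m_TMD / m_structure:

\[
\frac{f_{TMD}}{f_{s}} = \frac{1}{1+\mu}, \qquad \zeta_{opt} = \sqrt{\frac{3\mu}{8(1+\mu)^{3}}}
\]

For an ideal simple pendulum the TMD frequency follows from the effective length, f_TMD = (1/2π)√(g/L); with air springs the effective stiffness also enters, which is what the software retuning adjusts. These classical formulas are given for context only and are not taken from the paper.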

Keywords: Tuned mass damper, air spring, semi-active, vibration control.

315 Cellular Automata Based Robust Watermarking Architecture towards the VLSI Realization

Authors: V. H. Mankar, T. S. Das, S. K. Sarkar

Abstract:

In this paper, we propose a novel blind watermarking architecture aimed at hardware implementation in VLSI. In order to facilitate this hardware realization, the cellular automata (CA) concept is introduced. CA have already been accepted as an attractive structure for VLSI implementation because of their modularity, parallelism, high performance and reliability. Hardware-realizable multiresolution spread spectrum watermarking techniques are few in number in spite of their excellent resiliency against signal impairments, because of the computational cost and complexity associated with their filter banks and lifting techniques. Cellular automata theory is therefore incorporated to form a new transform domain technique, the Cellular Automata Transform (CAT). Since CA provide spreading sequences with very low cross-correlation properties, a CA based pseudorandom sequence generator is considered in the present work. Considering the watermarking technique as a digital communication process, error control coding (ECC) must be incorporated in the data hiding scheme. Besides the hardware implementation of the entire CA based data hiding technique, the individual CA based blocks of the algorithm provide better results than some other methods, irrespective of hardware or software implementation. The Cellular Automata Transform, the CA based PN sequence generator, and the CA ECC are the requisite blocks, developed not only to meet the reliable hardware requirements but also to provide the basic spread spectrum watermarking features. The proposed algorithm shows statistical invisibility and resiliency against various common signal-processing operations, and its design utilizes the allocated bandwidth in the data transmission channel in a more efficient manner.
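
A minimal sketch of a CA based pseudorandom (PN) sequence generator of the kind referred to above, using an elementary rule-30 automaton; the rule, register width, and tap position are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def ca_pn_sequence(length, width=64, seed=1):
    """Generate one PN bit per CA step by sampling the centre cell (rule 30 update)."""
    state = np.zeros(width, dtype=np.uint8)
    state[width // 2] = seed                      # single-seed initial configuration
    bits = []
    for _ in range(length):
        bits.append(int(state[width // 2]))       # tap the centre cell
        left = np.roll(state, 1)                  # neighbour to the left of each cell
        right = np.roll(state, -1)                # neighbour to the right of each cell
        state = (left ^ (state | right)) & 1      # rule 30: a' = left XOR (centre OR right)
    return np.array(bits)

pn = ca_pn_sequence(32)
print(pn)
```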

Keywords: Cellular automata, watermarking, error control coding, PN sequence, VLSI.

314 Stabilization of Transition Metal Chromite Nanoparticles in Silica Matrix

Authors: Jiri Plocek, Petr Holec, Simona Kubickova, Barbara Pacakova, Irena Matulkova, Alice Mantlikova, Ivan Nemec, Daniel Niznansky, Jana Vejpravova

Abstract:

This article presents a summary of the preparation and characterization of zinc, copper, cadmium and cobalt chromite nanocrystals embedded in an amorphous silica matrix. The ZnCr2O4/SiO2, CuCr2O4/SiO2, CdCr2O4/SiO2 and CoCr2O4/SiO2 nanocomposites were prepared by a conventional sol-gel method under acid catalysis. Final heat treatment of the samples was carried out at temperatures in the range of 900-1200 °C to adjust the phase composition and the crystallite size, respectively. The resulting samples were characterized by powder X-ray diffraction (PXRD), high resolution transmission electron microscopy (HRTEM), Raman/FTIR spectroscopy and magnetic measurements. Formation of the spinel phase was confirmed in all samples. The average size of the nanocrystals was determined from the PXRD data and by direct particle size observation with HRTEM; both results were correlated. The mean particle size (reviewed by HRTEM) was in the range from ~4 to 46 nm. The results showed that the sol-gel method can be effectively used for the preparation of spinel chromite nanoparticles embedded in a silica matrix, and that the particle size is driven by the type of the A2+ cation in the spinel structure and the temperature of the final heat treatment. Magnetic properties of the nanocrystals were found to be only moderately modified in comparison to the bulk phases.
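
Crystallite sizes from PXRD line broadening, as reported above, are commonly estimated with the Scherrer equation (Rietveld refinement, listed in the keywords, is the more rigorous route); the abstract does not state the exact method, so this is given for orientation only:

\[
D = \frac{K\lambda}{\beta\cos\theta}
\]

where D is the mean crystallite size, K ≈ 0.9 a shape factor, λ the X-ray wavelength, β the peak width (FWHM, in radians) after instrumental correction, and θ the Bragg angle.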

Keywords: Chromite, Fourier transform infrared spectroscopy, magnetic properties, nanocomposites, Raman spectroscopy, Rietveld refinement, sol-gel method, spinel.

313 Gender Differences in Biology Academic Performances among Foundation Students of PERMATApintar® National Gifted Center

Authors: N. Nor Azman, M. F. Kamarudin, S. I. Ong, N. Maaulot

Abstract:

PERMATApintar® National Gifted Center is, to the best of the authors’ knowledge, the first center in Malaysia that provides a platform for talented Malaysian students with high thinking ability. The center has built a biology teaching and learning curriculum that suits the ability of these gifted students; the level of the PERMATApintar® biology curriculum is essentially higher than that of the national biology curriculum. The foundation students are exposed to the PERMATApintar® biology curriculum from as early as 11 years old, and the center practices a four-time-a-year examination system to monitor the students’ academic performance. Generally, male students show little or no interest in biology compared to female students. This study investigates the association between students’ gender and their academic performance in biology examinations. The scores of 39 students in twelve sets of biology examinations over 3 years were collected and analyzed statistically. Based on the analysis, there are no significant differences between male and female students in biology academic performance at a significance level of p = 0.05, indicating that gender is not associated with biology examination scores among the students. Another result showed that the average score for male students was higher than that for female students. Future research can compare biology academic achievement in the Malaysian national examination (Sijil Pelajaran Malaysia, SPM) between Foundation 3 students (Grade 9) and Level 2 students (Grade 11) who followed a similar PERMATApintar® biology curriculum.

Keywords: Academic performances, biology, gender differences, gifted students.
