Search results for: platform specific model (PSM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9180

8130 Adequacy of Advanced Earthquake Intensity Measures for Estimation of Damage under Seismic Excitation with Arbitrary Orientation

Authors: Konstantinos G. Kostinakis, Manthos K. Papadopoulos, Asimina M. Athanatopoulou

Abstract:

An important area of research in seismic risk analysis is the evaluation of the expected seismic damage of structures under a specific earthquake ground motion. Several conventional intensity measures of ground motion have been used to estimate the damage potential of ground motions, yet none has proved able to adequately predict the seismic damage of an arbitrary structural system. Alternative advanced intensity measures, which take into account not only ground motion characteristics but also structural information, have therefore been proposed. The present paper investigates the adequacy of a number of advanced earthquake intensity measures for predicting the structural damage of 3D R/C buildings under seismic excitation striking the building at an arbitrary incident angle. To this end, a plan-symmetric and a plan-asymmetric 5-story R/C building are studied. The two buildings are subjected to 20 bidirectional earthquake ground motions, with the two horizontal accelerograms of each ground motion applied along horizontal orthogonal axes forming 72 different angles with the structural axes. The response is computed by non-linear time history analysis, and the structural damage is expressed in terms of the maximum interstory drift as well as the overall structural damage index. The values of these seismic damage measures determined for an incident angle of 0°, as well as their maximum values over all seismic incident angles, are correlated with 9 structure-specific ground motion intensity measures. The research identified certain intensity measures that exhibited strong correlation with the seismic damage of the two buildings; however, their adequacy for estimating structural damage depends on the response parameter adopted. Furthermore, it was confirmed that the widely used spectral acceleration at the fundamental period of the structure is a good indicator of the expected earthquake damage level.
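The correlation step described above can be sketched with a plain Pearson coefficient between an intensity measure and a damage measure across a record suite. The numbers below are hypothetical illustrations, not the paper's data:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical values: spectral acceleration Sa(T1) in g for five records,
# and the corresponding maximum interstory drift ratios in %.
sa_t1 = [0.12, 0.35, 0.48, 0.61, 0.90]
drift = [0.20, 0.55, 0.70, 0.95, 1.40]
r = pearson(sa_t1, drift)  # strong positive correlation for this toy data
```

A value of r close to 1 for a given intensity measure would mark it as a good damage predictor for the chosen response parameter.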

Keywords: damage indices, non-linear response, seismic excitation angle, structure-specific intensity measures

Procedia PDF Downloads 484
8129 Enhancing Anode Performance in Li-S Batteries via Coating with Waste Battery-Derived Materials

Authors: Mohsen Hajian Foroushani, Samane Maroufi, Rasoul Khayyam Nekouei, Veena Sahajwalla

Abstract:

Lithium (Li) metal possesses outstanding characteristics, with the highest specific capacity (3860 mAh g⁻¹) and the lowest electrochemical potential (−3.04 V vs. SHE) among available metal anodes. The combination of Li with sulfur, which offers a specific capacity of 1670 mAh g⁻¹, positions Li–S batteries (LSBs) as highly promising contenders for the next generation of high-energy-density batteries. However, full commercialization of LSBs depends on addressing several challenges inherent to these batteries. One of the most formidable hurdles is the widespread issue of Li dendrite nucleation and growth on the anode surface, stemming from the inherent instability of the solid electrolyte interphase (SEI) layer. In this study, we employed a Zn-based coating derived from waste materials, significantly enhancing the performance of the symmetrical cell across various current densities. The applied coating not only improved the cyclability of the cell by more than fourfold but also reduced the charge transfer resistance from over 300 Ω to less than 10 Ω before cycling. Examination of SEM micrographs of both samples revealed the successful suppression of Li dendrites by the applied coating.

Keywords: Li-S batteries, Li dendrite, sustainability, Li anode

Procedia PDF Downloads 56
8128 FPGA Implementation of RSA Encryption Algorithm for E-Passport Application

Authors: Khaled Shehata, Hanady Hussien, Sara Yehia

Abstract:

Securing the data stored on an e-passport is a very important issue. The RSA encryption algorithm is suitable for such an application with a small data size. In this paper, the design and implementation of a 1024-bit-key RSA encryption and decryption module on an FPGA are presented. The module is verified by comparing its results with those obtained from MATLAB. The design runs at a frequency of 36.3 MHz on a Virtex-5 Xilinx FPGA. The key size is set to 1024 bits to achieve high security for the passport information. The whole design is captured in VHDL, which makes it portable and retargetable to any hardware platform.
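The arithmetic kernel at the heart of such a module is modular exponentiation, typically realized in hardware as binary square-and-multiply. A minimal software sketch using the textbook toy keypair (p = 61, q = 53), far smaller than the paper's 1024-bit keys:

```python
def mod_exp(base, exp, mod):
    """Right-to-left binary square-and-multiply modular exponentiation,
    the core operation of RSA encryption and decryption."""
    result = 1
    base %= mod
    while exp:
        if exp & 1:                      # current exponent bit is 1
            result = (result * base) % mod
        base = (base * base) % mod       # square at every step
        exp >>= 1
    return result

# Toy RSA keypair: n = 61 * 53 = 3233, e = 17, d = 2753 (illustration only).
n, e, d = 3233, 17, 2753
msg = 65
cipher = mod_exp(msg, e, n)   # encrypt
plain = mod_exp(cipher, d, n) # decrypt recovers the message
```

In an FPGA realization each modular multiplication would itself be a pipelined modular-multiplier instance; the loop structure is the same.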

Keywords: RSA, VHDL, FPGA, modular multiplication, modular exponentiation

Procedia PDF Downloads 375
8127 Cognitive eTransformation Framework for Education Sector

Authors: A. Hol

Abstract:

The 21st century has brought waves of business and industry eTransformations, and the impact of this change is also being seen in education. To identify its extent, a scenario analysis methodology was utilised with the aim of assessing business transformations across industry sectors, ranging from craftsmanship, medicine, finance and manufacturing to innovations and adoptions of new technologies and business models. Firstly, scenarios were drafted based on current eTransformation models and their dimensions. An eTransformation framework was then utilised to derive the key eTransformation parameters, the essential characteristics that have enabled eTransformations across the sectors. The identified key parameters were subsequently mapped to the transforming domain: education. The mapping assisted in deriving a cognitive eTransformation framework for the education sector. The framework highlights the importance of context and the notion that education today needs not only to deliver content to students but also to meet the dynamically changing demands of specific student and industry groups. Furthermore, it pinpoints that for such processes to be supported, specific technology is required, so that instant, on-demand and periodic feedback, as well as flexible, dynamically expanding study content, can be sought and received via multiple education mediums.

Keywords: education sector, business transformation, eTransformation model, cognitive model, cognitive systems, eTransformation

Procedia PDF Downloads 122
8126 MB-SLAM: A SLAM Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To use SLAM effectively for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM to BIM can provide essential insights for construction managers to identify construction deficiencies in real-time and ultimately reduce rework. Registering SLAM to BIM in real-time can also boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with the BIM in real-time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real-time. The framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real-time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views from the keyframes' views. The calculated poses are later improved by a real-time gradient descent-based iterative method. Two case studies were presented to validate MB-SLAM. The validation process demonstrated promising results, accurately registering SLAM to BIM and significantly improving the SLAM's localization accuracy.
Moreover, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework for both research and commercial usage, which aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of SLAM by providing a rough 3D model of the environment. MB-SLAM thus further advances SLAM toward practical use.
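The gradient-descent pose refinement can be illustrated in a single degree of freedom. The sketch below is not the authors' implementation: it recovers a yaw angle by minimizing the squared misalignment between model direction vectors (e.g., vanishing directions from the BIM) and the directions observed in a keyframe:

```python
import math

def refine_yaw(model_dirs, observed_dirs, lr=0.1, iters=500):
    """Gradient descent on a single yaw angle so that rotated model
    direction vectors align with observed ones: a 1-DoF stand-in for
    iterative pose refinement against BIM perspective features."""
    theta = 0.0
    for _ in range(iters):
        grad = 0.0
        for (mx, my), (ox, oy) in zip(model_dirs, observed_dirs):
            c, s = math.cos(theta), math.sin(theta)
            rx, ry = c * mx - s * my, s * mx + c * my      # rotated model dir
            drx, dry = -s * mx - c * my, c * mx - s * my   # d(rx,ry)/d(theta)
            # gradient of 0.5 * ((rx-ox)^2 + (ry-oy)^2) w.r.t. theta
            grad += (rx - ox) * drx + (ry - oy) * dry
        theta -= lr * grad
    return theta

# Synthetic test: observations are the model directions rotated by 0.3 rad.
true_theta = 0.3
dirs = [(1.0, 0.0), (0.0, 1.0), (0.7, 0.7)]
obs = [(math.cos(true_theta) * x - math.sin(true_theta) * y,
        math.sin(true_theta) * x + math.cos(true_theta) * y) for x, y in dirs]
est = refine_yaw(dirs, obs)
```

A full 6-DoF refinement would descend on translation and rotation jointly, but the iterative structure is the same.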

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 203
8125 Bending Effect on POF Splitter Performance for Different Thickness of Fiber Cores

Authors: L. S. Supian, Mohd Syuhaimi Ab-Rahman, Norhana Arsad

Abstract:

An experimental study has been carried out to characterize the performance of polymer optical fiber splitters when different bending radii are applied to splitters with different fiber cores. Splitters with different core pairs are attached successively to a splitter platform of ellipse-shaped geometrical blocks of several bending radii. A force is exerted upon the blocks, and thus the splitter, in order to encourage the splitting of energy between the two fibers. The aim of this study is to investigate which fiber core pair gives the optimum performance for each bending radius in order to develop an effective splitter.

Keywords: splitter, macro-bending, cores, geometrical blocks

Procedia PDF Downloads 654
8124 Investigation of the Properties of Biochar Obtained by Dry and Wet Torrefaction in a Fixed and in a Fluidized Bed

Authors: Natalia Muratova, Dmitry Klimov, Rafail Isemin, Sergey Kuzmin, Aleksandr Mikhalev, Oleg Milovanov

Abstract:

We investigated the processing of poultry litter into biochar using dry torrefaction (DT) in a fixed and a fluidized bed of quartz sand blown with nitrogen, as well as wet torrefaction (WT) in a fluidized bed in a water steam medium at a temperature of 300 °C. The torrefaction technology affects the duration of the heat treatment process and the characteristics of the biochar: the release of CO₂, CO, H₂ and CH₄ from a portion of fresh poultry litter is complete after 2400 seconds of torrefaction in a fixed bed, but after only 480 seconds in a fluidized bed. During WT in a fluidized bed of quartz sand, this process ends 840 seconds after loading a portion of fresh litter, but in a fluidized bed of litter particles previously subjected to torrefaction, it ends in 350-450 seconds. In terms of the H/C and O/C ratios, the litter obtained after DT and WT treatment corresponds to lignite. WT in a fluidized bed yields biochar whose specific pore area is twice as large as that of biochar obtained after DT in a fluidized bed. Biochar obtained by treating poultry litter in a fluidized bed using the DT or WT method is recommended for use not only as a biofuel but also as an adsorbent or a soil fertilizer.
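The H/C and O/C classification above is read off a van Krevelen diagram from atomic (not mass) ratios. A small sketch of the conversion; the elemental mass fractions below are hypothetical, not the paper's measurements:

```python
def atomic_ratios(c_wt, h_wt, o_wt):
    """Atomic H/C and O/C ratios from elemental mass fractions (wt%),
    as plotted on a van Krevelen diagram to classify torrefied biomass.
    Divides each mass fraction by the element's molar mass."""
    C = c_wt / 12.011
    H = h_wt / 1.008
    O = o_wt / 15.999
    return H / C, O / C

# Hypothetical elemental analysis of a torrefied poultry litter sample (wt%).
h_c, o_c = atomic_ratios(c_wt=55.0, h_wt=4.5, o_wt=20.0)
```

For these illustrative fractions the point (O/C ≈ 0.27, H/C ≈ 0.97) falls in the region typically associated with lignite.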

Keywords: biochar, poultry litter, dry and wet torrefaction, fixed bed, fluidized bed

Procedia PDF Downloads 136
8123 Environment-Related Mortality Rates through Artificial Intelligence Tools

Authors: Stamatis Zoras, Vasilis Evagelopoulos, Theodoros Staurakas

Abstract:

The association between elevated air pollution levels, extreme climate conditions (temperature, particulate matter, ozone levels, etc.) and health consequences has recently been the focus of a significant number of studies. The association varies depending on the time of year, whether during hot or cold periods, and is most pronounced when extreme air pollution and weather events are observed, e.g., air pollution episodes and persistent heatwaves. It also varies spatially, since air quality and climate extremes affect human health differently in metropolitan and rural areas: a given pollutant concentration or climate extreme has a different form of impact in the countryside than in the urban environment. In the built environment, the effects of climate extremes are driven through the local microclimate, which must be studied more thoroughly. Variables such as biology and age group may be implicated by different environmental factors, such as increased air pollution and noise levels and the overheating of buildings, in comparison to rural areas. Gridded air quality and climate variables derived from the land surface observation network of West Macedonia in Greece will be analysed against mortality data in a spatial format for the region. Artificial intelligence (AI) tools will be used for data correction and for predicting health deterioration under given climatic conditions and air pollution at the local scale; this will reveal the implications of the built environment relative to the countryside. The air pollution and climatic data have been collected from meteorological stations and span the period from 2000 to 2009. These will be projected against the mortality rate data in daily, monthly, seasonal and annual grids.
The grids will operate as AI-based warning models for decision makers, mapping health conditions in rural and urban areas to improve the awareness of the healthcare system by taking into account the predicted changing climate conditions. Gridded climate conditions and air quality levels will be presented against mortality rates through AI-analysed gridded indicators of the implicated variables. An AI-based gridded warning platform at local scales is then developed as a future system-awareness platform at the regional level.
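The gridding step that precedes any such analysis can be sketched as aggregating daily station observations into monthly means per grid cell. The records below are hypothetical; the study's actual pipeline and data are not reproduced here:

```python
from collections import defaultdict
from datetime import date

def monthly_grid(daily_records):
    """Aggregate (date, grid_cell, value) observations into monthly
    means per cell, keyed by (year, month, cell)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for day, cell, value in daily_records:
        key = (day.year, day.month, cell)
        sums[key] += value
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Hypothetical PM10 readings (µg/m³) for one grid cell.
records = [
    (date(2005, 1, 1), "cell_A", 40.0),
    (date(2005, 1, 2), "cell_A", 60.0),
    (date(2005, 2, 1), "cell_A", 30.0),
]
grid = monthly_grid(records)  # {(2005, 1, "cell_A"): 50.0, (2005, 2, "cell_A"): 30.0}
```

Seasonal and annual grids follow the same pattern with a coarser key.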

Keywords: air quality, artificial intelligence, climatic conditions, mortality

Procedia PDF Downloads 95
8122 Effectiveness of Impairment Specified Muscle Strengthening Programme in a Group of Disabled Athletes

Authors: A. L. I. Prasanna, E. Liyanage, S. A. Rajaratne, K. P. A. P. Kariyawasam, A. A. J. Rajaratne

Abstract:

Maintaining or improving the muscle strength of the injured body part is essential to optimize performance among disabled athletes. General conditioning and strengthening exercises might be ineffective if they are not sufficiently intense or targeted at each participant's specific impairment. Specific strengthening programmes, targeted at the affected body part, are essential to improve the strength of impaired muscles, and the increase in strength helps reduce the impact of disability. Methods: The muscle strength of the hip, knee and ankle joints was assessed in a group of randomly selected disabled athletes, using the Medical Research Council (MRC) grading. Those having muscle strength of grade 4 or less were selected for this study (24 in number) and were given a custom-made exercise programme designed to strengthen their hip, knee or ankle joint musculature, according to the muscle or group of muscles affected. The effectiveness of the strengthening programme was assessed after a period of 3 months. Results: Statistical analysis was done using the Minitab 16 statistical software. A Mann-Whitney U test was used to compare the strength of each muscle group before and after the exercise programme. A significant difference was observed after the three-month strengthening programme for knee flexors (left and right) (P = 0.0889, 0.0312), hip flexors (left and right) (P = 0.0312, 0.0466), hip extensors (left and right) (P = 0.0478, 0.0513), ankle plantar flexors (left and right) (P = 0.0466, 0.0423) and right ankle dorsiflexors (P = 0.0337). No significant difference in strength was observed after the strengthening programme in the knee extensors (left and right), hip abductors (left and right) and left ankle dorsiflexors. Conclusion: Impairment-specific exercise programmes appear to be beneficial for disabled athletes, significantly improving the muscle strength of the affected joints.
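The Mann-Whitney U statistic the paper relies on can be computed by hand on ordinal MRC grades using midranks for ties. The grades below are illustrative, not the study's data:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for two independent samples,
    using midranks (average ranks) for tied values."""
    pooled = sorted(x + y)

    def midrank(v):
        lo = pooled.index(v)            # first 0-based position of v
        hi = lo + pooled.count(v) - 1   # last 0-based position of v
        return (lo + hi) / 2 + 1        # average rank, 1-based

    r1 = sum(midrank(v) for v in x)     # rank sum of the first sample
    n1, n2 = len(x), len(y)
    u1 = r1 - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)        # smaller of U1 and U2

# Illustrative MRC grades for one muscle group before and after training.
pre, post = [3, 3, 4, 4], [4, 5, 5, 5]
u = mann_whitney_u(pre, post)
```

The p-value is then obtained from the U distribution (exact tables for small samples, or a normal approximation), which statistical packages such as Minitab handle internally.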

Keywords: muscle strengthening programme, disabled athletes, physiotherapy, rehabilitation sciences

Procedia PDF Downloads 337
8121 Carbon Footprint Assessment Initiative and Trees: Role in Reducing Emissions

Authors: Omar Alelweet

Abstract:

Carbon emissions are quantified in terms of carbon dioxide equivalents, generated through a specific activity or accumulated throughout the life stages of a product or service. Given the growing concern about climate change and the role of carbon dioxide emissions in global warming, this initiative aims to create awareness and understanding of the impact of human activities and identify potential areas for improvement regarding the management of the carbon footprint on campus. Given that trees play a vital role in reducing carbon emissions by absorbing CO₂ during the photosynthesis process, this paper evaluated the contribution of each tree to reducing those emissions. Collecting data over an extended period of time is essential to monitoring carbon dioxide levels. This will help capture changes at different times and identify any patterns or trends in the data. By linking the data to specific activities, events, or environmental factors, it is possible to identify sources of emissions and areas where carbon dioxide levels are rising. Analyzing the collected data can provide valuable insights into ways to reduce emissions and mitigate the impact of climate change.

Keywords: sustainability, green building, environmental impact, CO₂

Procedia PDF Downloads 43
8120 Modeling and Optimization of Algae Oil Extraction Using Response Surface Methodology

Authors: I. F. Ejim, F. L. Kamen

Abstract:

Aims: In this experiment, algae oil extraction with a combination of n-hexane and ethanol was investigated. The effects of extraction solvent concentration, extraction time and temperature on the yield and quality of oil were studied using Response Surface Methodology (RSM). Experimental Design: Optimization of algae oil extraction using a Box-Behnken design generated 17 experimental runs in a three-factor, three-level design in which oil yield, specific gravity, acid value and saponification value were evaluated as the responses. Result: A minimum oil yield of 17% and a maximum of 44% were realized. The optimum values for yield, specific gravity, acid value and saponification value from the overlay plot were 40.79%, 0.8788, 0.5056 mg KOH/g and 180.78 mg KOH/g, respectively, with a desirability of 0.801. The maximum point prediction was a yield of 40.79% at a solvent concentration of 66.68% n-hexane, a temperature of 40.0°C and an extraction time of 4 hrs. Analysis of Variance (ANOVA) results showed that the linear and quadratic coefficients were all significant at p<0.05. The experiment was validated, and the results obtained agreed with the predicted values. Conclusion: Algae oil extraction was successfully optimized using RSM, and the quality of the oil indicates it is suitable for many industrial uses.
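The 17-run, three-factor, three-level layout is the standard Box-Behnken construction: every pair of factors at the ±1 corners with the remaining factor at the centre, plus replicated centre points. A sketch in coded units (the choice of 5 centre runs is the common default that yields 17 runs for 3 factors):

```python
from itertools import combinations

def box_behnken(n_factors, n_center=5):
    """Coded Box-Behnken design: for each factor pair, all four (+/-1)
    combinations with the remaining factors held at 0, plus centre runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

design = box_behnken(3)  # 3 pairs * 4 corners + 5 centre points = 17 runs
```

A quadratic response surface is then fitted to the measured responses at these 17 settings and optimized via the desirability function.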

Keywords: algae oil, response surface methodology, optimization, Box-Behnken, extraction

Procedia PDF Downloads 315
8119 Spare Part Inventory Optimization Policy: A Literature Study

Authors: Zukhrof Romadhon, Nani Kurniati

Abstract:

The availability of spare parts is critical to support maintenance tasks and the production system. Managing spare part inventory involves several parameters and objective functions, as well as the tradeoff between inventory costs and spare part availability. Several mathematical models and methods have been developed to optimize spare part policy, and the many optimization models already proposed need to be reviewed to identify other potential models. This work presents a review of pertinent literature on spare part inventory optimization and analyzes the gaps for future research. An initial investigation of scholarly and journal database systems under specific keywords related to spare parts found about 17K papers. Filtering was conducted based on five main aspects, i.e., replenishment policy, objective function, echelon network, lead time, and model solving, with the additional aspect of part classification. Future topics could be identified based on the number of papers that have not addressed specific aspects, including the joint optimization of spare part inventory and maintenance.

Keywords: spare part, spare part inventory, inventory model, optimization, maintenance

Procedia PDF Downloads 39
8118 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis

Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski

Abstract:

The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, such as climate change, intelligent transport, and immigration studies. These developments call for better methods to deliver raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets that have strict requirements for effective handling of time series. The same approach and methodologies can also be applied to managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support online delivery and analysis of the most important geospatial data sets with national coverage. The main target group of the service consists of the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, a cloud-optimized file structure, and a customized, flexible content access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolutions, spatial subdivision, and value encoding. All the resolution levels are created using an appropriate generalization method, selected depending on the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches. Analysis processes based on interactive visual exploration can be carried out effectively, as the resolution level closest to the visual scale can always be used. In the same way, statistical analysis can be carried out on the resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application.
The cloud service-based approach applied in the GeoCubes Finland repository enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible. The developed GeoCubes API supports this kind of approach for online analysis. The use of cloud-optimized file structures in data storage enables the fast extraction of subareas. The access API allows for the use of vector-formatted administrative areas and user-defined polygons as definitions of subareas for data retrieval. Administrative areas of the country at four levels are readily available from the GeoCubes platform. In addition to direct delivery of raster data, the service also supports a so-called virtual file format, in which only a small text file is first downloaded. The text file contains links to the raster content on the service platform. The actual raster data is downloaded on demand, for the spatial area and resolution level required at each stage of the application. Through the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository helps expert users introduce new kinds of interactive online analysis operations.
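The scale-matching behaviour described above reduces to selecting the stored pyramid level nearest a requested ground resolution. A toy sketch; the level set and the function are assumptions for illustration, not the actual GeoCubes API:

```python
def pick_resolution(levels_m, target_m):
    """Pick the pre-computed resolution level (metres per pixel) closest
    to the scale requested by a visualisation or analysis step."""
    return min(levels_m, key=lambda lv: abs(lv - target_m))

# Hypothetical pyramid of harmonized resolution levels (m/px).
levels = [1, 2, 5, 10, 20, 50, 100, 200, 500, 1000]
level = pick_resolution(levels, 8)  # an 8 m/px request maps to the 10 m level
```

Because every level is pre-generalized, access time stays roughly constant regardless of which level the request lands on.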

Keywords: cloud service, geodata cube, multiresolution, raster geodata

Procedia PDF Downloads 117
8117 Business Intelligence Proposal to Improve Decision Making in Companies Using Google Cloud Platform and Microsoft Power BI

Authors: Joel Vilca Tarazona, Igor Aguilar-Alonso

Abstract:

The problem addressed by this business intelligence research is the lack of a tool that supports automated, efficient financial analysis for decision-making and allows evaluation of the financial statements; as a result, information that is relevant to managers and users as an instrument in financial and administrative decision-making is difficult to access. A business intelligence solution is therefore proposed that will reduce information access time and personnel costs through process automation, built on a 4-layer architecture derived from the literature reviewed under the research methodology.

Keywords: decision making, business intelligence, Google Cloud, Microsoft Power BI

Procedia PDF Downloads 81
8116 The Right to a Fair Trial in French and Spanish Constitutional Law

Authors: Chloe Fauchon

Abstract:

In Europe, the right to a fair trial is enshrined in the European Convention on Human Rights, signed in 1950, in its famous Article 6, and, in the field of the European Union, in Article 47 of the Charter of Fundamental Rights, binding since 2009. The right to a fair trial is, therefore, a fundamental right protected by all the relevant treaties. It is an "umbrella right" which encompasses various sub-rights and principles. Although this right applies in all proceedings, it takes on special relevance in criminal matters and, particularly, with regard to the defendant. In criminal proceedings, the parties are not equal: the accusation is represented by a State organ with specific prerogatives, while the defense does not benefit from these powers and is often inexperienced in criminal law. Equality of arms, and consequently the right to a fair trial, needs specific mechanisms to be effective in criminal proceedings. For instance, the defendant benefits from certain procedural rights, such as the right to a lawyer, the right to be informed of the charges against them, the right to confront witnesses, and so on. These rights aim to give the defendant the tools to dispute the accusation. The role of the defense is, therefore, very important in criminal matters to avoid unjustified convictions. This specificity of criminal matters justifies the focus placed on them in this study. The paper also focuses on the French and Spanish legal orders. Indeed, while the European Court and Convention on Human Rights are the best-known instruments protecting the right to a fair trial, this right is also guaranteed at the constitutional level in European national legal orders. However, this enshrinement differs from one country to the other: in Spain, the right to a fair trial is protected explicitly by the 1978 constitutional text, whereas in France it is more of a case-law construction.
Nevertheless, this difference between the two legal orders does not imply great variation in the substantive aspect of the right to a fair trial. This can be explained in particular by the submission of both States to the European Convention on Human Rights. This work aims to show that, although the French and Spanish legal orders differ in the way they protect the right to a fair trial, this right ultimately has the same substantive meaning in both.

Keywords: right to a fair trial, constitutional law, French law, Spanish law, European Court of Human Rights

Procedia PDF Downloads 48
8115 Technology for Good: Deploying Artificial Intelligence to Analyze Participant Response to Anti-Trafficking Education

Authors: Ray Bryant

Abstract:

3Strands Global Foundation (3SGF), a non-profit with a mission to mobilize communities to combat human trafficking through prevention education and reintegration programs, launched a groundbreaking study that demonstrates the use and benefits of artificial intelligence in the war against human trafficking. Having gathered more than 30,000 stories from counselors and school staff who have gone through its PROTECT Prevention Education program, 3SGF sought to develop a methodology to measure the effectiveness of the training, which helps educators and school staff identify physical signs and behaviors indicating a student is being victimized. The program further illustrates how to recognize and respond to trauma and teaches the steps to take to report human trafficking, as well as how to connect victims with the proper professionals. 3SGF partnered with Levity, a leader in no-code Artificial Intelligence (AI) automation, to create the research study utilizing natural language processing (NLP), a branch of artificial intelligence, to measure the effectiveness of their prevention education program. By applying the logic created for the study, the platform analyzed and categorized each story. If a story, taken directly from the educator, demonstrated one or more of the desired outcomes: Increased Awareness, Increased Knowledge, or Intended Behavior Change, a label was applied, and the system added a confidence level for each identified label. The study results were generated with a 99% confidence level. Preliminary results show that, across the 30,000 stories gathered, it became overwhelmingly clear that a significant majority of the participants now have increased awareness of the issue, demonstrated better knowledge of how to help prevent the crime, and expressed an intention to change how they approach what they do daily.
In addition, approximately 30% of the stories involved comments by educators expressing that they wish they had had this knowledge sooner, as they can think of many students they would have been able to help. Objectives of Research: To solve the problem of needing to analyze and accurately categorize more than 30,000 data points of participant feedback in order to evaluate the success of a human trafficking prevention program by using AI and natural language processing. Methodologies Used: In conjunction with our strategic partner, Levity, we created our own NLP analysis engine specific to our problem. Contributions to Research: The intersection of AI and human rights, and how to utilize technology to combat human trafficking.
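The labelling-with-confidence step can be sketched with a crude keyword-overlap classifier. This is a stand-in only: the study used Levity's no-code NLP platform, and while the three labels match the abstract, the keywords and scoring below are invented for illustration:

```python
def label_story(text):
    """Assign outcome labels to a story with a naive keyword-overlap
    confidence score (fraction of a label's keywords present)."""
    labels = {
        "Increased Awareness": {"aware", "awareness", "realize", "notice"},
        "Increased Knowledge": {"learned", "know", "knowledge", "understand"},
        "Intended Behavior Change": {"will", "change", "intend", "report"},
    }
    words = set(text.lower().split())
    results = {}
    for label, keywords in labels.items():
        hits = len(words & keywords)
        if hits:
            results[label] = hits / len(keywords)  # crude confidence in [0, 1]
    return results

out = label_story("I learned the signs and will now report concerns")
```

A production NLP model would instead score labels from learned text embeddings, but the output shape (labels plus confidence levels) is the same idea.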

Keywords: AI, technology, human trafficking, prevention

Procedia PDF Downloads 46
8114 Environmental Restoration Science in New York Harbor - Community Based Restoration Science Hubs, or “STEM Hubs”

Authors: Lauren B. Birney

Abstract:

The project utilizes the Billion Oyster Project (BOP-CCERS) place-based “restoration through education” model to promote computational thinking in NYC high school teachers and their students. Key learning standards such as Next Generation Science Standards and the NYC CS4All Equity and Excellence initiative are used to develop a computer science curriculum that connects students to their Harbor through hands-on activities based on BOP field science and educational programming. Project curriculum development is grounded in BOP-CCERS restoration science activities and data collection, which are enacted by students and educators at two Restoration Science STEM Hubs or conveyed through virtual materials. New York City Public School teachers with relevant experience are recruited as consultants to provide curriculum assessment and design feedback. The completed curriculum units are then conveyed to NYC high school teachers through professional learning events held at the Pace University campus and led by BOP educators. In addition, Pace University educators execute the Summer STEM Institute, an intensive two-week computational thinking camp centered on applying data analysis tools and methods to BOP-CCERS data. Both qualitative and quantitative analyses were performed throughout the five-year study. STEM+C – Community Based Restoration STEM Hubs. STEM Hubs are active scientific restoration sites capable of hosting school and community groups of all grade levels and professional scientists and researchers conducting long-term restoration ecology research. The STEM Hubs program has grown to include 14 STEM Hubs across all five boroughs of New York City and focuses on bringing in-field monitoring experience as well as coastal classroom experience to students. Restoration Science STEM Hubs activities resulted in: the recruitment of 11 public schools, 6 community groups, 12 teachers, and over 120 students receiving exposure to BOP activities. 
Field science protocols were designed exclusively around the use of the Oyster Restoration Station (ORS), a small-scale in situ experimental platform suspended from a dock or pier. The ORS is intended to be used and “owned” by an individual school, teacher, class, or group of students, whereas the STEM Hub is explicitly designed as a collaborative space for large-scale community-driven restoration work and in-situ experiments. The ORS is also an essential tool in gathering Harbor data from disparate locations and instilling ownership of the research process amongst students. As such, it will continue to be used in that way. New and previously participating students will continue to deploy and monitor their own ORS, uploading data to the digital platform and conducting analysis of their own harbor-wide datasets. Programming the STEM Hub will necessitate establishing working relationships between schools and local research institutions. NYHF will provide introductions and the facilitation of initial workshops in school classrooms. However, once a particular STEM Hub has been established as a space for collaboration, each partner group, school, university, or CBO will schedule its own events at the site using the digital platform’s scheduling and registration tool. Monitoring of research collaborations will be accomplished through the platform’s research publication tool and has thus far provided valuable information on the projects’ trajectory, strategic plan, and pathway.

Keywords: environmental science, citizen science, STEM, technology

Procedia PDF Downloads 78
8113 Corporate Social Media: Understanding the Impact of Service Quality and Social Value on Customer Behavior

Authors: Regina Connolly, Murray Scott, William DeLone

Abstract:

Social media are revolutionary technologies that are transforming the way we communicate, the way we collaborate and the way we influence. Companies are making major investments in platforms such as Facebook and Twitter because they realize that social media are an influential force on customer perceptions and behavior. However, to date there is little guidance on what constitutes an effective deployment of social media, and there is no empirical evidence that social media investments are yielding positive returns. This research develops and validates the components of an effective corporate social media platform in order to examine the impact of effective social media on customer intentions and behavior.

Keywords: service quality, social value, social media, IS success, Web 2.0, customer behaviour

Procedia PDF Downloads 536
8112 Initiation of Paraptosis-Like PCD Pathway in Hepatocellular Carcinoma Cell Line by Hep88 mAb through the Binding of Mortalin (HSPA9) and Alpha-Enolase

Authors: Panadda Rojpibulstit, Suthathip Kittisenachai, Songchan Puthong, Sirikul Manochantr, Pornpen Gamnarai, Sasichai Kangsadalampai, Sittiruk Roytrakul

Abstract:

Hepatocellular carcinoma (HCC) is the most common primary hepatic cancer worldwide. Nowadays, targeted therapy via monoclonal antibodies (mAbs) specific to tumor-associated antigens is continually being developed for HCC treatment. In this regard, after establishing and subsequently exploring the tumoricidal effect of Hep88 mAb on a hepatocellular carcinoma cell line (HepG2), the antigens specific to Hep88 mAb from both the membrane and cytoplasmic fractions of the HepG2 cell line were identified by 2-D gel electrophoresis and western blot analysis. After in-gel digestion and subsequent analysis by liquid chromatography-mass spectrometry (LC-MS), mortalin (HSPA9) and alpha-enolase were identified. The recombinant proteins specific to Hep88 mAb were cloned and expressed in E. coli BL21 (DE3). Moreover, alterations of the HepG2 and Chang liver cell lines after induction by Hep88 mAb for 1-3 days were investigated using a transmission electron microscope. The results demonstrated that Hep88 mAb can bind to recombinant mortalin (HSPA9) and alpha-enolase. In addition, the gradual appearance of mitochondrial vacuolization and endoplasmic reticulum dilatation was observed. Taken together, paraptosis-like programmed cell death (PCD) of HepG2 is induced by the binding of Hep88 mAb to mortalin (HSPA9) and alpha-enolase. Mortalin depletion through formation of the Hep88 mAb-mortalin (HSPA9) complex might initiate transcription-independent p53-mediated apoptosis. Additionally, the Hep88 mAb-alpha-enolase complex might initiate energy exhaustion in HepG2 cells by obstructing the glycolysis pathway. These results imply that Hep88 mAb might be a promising tool for the development of an effective treatment for HCC in the next decade.

Keywords: hepatocellular carcinoma, monoclonal antibody, paraptosis-like programmed cell death, transmission electron microscopy, mortalin (HSPA9), alpha-enolase

Procedia PDF Downloads 346
8111 Genome Sequencing, Assembly and Annotation of Gelidium Pristoides from Kenton-on-Sea, South Africa

Authors: Sandisiwe Mangali, Graeme Bradley

Abstract:

A genome is the complete set of an organism's hereditary information, encoded as deoxyribonucleic acid or, in most viruses, ribonucleic acid. The three different types of genomes are the nuclear, mitochondrial and plastid genomes, and their sequences, uncovered by genome sequencing, serve as an archive of all genetic information, enabling researchers to understand the composition of a genome and the regulation of gene expression, and providing information on how the whole genome works. These sequences enable researchers to explore the population structure, genetic variation, and recent demographic events in threatened species. In particular, genome sequencing refers to the process of determining the exact arrangement of the nucleotide bases of a genome, and the process through which all of the aforementioned genomes are sequenced is referred to as whole or complete genome sequencing. Gelidium pristoides is a South African endemic Rhodophyta species which has been harvested in the Eastern Cape since the 1950s for its high economic value, which is one motivation for its sequencing. Its endemism further motivates its sequencing for conservation biology, as endemic species are more vulnerable to the anthropogenic activities endangering a species. Sequencing, mapping and annotating the Gelidium pristoides genome is therefore the aim of this study. To accomplish this aim, the genomic DNA was extracted and quantified using the NucleoSpin Plant Kit, Qubit 2.0 and Nanodrop. Thereafter, the Ion Plus Fragment Library Kit was used for preparation of a 600 bp library, which was then sequenced on the Ion S5 sequencing platform for two runs. The produced reads were then quality-controlled and assembled with the SPAdes assembler with default parameters, and the genome assembly was quality-assessed with the QUAST software.
From this assembly, the plastid and mitochondrial genomes were then extracted using Gelidiales organellar genomes as search queries and ordered against them using the Geneious software. The Qubit and Nanodrop instruments revealed A260/A280 and A230/A260 values of 1.81 and 1.52, respectively. A total of 30792074 reads were obtained, producing 94140 contigs with a total sequence length of 217.06 Mbp, an N50 value of 3072 bp and a GC content of 41.72%. Total lengths of 179281 bp and 25734 bp were obtained for the plastid and mitochondrial genomes, respectively. Genomic data allow a clear understanding of the genomic constituents of an organism and are valuable as foundational information for studies of individual genes and for resolving the evolutionary relationships between organisms, including Rhodophytes and other seaweeds.
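The assembly metrics quoted above (total length, N50, GC content) follow directly from the contig sequences. As a minimal illustrative sketch (the function name and the toy contigs are our own, not from the study), they can be computed like this:

```python
def assembly_stats(contigs):
    """Total length, N50 and GC content (%) for a list of contig sequences."""
    lengths = sorted((len(c) for c in contigs), reverse=True)
    total = sum(lengths)
    # N50: length of the contig at which the cumulative length,
    # taken from longest to shortest, first reaches half the total.
    running, n50 = 0, 0
    for length in lengths:
        running += length
        if running >= total / 2:
            n50 = length
            break
    gc = sum(c.upper().count("G") + c.upper().count("C") for c in contigs)
    return total, n50, 100.0 * gc / total

# Toy example; the real assembly comprised 94140 contigs.
total, n50, gc_pct = assembly_stats(["ATGCGC", "ATAT", "GC"])
```

Tools such as QUAST report these same quantities, among others, for a full assembly.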

Keywords: Gelidium pristoides, genome, genome sequencing and assembly, Ion S5 sequencing platform

Procedia PDF Downloads 132
8110 Assessing Online Learning Paths in a Learning Management System Using a Data Mining and Machine Learning Approach

Authors: Alvaro Figueira, Bruno Cabral

Abstract:

Nowadays, students are used to being assessed through an online platform. Educators have stepped up from a period in which they endured the transition from paper to digital. The use of a diversified set of question types, ranging from quizzes to open questions, is currently common in most university courses. In many courses today, the evaluation methodology also fosters the students’ online participation in forums, the download and upload of modified files, or even participation in group activities. At the same time, new pedagogical theories that promote the active participation of students in the learning process, and the systematic use of problem-based learning, are being adopted using an eLearning system for that purpose. However, although these activities can generate a lot of feedback for students, it is usually restricted to the assessment of well-defined online tasks. In this article, we propose an automatic system that informs students of abnormal deviations from a 'correct' learning path in the course. Our approach is based on the fact that obtaining this information earlier in the semester may provide students and educators an opportunity to resolve an eventual problem regarding the student’s current online actions towards the course. Our goal is to prevent situations that have a significant probability of leading to a poor grade and, eventually, to failing. In the major learning management systems (LMS) currently available, the interaction between the students and the system itself is registered in log files in the form of records that mark the beginning of actions performed by the user. Our proposed system uses that logged information to derive new information: the time each student spends on each activity, the time and order of the resources used by the student and, finally, the online resource usage pattern.
Then, using the grades assigned to students in previous years, we built a learning dataset that is used to feed a machine learning meta-classifier. The produced classification model is then used to predict the grades a learning path is heading towards in the current year. Not only does this approach serve the teacher, but it also allows the student to receive automatic feedback on her current situation, with past years as a perspective. Our system can be applied to online courses that integrate the use of an online platform that stores user actions in a log file and that has access to other students’ evaluations. The system is based on a data mining process over the log files and on a self-feedback machine learning algorithm that works paired with the Moodle LMS.
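The first derived feature, time spent per activity, can be sketched from the raw log alone. A minimal illustration follows; the field layout and the gap-until-next-event heuristic are our assumptions, not the paper's exact procedure, motivated by the fact that each LMS log record marks only the beginning of an action:

```python
from collections import defaultdict

def time_per_activity(log):
    """Approximate seconds spent on each activity from an ordered event log.

    `log` is a list of (timestamp_seconds, activity) pairs sorted by time.
    Because each record marks only the start of an action, the time on an
    activity is taken as the gap until the next logged event; the final
    event has no successor and is therefore ignored.
    """
    spent = defaultdict(float)
    for (t0, activity), (t1, _) in zip(log, log[1:]):
        spent[activity] += t1 - t0
    return dict(spent)

# Hypothetical log fragment for one student.
log = [(0, "quiz"), (120, "forum"), (300, "quiz"), (360, "upload")]
features = time_per_activity(log)
```

Feature vectors of this kind, joined with grades from previous years, would then form the training set for the meta-classifier.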

Keywords: data mining, e-learning, grade prediction, machine learning, student learning path

Procedia PDF Downloads 108
8109 Role of Calcination Treatment on the Structural Properties and Photocatalytic Activity of Nanorice N-Doped TiO₂ Catalyst

Authors: Totsaporn Suwannaruang, Kitirote Wantala

Abstract:

The purposes of this research were to synthesize a titanium dioxide photocatalyst doped with nitrogen (N-doped TiO₂) by the hydrothermal method and to test the photocatalytic degradation of paraquat under UV and visible light illumination. The effect of the calcination treatment temperature on the physical and chemical properties and photocatalytic efficiencies of the catalysts was also investigated. The calcined N-doped TiO₂ photocatalysts were characterized as follows: specific surface area and textural properties by the Brunauer-Emmett-Teller (BET) and Barrett-Joyner-Halenda (BJH) methods, bandgap energy by UV-Visible diffuse reflectance spectroscopy (UV-Vis-DRS) using the Kubelka-Munk theory, crystallinity and phase structure by wide-angle X-ray scattering (WAXS), surface morphology by focused ion beam scanning electron microscopy (FIB-SEM), and elements and charge states by X-ray photoelectron spectroscopy (XPS) and X-ray absorption spectroscopy (XAS). The results showed that the effect of calcination temperature was significant on surface morphology, crystallinity, specific surface area, pore size diameter, bandgap energy and nitrogen content level, but insignificant on the phase structure and the oxidation state of the titanium (Ti) atom. The N-doped TiO₂ samples exhibited only the anatase crystalline phase, because the nitrogen dopant in TiO₂ restrained the phase transformation from anatase to rutile. The samples presented a nanorice-like morphology. Particle expansion was found at calcination temperatures of 650 and 700°C, resulting in increased pore size diameter. The bandgap energy, determined by the Kubelka-Munk theory, was in the range 3.07-3.18 eV, slightly lower than the anatase standard (3.20 eV), indicating that the nitrogen dopant can shift the optical absorption edge of TiO₂ from the UV to the visible light region. Nitrogen content was observed at 100, 300 and 400°C only; the nitrogen element disappeared from 500°C onwards.
Nitrogen (N) atoms can be incorporated into the TiO₂ structure at interstitial sites. The uncalcined (100°C) sample displayed the highest percent paraquat degradation under UV and visible light irradiation, because this sample exhibited both the highest specific surface area and the highest nitrogen content level. Moreover, percent paraquat removal significantly decreased with increasing calcination treatment temperature. The nitrogen content level in TiO₂ accelerated the rate of reaction, in combination with the effect of the specific surface area, by generating electrons and holes during illumination with light. Therefore, the specific surface area and the nitrogen content level play important roles in the photocatalytic activity towards paraquat under UV and visible light illumination.
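For reference, the Kubelka-Munk treatment used to extract the bandgap from diffuse reflectance converts the measured reflectance R into a quantity proportional to absorption, and an absorption-edge wavelength maps to energy via E = hc/λ. A small sketch (the helper names are ours, and a full Tauc-plot analysis involves more steps than shown here):

```python
def kubelka_munk(reflectance):
    """Kubelka-Munk function F(R) = (1 - R)^2 / (2R): proportional to the
    ratio of absorption to scattering for a diffuse reflectance 0 < R <= 1."""
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

def edge_to_ev(edge_nm):
    """Convert an absorption-edge wavelength in nm to energy in eV, E = hc/λ."""
    return 1239.84 / edge_nm
```

An absorption edge near 400 nm corresponds to about 3.1 eV, consistent with the 3.07-3.18 eV range reported above.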

Keywords: restraining phase transformation, interstitial site, chemical charge state, photocatalysis, paraquat degradation

Procedia PDF Downloads 137
8108 Presenting an Integrated Framework for the Introduction and Evaluation of Social Media in Enterprises

Authors: Gerhard Peter

Abstract:

In this paper, we present an integrated framework that governs the introduction of social media into enterprises and its evaluation. It is argued that the framework should address the following issues: (1) the contribution of social media for increasing efficiency and improving the quality of working life; (2) the level on which this contribution happens (i.e., individual, team, or organisation); (3) a description of the processes for implementing and evaluating social media; and the role of (4) organisational culture and (5) management. We also report the results of a case study where the framework has been employed to introduce a social networking platform at a German enterprise. This paper only considers the internal use of social media.

Keywords: case study, enterprise 2.0, framework, introducing and evaluating social media, social media

Procedia PDF Downloads 343
8107 Magnetic Solid-Phase Separation of Uranium from Aqueous Solution Using High Capacity Diethylenetriamine Tethered Magnetic Adsorbents

Authors: Amesh P, Suneesh A S, Venkatesan K A

Abstract:

Magnetic solid-phase extraction is a relatively new method among solid-phase extraction techniques for separating metal ions from aqueous solutions, such as mine water and groundwater, contaminated wastes, etc. However, bare magnetic particles (Fe3O4) exhibit poor selectivity due to the absence of target-specific functional groups for sequestering the metal ions. The selectivity of these magnetic particles can be remarkably improved by covalently tethering task-specific ligands on the magnetic surfaces. The magnetic particles offer a number of advantages, such as quick phase separation aided by an external magnetic field. As a result, the solid adsorbent can be prepared with particle sizes ranging from a few micrometers down to the nanometer scale, which again offers advantages such as enhanced extraction kinetics, higher extraction capacity, etc. Conventionally, the magnetite (Fe3O4) particles are prepared by the hydrolysis and co-precipitation of ferrous and ferric salts in aqueous ammonia solution. Since the covalent linking of task-specific functionalities on Fe3O4 is difficult, and Fe3O4 is also susceptible to redox reactions in the presence of acid or alkali, it is necessary to modify the surface of Fe3O4 by silica coating. This silica coating is usually carried out by hydrolysis and condensation of tetraethyl orthosilicate over the surface of magnetite to yield magnetite particles coated with a thin silica layer. Since the silica-coated magnetite particles are amenable to further surface modification, they can be reacted with task-specific functional groups to obtain functionalized magnetic particles. The surface area exhibited by such magnetic particles usually falls in the range of 50 to 150 m2.g-1, which offers advantages such as quick phase separation compared to other solid-phase extraction systems.
In addition, magnetic (Fe3O4) particles covalently linked to a mesoporous silica matrix (MCM-41) bearing task-specific ligands offer further advantages in terms of extraction kinetics, high stability, longer reusability, and metal extraction capacity, due to the large surface area, ample porosity and enhanced number of functional groups per unit area of these adsorbents. In view of this, the present paper deals with the synthesis of a uranium-specific diethylenetriamine (DETA) ligand anchored on silica-coated magnetite (Fe-DETA) as well as on magnetic mesoporous silica (MCM-Fe-DETA), and with studies on the extraction of uranium from aqueous solutions spiked with uranium to mimic mine water or groundwater contaminated with uranium. The synthesized solid-phase adsorbents were characterized by FT-IR, Raman, TG-DTA, XRD, and SEM. The extraction behavior of uranium on the solid phase was studied under several conditions, such as the effect of pH, the initial concentration of uranium, the rate of extraction and its variation with pH and initial uranium concentration, and the effect of interfering ions like CO32-, Na+, Fe2+, Ni2+, and Cr3+. A maximum extraction capacity of 233 mg.g-1 was obtained for Fe-DETA, and a very high capacity of 1047 mg.g-1 was obtained for MCM-Fe-DETA. The mechanism of extraction, the speciation of uranium, the extraction studies, reusability, and the other results obtained in the present study suggest that Fe-DETA and MCM-Fe-DETA are potential candidates for the extraction of uranium from mine water and groundwater.
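Extraction capacities such as the 233 mg.g-1 quoted above are conventionally computed from a batch-contact mass balance. A minimal sketch (the numbers below are illustrative, not the study's raw data):

```python
def adsorption_capacity(c0, ce, volume_l, mass_g):
    """Equilibrium uptake q_e in mg per g of adsorbent from a batch contact:
    q_e = (C0 - Ce) * V / m, with C0, Ce in mg/L, V in L and m in g."""
    return (c0 - ce) * volume_l / mass_g

# Illustrative: 100 mg/L reduced to 30 mg/L by 0.03 g of adsorbent in 100 mL.
q_e = adsorption_capacity(100.0, 30.0, 0.1, 0.03)
```

The same mass balance, applied across a range of initial concentrations, yields the isotherm from which the maximum capacity is read off.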

Keywords: diethylenetriamine, magnetic mesoporous silica, magnetic solid-phase extraction, uranium extraction, wastewater treatment

Procedia PDF Downloads 144
8106 Celebrating Community Heritage through the People’s Collection Wales: A Case Study in the Development of Collecting Traditions and Engagement

Authors: Gruffydd E. Jones

Abstract:

The world’s largest collection of historical, cultural, and heritage material is unarchived and undocumented in the hands of the public. Not only does this material represent the missing collections in heritage sector archives today, but it is also the key to providing a diverse range of communities with the means to express their history in their own words and to celebrate their unique, personal heritage. The People’s Collection Wales (PCW) acts as a platform on which the heritage of Wales and her people can be collated and shared, at the heart of which is a thriving community engagement programme across a network of museums, archives, and libraries. By providing communities with the archival skillset commonly employed throughout the heritage sector, PCW enables local projects, societies, and individuals to express their understanding of local heritage in their own voices, empowering communities to embrace their diverse and complex identities around Wales. Drawing on key examples from the project’s history, this paper will demonstrate the successful way in which museums have been developed as hubs for community engagement, where the public was at the heart of collection and documentation activities, informing collection and curatorial policies to benefit both the institution and its local community. This paper will also highlight how collections from marginalised, under-represented, and minority communities have been published and celebrated extensively around Wales, including adoption by the education system in classrooms today. Any activity within the heritage sector, whether of collection, preservation, digitisation, or accessibility, should consider community engagement opportunities, not only to remain relevant but also to develop institutions as community hubs: pivots around which local heritage is supported and preserved.
Attention will be drawn to our digitisation workflow, which, through training and support from museums and libraries, has allowed the public not only to become involved but to actively lead the contemporary evolution of documentation strategies in Wales. This paper will demonstrate how the PCW online access archive is promoting museum collections, encouraging user interaction, and providing an invaluable platform on which a broader community can inform, preserve and celebrate its cultural heritage through its own archival material. The continuing evolution of heritage engagement depends wholly on placing communities at the heart of the sector, recognising their wealth of cultural knowledge, and developing the archival skillset necessary for them to become archival practitioners in their own right.

Keywords: social history, cultural heritage, community heritage, museums, archives, libraries, community engagement, oral history, community archives

Procedia PDF Downloads 73
8105 The Use of Language as a Cognitive Tool in French Immersion Teaching

Authors: Marie-Josée Morneau

Abstract:

A literacy-based approach, centred on the use of the language of instruction as a cognitive tool, can increase the L2 communication skills of French immersion students. Academic subject areas such as science and mathematics offer an authentic language learning context where students can become more proficient speakers while using specific vocabulary and language structures to learn, interact and communicate their reasoning, when provided the opportunities and guidance to do so. In this Canadian quasi-experimental study, the effects of teaching specific language elements during mathematics classes through literacy-based activities in Early French Immersion programming were compared between two Grade 7/8 groups: the experimental group, which received literacy-based teaching for a 6-week period, and the control group, which received regular teaching instruction. The results showed that the participants from the experimental group made more progress in their mathematical communication skills, which suggests that targeting the L2 as a cognitive tool can be beneficial to immersion learners studying mathematical concepts and reminds us that all L2 teachers are language teachers.

Keywords: mathematics, French immersion, literacy-based, oral communication, L2

Procedia PDF Downloads 62
8104 Predicting Student Performance Based on Coding Behavior in STEAMplug

Authors: Giovanni Gonzalez Araujo, Michael Kyrilov, Angelo Kyrilov

Abstract:

STEAMplug is an innovative web-based educational platform that makes teaching easier and learning more effective. It requires no setup, eliminating barriers to entry and allowing students to focus on their learning through real-world development environments. The student-centric tools enable easy collaboration between peers and teachers. Analyzing user interactions with the system enables us to predict student performance and identify at-risk students, allowing early instructor intervention.

Keywords: plagiarism detection, identifying at-risk students, education technology, e-learning system, collaborative development, learning and teaching with technology

Procedia PDF Downloads 131
8103 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach

Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik

Abstract:

Background: Human balance control is often studied on the basis of the statokinesigram. In this study, the approach to human postural reaction analysis is based on combining the stabilometry output signal with retroreflective marker data signal processing, analysis, and understanding. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: The participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons for a 20 s interval. The vibration stimuli caused the human postural system to assume a new pseudo-steady state. The vibration frequencies were 20, 60 and 80 Hz. The participants' body segments - head, shoulders, hips, knees, ankles and little fingers - were marked by 12 retroreflective markers. Marker positions were scanned by a six-camera BTS SMART DX system. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The Method of Developed Statokinesigram Trajectory was used for processing the measured data. Regression analysis of the developed statokinesigram trajectory (DST) data and the retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for the marker trajectories were identified for all body segments. Head marker trajectories reached the maximal, and ankle marker trajectories the minimal, value of the scaling coefficient. The hip, knee and ankle markers were approximately symmetrical in terms of the scaling coefficient. Notable differences in the scaling coefficient were detected in the head and shoulder marker trajectories, which were not symmetrical. The model of postural system behavior was identified by MDST.
Conclusion: The value of the scaling factor identifies which body segments are predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall response of the human postural system to the vibration stimuli, then the marker data represent particular postural responses. It can be assumed that the cumulative sum of the particular marker postural responses is equal to the statokinesigram.
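The scaling coefficient between the two trajectories reduces to a least-squares fit. A minimal sketch follows; the variable names and the no-intercept form are our assumptions, as the study's regression may include an intercept term:

```python
def scaling_coefficient(dst, dmt):
    """Least-squares λ minimizing Σ (dmt_i - λ·dst_i)²: a no-intercept
    linear regression of the marker trajectory (DMT) on the developed
    statokinesigram trajectory (DST), i.e. λ = Σ(x·y) / Σ(x²)."""
    sxy = sum(x * y for x, y in zip(dst, dmt))
    sxx = sum(x * x for x in dst)
    return sxy / sxx

# A marker trajectory that is exactly twice the DST gives λ = 2.
lam = scaling_coefficient([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

Comparing λ across body segments is what reveals which segment amplifies the platform signal most, as reported for the head markers above.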

Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data

Procedia PDF Downloads 329
8102 KBASE Technological Framework - Requirements

Authors: Ivan Stanev, Maria Koleva

Abstract:

Automated software development issues are addressed in this paper. Layers and packages of a Common Platform for Automated Programming (CPAP) are defined based on Service Oriented Architecture, Cloud computing, Knowledge Based Automated Software Engineering (KBASE) and the Method of Automated Programming. Tools of seven leading companies (AWS of Amazon, Azure of Microsoft, App Engine of Google, vCloud of VMware, Bluemix of IBM, Helion of HP, OCPaaS of Oracle) are analyzed in the context of CPAP. Based on the results of the analysis, CPAP requirements are formulated.

Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture

Procedia PDF Downloads 283
8101 Aggregate Fluctuations and the Global Network of Input-Output Linkages

Authors: Alexander Hempfing

Abstract:

The desire to understand business cycle fluctuations, trade interdependencies and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from both an empirical and a theoretical perspective. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what its distributional properties are, and whether there are network-specific metrics that allow identifying structurally important nodes on a global, national and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others while serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
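Eigenvector centrality, the index whose global distribution is examined here, can be computed on a weighted input-output matrix by power iteration. A self-contained sketch (not the paper's implementation; a real analysis would run on the full WIOD flow matrix):

```python
def eigenvector_centrality(adj, iters=200):
    """Eigenvector centrality of a non-negative weighted network by power
    iteration; adj[i][j] is the flow from node i to node j. Scores are
    normalized to sum to 1. Assumes a unique dominant eigenvalue, which
    holds for strongly connected, aperiodic networks."""
    n = len(adj)
    x = [1.0 / n] * n
    for _ in range(iters):
        # A node is central when it receives flow from central nodes.
        y = [sum(adj[j][i] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(y) or 1.0
        x = [v / norm for v in y]
    return x

# Tiny example: the node with a self-loop plus a mutual link dominates.
scores = eigenvector_centrality([[1.0, 1.0], [1.0, 0.0]])
```

Fitting a power law to the tail of such centrality scores across all national sectors is then a standard test for scale-free structure.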

Keywords: economic integration, industrial organization, input-output economics, network economics, production networks

Procedia PDF Downloads 252