
Keyword Search Result

[Keyword] burst detection (2 hits)

Results 1-2 of 2
  • Detecting TV Program Highlight Scenes Using Twitter Data Classified by Twitter User Behavior and Evaluating It to Soccer Game TV Programs

    Tessai HAYAMA  

     
    PAPER-Datamining Technologies

    Publicized: 2018/01/19
    Vol: E101-D No:4
    Page(s): 917-924

    This paper presents a novel TV event detection method for automatically generating TV program digests from Twitter data. Previous studies of digest generation based on Twitter data have developed event detection methods that analyze the frequency time series of tweets posted while watching a given TV program; however, most of these studies did not take into account differences in how Twitter is used, e.g., sharing information versus conversing. Because these different types of Twitter data are lumped together into one category, it is difficult to detect highlight scenes of TV programs and correctly extract their content from the Twitter data. This paper therefore presents a highlight scene detection method that automatically generates TV program digests from Twitter data classified by Twitter user behavior. To confirm the effectiveness of the proposed method, experiments were conducted using 49 soccer game TV programs.
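
    The core idea summarized above, building a per-category tweet frequency time series and flagging bursts in it as candidate highlight scenes, can be sketched briefly. The following Python fragment is an illustration only, not the paper's algorithm: the behavior classifier, the bin width (bin_sec), the history window (window), and the burst factor (k) are hypothetical placeholders, and tweets are assumed to be pre-labeled as information-sharing versus conversational.

    import numpy as np

    def detect_highlights(tweet_times, is_sharing, bin_sec=60, window=10, k=3.0):
        # tweet_times : tweet timestamps in seconds from the start of the program
        # is_sharing  : parallel boolean array, True for information-sharing tweets
        #               (hypothetical labels; the paper's classifier is not shown here)
        times = np.asarray(tweet_times, dtype=float)[np.asarray(is_sharing, dtype=bool)]
        if times.size == 0:
            return []
        # Frequency time series: tweet count per fixed-length time bin.
        counts = np.bincount((times // bin_sec).astype(int))
        highlights = []
        for i, c in enumerate(counts):
            past = counts[max(0, i - window):i]
            if past.size == 0:
                continue
            mu, sigma = past.mean(), past.std()
            # Flag the bin as a candidate highlight if it bursts above the
            # recent moving average by k standard deviations.
            if c > mu + k * max(sigma, 1.0):
                highlights.append(i)
        return highlights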

  • A Novel Algorithm for Burst Detection in Wideband Networking Waveform of Software Defined Radio

    Muhammad ZEESHAN  Shoab KHAN  

     
    PAPER-Digital Signal Processing

    Vol: E98-A No:6
    Page(s): 1225-1233

    Correct detection of the start of a burst is very important in wideband networking radio operation, as it directly affects the Time Division Multiple Access (TDMA) adaptive time slot algorithm. In this paper, we propose a robust Data Aided (DA) algorithm for burst detection in a hybrid CDMA/Adaptive TDMA based wideband networking waveform of a software defined radio. The proposed algorithm is based on a novel differentially modulated training sequence designed using a precoding sequence. The training sequence structure and the precoding sequence are exploited in the calculation of the proposed timing metric, which is normalized by the signal energy. The precoding sequence is designed so that the timing metric has a sharp peak. The algorithm shows excellent performance in multiuser scenarios: computer simulations show that increasing the number of active users from 1 to 8 degrades performance by only about 1-2 dB. The proposed algorithm is compared with other algorithms and found to outperform them even in the presence of multipath fading effects. It has also been implemented on a Field Programmable Gate Array (FPGA) platform for high data rate applications, and the hardware results are identical to the simulation results.
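
    The metric described here correlates the received samples against a known training sequence and normalizes by the signal energy so that it peaks sharply at the burst start. The Python fragment below is a minimal generic sketch of that data-aided, energy-normalized correlation: it uses an arbitrary known preamble rather than the paper's differentially modulated, precoded training sequence, and the function name detect_burst_start and the threshold value are assumptions.

    import numpy as np

    def detect_burst_start(rx, training, threshold=0.6):
        # rx       : complex baseband samples that may contain a burst
        # training : known training-sequence samples (stand-in for the paper's
        #            differentially modulated, precoded sequence)
        rx = np.asarray(rx, dtype=complex)
        training = np.asarray(training, dtype=complex)
        L = len(training)
        n_pos = len(rx) - L + 1
        if n_pos <= 0:
            return -1, np.zeros(0)
        metric = np.zeros(n_pos)
        e_t = np.sum(np.abs(training) ** 2)
        for n in range(n_pos):
            seg = rx[n:n + L]
            # Data-aided correlation, normalized by signal energy so the
            # metric lies in [0, 1] and peaks at the true burst start.
            num = np.abs(np.vdot(training, seg)) ** 2
            den = e_t * np.sum(np.abs(seg) ** 2)
            metric[n] = num / den if den > 0 else 0.0
        peak = int(np.argmax(metric))
        # Accept the peak as the burst start only if it clears the threshold.
        return (peak if metric[peak] > threshold else -1), metric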